
I agree. Look at Australia or Canada, as well as Israel and other countries rising up across Asia. In the next few years, Australia, China, and Israel will be key areas that folks should pay attention to as part of the “vNext Tech Valley” standard. Granted, Silicon Valley will still be a leader; however, these other areas will be closing that gap.


Tech.eu contributor Jennifer Baker caught up with Ken Gabriel at the EIT Innovation Forum to talk about the difference between EU and US startups.

Read more


“These cables, whilst stylish, still put a large emphasis on practicality – having been crafted from durable, braided nylon designed to withstand wear and tear. The range also goes further, the company professes, by solving everyday problems such as ‘forgetting your cable, running out of battery on-the-go, or straining to use your device while charging’.”

Read more

Now, that’s an exhibit!


May 5, 2016, will mark the opening of a new and exciting exhibit at Chicago’s famed Museum of Science and Industry: an in-depth and interactive look behind the curtain at the Defense Advanced Research Projects Agency (DARPA).

DARPA was created in 1958 at the peak of the Cold War in response to the Soviet Union’s launch of Sputnik, the world’s first manmade satellite, which passed menacingly over the United States every 96 minutes. Tasked with preventing such strategic surprises in the future, the agency has achieved its mission over the years in part by creating a series of technological surprises of its own, many of which are highlighted in the Chicago exhibit, “Redefining Possible.”

“We are grateful to Chicago’s Museum of Science and Industry for inviting us to tell the DARPA story of ambitious problem solving and technological innovation,” said DARPA Deputy Director Steve Walker, who will be on hand for the exhibit’s opening day. “Learning how DARPA has tackled some of the most daunting scientific and engineering challenges—and how it has tolerated the risk of failure in order to have major impact when it succeeds—can be enormously inspiring to students. And for adults, we hope the exhibit will serve as a reminder that some of the most exciting work going on today in fields as diverse as chemistry, engineering, cyber defense and synthetic biology are happening with federal support, in furtherance of pressing national priorities.”

Read more

I do love Nvidia!


During the past nine months, an Nvidia engineering team built a self-driving car with one camera, one Drive-PX embedded computer and only 72 hours of training data. Nvidia published an academic preprint of the results of the DAVE2 project, entitled “End to End Learning for Self-Driving Cars,” on arXiv.org, hosted by the Cornell University Library.

The Nvidia project, called DAVE2, is named after a 10-year-old Defense Advanced Research Projects Agency (DARPA) project known as DARPA Autonomous Vehicle (DAVE). Although neural networks and autonomous vehicles seem like just-invented technology, researchers such as Google’s Geoffrey Hinton, Facebook’s Yann LeCun and the University of Montreal’s Yoshua Bengio have collaboratively researched this branch of artificial intelligence for more than two decades. And the DARPA DAVE project’s application of neural network-based autonomous vehicles was preceded by the ALVINN project developed at Carnegie Mellon in 1989. What has changed is that GPUs have made building on their research economically feasible.

Neural networks and image recognition applications such as self-driving cars have exploded recently for two reasons. First, Graphical Processing Units (GPU) used to render graphics in mobile phones became powerful and inexpensive. GPUs densely packed onto board-level supercomputers are very good at solving massively parallel neural network problems and are inexpensive enough for every AI researcher and software developer to buy. Second, large, labeled image datasets have become available to train massively parallel neural networks implemented on GPUs to see and perceive the world of objects captured by cameras.
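The core of the end-to-end approach is learning a direct mapping from raw pixels to a steering command, with no hand-engineered lane-detection stage in between. A minimal sketch of that idea, using a toy linear model on synthetic data rather than Nvidia’s actual convolutional network (the array shapes and learning rate here are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for camera frames: 200 tiny flattened "images"
# of 16 pixels each, with steering angles produced by a hidden rule.
X = rng.normal(size=(200, 16))
true_w = rng.normal(size=16)
y = X @ true_w

# End-to-end model: a single weight vector maps raw pixels directly
# to a steering command; training adjusts it by gradient descent.
w = np.zeros(16)
lr = 0.1
for _ in range(2000):
    grad = X.T @ (X @ w - y) / len(X)  # mean-squared-error gradient
    w -= lr * grad

mse = float(np.mean((X @ w - y) ** 2))
print(f"training MSE: {mse:.2e}")
```

The real system replaces the linear map with a deep convolutional network and the synthetic angles with 72 hours of recorded human driving, but the training loop is the same shape: predict, compare against the human’s steering, nudge the weights.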

Closing the instability gap.


(Phys.org)—It might be said that the most difficult part of building a quantum computer is not figuring out how to make it compute, but rather finding a way to deal with all of the errors that it inevitably makes. Errors arise because of the constant interaction between the qubits and their environment, which can result in photon loss, which in turn causes the qubits to randomly flip to an incorrect state.

In order to flip the qubits back to their correct states, physicists have been developing an assortment of quantum techniques. Most of them work by repeatedly making measurements on the system to detect errors and then correct the errors before they can proliferate. These approaches typically have a very large overhead, where a large portion of the computing power goes to correcting errors.
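The measure-and-correct approach described above is easiest to see in the textbook three-qubit bit-flip repetition code (not Kapit’s passive scheme): each logical bit is stored redundantly in three physical bits, and majority vote fixes any single flip, so decoding fails only when two or three bits flip at once. A quick sketch of the arithmetic, with the physical error rate chosen purely for illustration:

```python
def logical_error_rate(p):
    """Probability that majority-vote decoding of a 3-bit repetition
    code fails, given independent per-bit flip probability p:
    3*p^2*(1-p) + p^3."""
    return 3 * p**2 * (1 - p) + p**3

p = 0.01  # assumed physical error rate, for illustration
pl = logical_error_rate(p)
print(f"physical: {p}, logical: {pl:.6f}")
# → physical: 0.01, logical: 0.000298
```

This also makes the overhead concrete: tripling the qubit count (plus the measurement circuitry) buys a quadratic suppression of errors, and the benefit holds only while p stays below 1/2.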

In a new paper published in Physical Review Letters, Eliot Kapit, an assistant professor of physics at Tulane University in New Orleans, has proposed a different approach to quantum error correction. His method takes advantage of a recently discovered unexpected benefit of quantum noise: when carefully tuned, quantum noise can actually protect qubits against unwanted noise. Rather than actively measuring the system, the new method passively and autonomously suppresses and corrects errors, using relatively simple devices and relatively little computing power.

Given the pace at which quantum computing is developing, NIST is rushing to create quantum-proof cryptographic algorithms to prevent QC hacking. As I have stated, I believe we’re now less than 7 years away from QC being in many mainstream devices, infrastructure, etc. And with China and its partnership with Australia, the race is now on and hotter than ever.


The National Institute of Standards and Technology has begun to look into quantum cybersecurity, according to a new report that details and plans out ways scientists could protect against these futuristic computers.

April 29, 2016.

Ransomware has taken off in 2016, with attacks already eclipsing the numbers observed in a recently published threat report from Symantec.

A post-quantum cryptography discussion in Tacoma, WA on May 5th will cover hacking by QC attackers and how cryptographic algorithms can be leveraged to offset those attacks; it may be of interest to sit in and even join the debates. I will try to attend if I can, because it would be interesting to see the arguments raised and the responses.


The University of Washington Tacoma Institute of Technology will present a discussion about the esoteric field of post-quantum cryptography at the Northwest Cybersecurity Symposium on May 5.

“I’ve been researching post-quantum cryptography for years, finding ways to protect against a threat that doesn’t yet exist,” said Anderson Nascimento, assistant professor of computer science at the institute, in a release.

Post-quantum cryptography refers to encryption that would be secure against an attack by a quantum computer — a kind of supercomputer using quantum mechanics, which, so far, exists only in theory.

Excellent read, and a true point about the need for some additional data laws in our ever-exploding, information-overloaded world.


Laws for Mobility, IoT, Artificial Intelligence and Intelligent Process Automation

If you are the VP of Sales, it is quite likely you want and need to know up to date sales numbers, pipeline status and forecasts. If you are meeting with a prospect to close a deal, it is quite likely that having up to date business intelligence and CRM information would be useful. Likewise traveling to a remote job site to check on the progress of an engineering project is also an obvious trigger that you will need the latest project information. Developing solutions integrated with mobile applications that can anticipate your needs based upon your Code Halo data, the information that surrounds people, organizations, projects, activities and devices, and acting upon it automatically is where a large amount of productivity gains will be found in the future.
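The anticipation idea above boils down to a mapping from detected context “triggers” to the data worth prefetching before the user asks. A minimal sketch, where every trigger and dataset name is hypothetical rather than drawn from any real Code Halo product:

```python
# Hypothetical trigger table: context event -> datasets to prefetch.
TRIGGERS = {
    "meeting_with_prospect": ["crm_account_history", "pipeline_status"],
    "travel_to_job_site": ["project_schedule", "latest_job_status"],
    "quarter_end": ["sales_numbers", "forecast"],
}

def anticipate(event: str) -> list[str]:
    """Return the datasets to prefetch for a detected context event."""
    return TRIGGERS.get(event, [])

print(anticipate("travel_to_job_site"))
# → ['project_schedule', 'latest_job_status']
```

A production system would infer the triggers from calendar, location and CRM signals rather than a static table, but the shape is the same: context in, relevant data out, before the request is made.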

There needs to be a law, like Moore’s famous law, that states, “The more data that is collected and analyzed, the greater the economic value it has in aggregate.” I believe this law is accurate, and my colleagues at the Center for the Future of Work wrote a book titled Code Halos that documents evidence of its truthfulness as well. I would also like to submit an additional law: “Data has a shelf-life, and the economic value of data diminishes over time.” In other words, if I am negotiating a deal today but can’t get the critical business data I need for another week, the data will not be as valuable to me then. The same is true if I am trying to optimize, in real time, the schedules of 5,000 service techs but don’t have up-to-date job status information. Receiving job status information tomorrow does not help me optimize schedules today.
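The shelf-life law can be made concrete with a simple decay model. The exponential form and the one-week “half-life” below are my own illustrative assumptions, not something the author specifies:

```python
def data_value(initial_value: float, age_days: float,
               half_life_days: float = 7.0) -> float:
    """Value of data that loses half its worth every half_life_days
    (exponential decay, assumed for illustration)."""
    return initial_value * 0.5 ** (age_days / half_life_days)

# Job-status data hypothetically worth $1000 when fresh:
print(data_value(1000, 0))   # → 1000.0 today
print(data_value(1000, 7))   # → 500.0 after one week
print(data_value(1000, 28))  # → 62.5 after four weeks
```

Under this model, the deal data that arrives a week late is literally worth half as much, which is the author’s point in quantitative form.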

Read more