Microsoft’s (NASDAQ:MSFT) AI program XiaoIce, which has been tested on Chinese social media sites, has shown positive results, a win for the company.
Around the world, cities are choking on smog. But a new AI system aims to gauge just how bad the situation is by aggregating data from smartphone pictures captured far and wide across cities.
The project, called AirTick, has been developed by researchers from Nanyang Technological University in Singapore, reports New Scientist. The reasoning is pretty simple: Deploying air sensors isn’t cheap and takes a long time, so why not make use of the sensors that everyone has in their pocket?
The result is an app that lets people report smog levels by uploading an image tagged with time and location. A machine learning algorithm then chews through the data and compares it against official air-quality measurements where it can. Over time, the team hopes, the software will be able to predict air quality from smartphone images alone.
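To make that idea concrete, here is a rough sketch of how such a pipeline could look: match each uploaded photo to the nearest official air-quality reading, pull a few crude haze cues out of the image, and fit a regressor. The feature choices and the helper names below are my own illustrative assumptions, not AirTick’s actual code.

```python
# Illustrative sketch only -- not AirTick's actual implementation.
# Idea: crowd-sourced photos tagged with time/location are matched to official
# air-quality readings, then a regressor learns to predict AQI from image features.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

def image_features(img: np.ndarray) -> np.ndarray:
    """Very crude haze cues: mean brightness, contrast, and a blue/grey ratio."""
    grey = img.mean(axis=2)
    return np.array([
        grey.mean(),                                 # hazy skies tend to be washed out
        grey.std(),                                  # smog lowers scene contrast
        img[..., 2].mean() / (grey.mean() + 1e-6),   # blue channel vs. overall brightness
    ])

def train_air_quality_model(photos, official_aqi):
    """photos: list of HxWx3 arrays; official_aqi: matched sensor readings."""
    X = np.stack([image_features(p) for p in photos])
    y = np.asarray(official_aqi)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
    model = RandomForestRegressor(n_estimators=200).fit(X_train, y_train)
    print("held-out R^2:", model.score(X_test, y_test))
    return model
```

With enough matched photo/sensor pairs, a model like this could then be run on photos from areas with no sensors at all, which is the whole point of the crowdsourcing approach.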
A friend of mine on LinkedIn and I both knew it was only a matter of time before AI and quantum computing would be announced together. And Google, with D-Wave, would indeed be leading that charge. BTW, once this pairing of technologies is done, get ready for some amazing AI technology, including robotics, to come out.
But there may not be any competitors for a while if Google’s “Ace of Spades” newbie performs as they predict. According to Hartmut Neven, head of its Quantum AI Lab, this baby can run:
“We found that for problem instances involving nearly 1,000 binary variables, quantum annealing significantly outperforms its classical counterpart, simulated annealing. It is more than 10 to the power of 8 times faster than simulated annealing running on a single core.”
In layperson’s lingo: this sucker will run 100 million times faster than the clunker on your desk. Problem is, it may not be on the production line for a while.
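For reference, the “classical counterpart” Neven mentions, simulated annealing, can be sketched in a few lines. This is a generic toy version on a made-up cost function over binary variables, not the benchmark Google actually ran.

```python
# Generic simulated annealing over binary variables -- the classical baseline
# the quantum annealer was benchmarked against. Illustrative sketch only.
import math
import random

def energy(bits):
    """Toy cost function: penalize disagreement between neighboring bits."""
    return sum(b1 != b2 for b1, b2 in zip(bits, bits[1:]))

def simulated_annealing(n_bits=1000, steps=20_000, t_start=2.0, t_end=0.01):
    bits = [random.randint(0, 1) for _ in range(n_bits)]
    e = energy(bits)
    for step in range(steps):
        t = t_start * (t_end / t_start) ** (step / steps)  # geometric cooling schedule
        i = random.randrange(n_bits)
        bits[i] ^= 1                      # propose flipping one bit
        e_new = energy(bits)
        if e_new <= e or random.random() < math.exp((e - e_new) / t):
            e = e_new                     # accept the move
        else:
            bits[i] ^= 1                  # reject: flip the bit back
    return bits, e

if __name__ == "__main__":
    _, best = simulated_annealing()
    print("final energy:", best)
```

The claimed advantage is that a quantum annealer explores this kind of 1,000-variable search space far faster than the flip-one-bit-at-a-time loop above.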
Carnegie Mellon University is embarking on a five-year, $12 million research effort to reverse-engineer the brain and “make computers think more like humans,” funded by the U.S. Intelligence Advanced Research Projects Activity (IARPA). The research is led by Tai Sing Lee, a professor in the Computer Science Department and the Center for the Neural Basis of Cognition (CNBC).
The research effort, through IARPA’s Machine Intelligence from Cortical Networks (MICrONS) research program, is part of the U.S. BRAIN Initiative to revolutionize the understanding of the human brain.
A “Human Genome Project” for the brain’s visual system
“MICrONS is similar in design and scope to the Human Genome Project, which first sequenced and mapped all human genes,” Lee said. “Its impact will likely be long-lasting and promises to be a game changer in neuroscience and artificial intelligence.”
Bon appétit: could 3D printers be coming to make your next first-class meal on a flight, or to restaurants with robot servers? The US Army believes this is the way forward for its own meals.
Beats the heck out of MREs.
As I said this morning, there is definitely something going on with quantum computing today. Maybe it’s the planetary alignment (I saw there was something going on with Aquarius today); either way, this is awesome news.
Rigetti Computing is working on designs for quantum-powered chips to perform previously impossible feats that advance chemistry and machine learning.
“Full exploitation of this information is a major challenge,” officials with the Defense Advanced Research Projects Agency (DARPA) wrote in a 2009 brief on “deep learning.”
“Human observation and analysis of [intelligence, surveillance and reconnaissance] assets is essential, but the training of humans is both expensive and time-consuming. Human performance also varies due to individuals’ capabilities and training, fatigue, boredom, and human attentional capacity.”
Working with a team of researchers at MIT, DARPA is hoping to take all of that human know-how and shrink it down into a processing unit no bigger than your cellphone, using a microchip known as “Eyeriss.” The concept relies on “neural networks”: computerized memory networks based on the workings of the human brain.
MIT researchers have presented the Eyeriss chip, which they say is roughly 10 times as efficient as a mobile GPU, making it practical to run deep learning for AI functions locally on a device.
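For readers wondering what a “neural network” boils down to at the chip level, here is a bare-bones forward pass through one hidden layer in NumPy. It is a generic illustration of the multiply-accumulate workload that chips like Eyeriss are built to accelerate, not the Eyeriss architecture itself, and the toy dimensions are made up.

```python
# Minimal fully connected neural-network forward pass -- a generic illustration
# of the kind of computation deep-learning accelerators speed up.
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def forward(x, w1, b1, w2, b2):
    """One hidden layer: input -> hidden (ReLU) -> output scores."""
    hidden = relu(x @ w1 + b1)
    return hidden @ w2 + b2

# Toy dimensions: a 64-pixel "image" classified into 10 categories.
x  = rng.standard_normal((1, 64))
w1 = rng.standard_normal((64, 32)) * 0.1
b1 = np.zeros(32)
w2 = rng.standard_normal((32, 10)) * 0.1
b2 = np.zeros(10)

scores = forward(x, w1, b1, w2, b2)
print("predicted class:", int(scores.argmax()))
```

Every line of that forward pass is dominated by matrix multiplies, which is exactly the operation a dedicated chip can do with far less energy than a general-purpose processor.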
We can safely say that the makeup artist’s job is safe from AI for now.
A mini factory-style robot arm tries its mechanical hand at applying makeup. The results aren’t pretty.
More news on DARPA’s new deep learning microchip for the military.
A military-funded breakthrough in microchips opens the door to portable deep learning.