
At its re:MARS conference, Amazon CEO Jeff Bezos took the stage today to be “interviewed” by Jenny Freshwater, Amazon’s director of forecasting. As any AWS machine learning tool could have forecasted, having an employee interview her boss didn’t lead to any challenging questions or especially illuminating answers, but Bezos did get a chance to talk about a variety of topics, ranging from business advice to his plans for Blue Origin.
We can safely ignore the business advice, given that Amazon’s principle of “disagree and commit” is about as well known as it could be, but his comments about Blue Origin, his plans for moon exploration, and the company’s relationship to startups were quite interesting.
He noted that we now know so much more about the moon than ever before, including that it does provide a number of resources that make it a good base for further space exploration. “The reason we need to go to space is to save the Earth,” he said. “We are going to grow this civilization — and I’m talking about something that our grandchildren will work on — and their grandchildren. This isn’t something that this generation is able to accomplish. But we need to move heavy industry off Earth.”
Amazon said Wednesday it expects to begin large-scale deliveries by drone in the coming months as it unveiled its newest design for its “Prime Air” fleet.
Jeff Wilke, head of Amazon’s consumer operations, told the company’s Machine Learning, Automation, Robotics and Space conference in Las Vegas that drones would play a role in ramping up efforts to shorten delivery times for many items to just one day for Amazon Prime members.
“We’ve been hard at work building fully electric drones that can fly up to 15 miles (25 kilometers) and deliver packages under five pounds (2.3 kilos) to customers in less than 30 minutes,” Wilke said in a blog post.
Self-driving cars could revolutionise people’s lives. By the end of the next decade, or perhaps even sooner, they could radically transform public spaces and liberate us from the many problems of mass car ownership. They’ll also be much better behaved than human drivers.
Robot drivers won’t break the speed limit, jump the lights, or park where they shouldn’t. They won’t drive under the influence of drink or drugs. They’ll never get tired or behave aggressively. They won’t be distracted by changing the music or sending a text, and they’ll never be trying to impress their mates.
Driverless cars could also change the face of public spaces. Private cars are very expensive items that do absolutely nothing 95% of the time. They are economically viable only because paying a taxi driver for all your car journeys would be even more expensive. Once cars don’t need human drivers, this cost balance should tip the other way.
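To make that cost balance concrete, here is a back-of-the-envelope sketch in Python. Every figure in it (annual distance, ownership cost, per-kilometre fares) is a hypothetical number chosen purely for illustration, not data from the article.

```python
# Back-of-the-envelope comparison of yearly transport costs.
# All numbers below are hypothetical illustrations.

annual_km = 12_000                     # assumed yearly driving distance

# Owning: the car sits idle ~95% of the time, but you pay for it regardless.
ownership_per_year = 6_000             # depreciation, insurance, fuel, parking

# Taxi today: you also pay for the driver's time, so the per-km price is high.
taxi_per_km = 1.50
taxi_per_year = annual_km * taxi_per_km

# Driverless ride service: no driver to pay, and the vehicle is shared,
# so its fixed costs are spread across many users.
robotaxi_per_km = 0.40
robotaxi_per_year = annual_km * robotaxi_per_km

for label, cost in [("own a car", ownership_per_year),
                    ("taxi everywhere", taxi_per_year),
                    ("driverless service", robotaxi_per_year)]:
    print(f"{label:>20}: ~{cost:,.0f} per year")
```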
When Elon Musk and DARPA both hop aboard the cyborg hype train, you know brain-machine interfaces (BMIs) are about to achieve the impossible.
BMIs, already the stuff of science fiction, facilitate crosstalk between biological wetware and external computers, turning human users into literal cyborgs. Yet mind-controlled robotic arms, microelectrode “nerve patches”, and “memory Band-Aids” are still purely experimental medical treatments for those with nervous system impairments.
With the Next-Generation Nonsurgical Neurotechnology (N3) program, DARPA is looking to expand BMIs to the military. This month, the project tapped six academic teams to engineer radically different BMIs to hook up machines to the brains of able-bodied soldiers. The goal is to ditch surgery altogether—while minimizing any biological interventions—to link up brain and machine.
Robert Downey Jr. doesn’t pretend to be a brilliant scientist — even though he’s played Tony Stark, aka Iron Man, for the past 11 years.
But on Tuesday night he attended Amazon’s brand-new, premier, open-to-the-public Machine Learning, Automation, Robotics and Space (re:MARS) conference in Las Vegas — a room filled with AI legends, astronauts, and other dignitaries — as a keynote speaker.
He delivered a gag-filled talk that somehow wove together the history of the Marvel Cinematic Universe, the evolution of Stark’s Iron Man suits, allusions to his own troubled history with drug addiction, the actual history of artificial intelligence and its pioneers, and a bunch of jokes involving the Amazon Alexa voice and Matt Damon (including a videotaped guest appearance by Damon).
Researchers, from biochemists to materials scientists, have long relied on the rich variety of organic molecules to solve pressing challenges. Some molecules may be useful in treating diseases, others for lighting our digital displays, still others for pigments, paints, and plastics. The unique properties of each molecule are determined by its structure—that is, by the connectivity of its constituent atoms. Once a promising structure is identified, there remains the difficult task of making the targeted molecule through a sequence of chemical reactions. But which ones?
Organic chemists generally work backwards from the target molecule to the starting materials using a process called retrosynthetic analysis. During this process, the chemist faces a series of complex and inter-related decisions. For instance, of the tens of thousands of different chemical reactions, which one should you choose to create the target molecule? Once that decision is made, you may find yourself with multiple reactant molecules needed for the reaction. If these molecules are not available to purchase, then how do you select the appropriate reactions to produce them? Intelligently choosing what to do at each step of this process is critical in navigating the huge number of possible paths.
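To see why intelligent choices matter, here is a toy sketch of retrosynthetic analysis as a recursive, depth-first search. The molecule names, the disconnection table, and the purchasable set are hypothetical placeholders rather than real chemistry; the point is the branching structure of the decisions described above.

```python
# Toy retrosynthetic search: work backwards from a target molecule until
# every precursor is purchasable. All names and tables are hypothetical.

# For each intermediate: candidate reactions that could form it, and the
# reactant molecules each reaction would require.
DISCONNECTIONS = {
    "target_drug": [("amide coupling", ["acid_A", "amine_B"]),
                    ("ester aminolysis", ["ester_C", "amine_B"])],
    "ester_C":     [("Fischer esterification", ["acid_A", "alcohol_D"])],
}
PURCHASABLE = {"acid_A", "amine_B", "alcohol_D"}


def retrosynthesize(molecule, depth=0, max_depth=5):
    """Return a list of (reaction, reactants) steps, or None if no route
    is found within the depth limit."""
    if molecule in PURCHASABLE:
        return []                                  # buy it; nothing to make
    if depth >= max_depth or molecule not in DISCONNECTIONS:
        return None                                # dead end

    # The central decision: which of the candidate reactions to pursue.
    for reaction, reactants in DISCONNECTIONS[molecule]:
        route = [(reaction, reactants)]
        for precursor in reactants:
            sub_route = retrosynthesize(precursor, depth + 1, max_depth)
            if sub_route is None:
                break                              # this branch fails
            route += sub_route
        else:
            return route                           # every precursor resolved
    return None


print(retrosynthesize("target_drug"))
```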
Researchers at Columbia Engineering have developed a new technique based on reinforcement learning that trains a neural network model to correctly select the “best” reaction at each step of the retrosynthetic process. This form of AI provides a framework for researchers to design chemical syntheses that optimize user-specified objectives such as synthesis cost, safety, and sustainability. The new approach, published May 31 in ACS Central Science, is more successful (by ~60%) than existing strategies for solving this challenging search problem.
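As a rough illustration of the general idea (a learned model scores the candidate reactions for the current molecule so the search can follow the most promising one), here is a minimal PyTorch-style sketch. The network shape, the fingerprint inputs, and the helper functions are assumptions made for this example; they are not the Columbia group’s actual architecture, objective, or training procedure.

```python
# Illustrative only: a small network that scores (molecule, reaction) pairs,
# plus a greedy selection helper. Not the published model.
import torch
import torch.nn as nn


class ReactionScorer(nn.Module):
    """Scores a (molecule, reaction) pair; higher means more promising."""

    def __init__(self, fingerprint_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(fingerprint_dim * 2, 64),
            nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, molecule_fp, reaction_fp):
        return self.net(torch.cat([molecule_fp, reaction_fp], dim=-1))


def choose_reaction(scorer, molecule_fp, candidate_reaction_fps):
    """Pick the candidate reaction the trained model rates highest."""
    scores = torch.stack(
        [scorer(molecule_fp, r).squeeze() for r in candidate_reaction_fps]
    )
    return int(scores.argmax())


# Example usage with random fingerprints (purely illustrative):
scorer = ReactionScorer()
mol = torch.randn(128)
candidates = [torch.randn(128) for _ in range(3)]
best = choose_reaction(scorer, mol, candidates)

# During training, reinforcement learning would adjust the scorer so that the
# routes it prefers minimize a user-specified objective (for example, estimated
# synthesis cost), rewarding choices that lead to cheap, purchasable reactants.
```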