What might life be like once autonomous vehicles populate the roads? With the help of colleague Timothy Bonds, RAND’s Nidhi Kalra described what may occur when autonomous vehicles “democratize transportation.” Read our recap from #PoliticsAside: r.rand.org/326y
When we humans see a photo of a dog bounding across the lawn, it’s pretty easy to imagine how the following moments played out. Scientists at MIT have now trained machines to do the same thing, with artificial intelligence software that can take a single image and use it to create a short video of the seconds that followed. The technology is still bare-bones, but it could one day make for smarter self-driving cars that are better prepared for the unexpected, among other applications.
The software uses a deep-learning algorithm that was trained on two million unlabeled videos amounting to about a year’s worth of footage. It actually consists of two separate neural networks that compete with one another: the first learns to separate the foreground from the background and to identify the object in the image, which allows the model to determine what is moving and what isn’t, while the second learns to judge whether a generated clip looks realistic, pushing the first to improve.
According to the scientists, this approach improves on other computer vision technologies under development that can also generate video of the future. Those systems take the information available in existing videos and extend it with computer-generated imagery, building each frame one at a time. The new software is claimed to be more accurate, producing up to 32 frames per second and building out entire scenes in one go.
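The two-network setup can be sketched in miniature. The code below is an illustrative toy, not MIT’s actual model: instead of video frames it uses single numbers, with a scalar “generator” learning to mimic samples from a fixed Gaussian while a scalar “discriminator” tries to tell real samples from fake ones.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

w_g, b_g = 0.1, 0.0   # generator parameters: g(z) = w_g*z + b_g
w_d, b_d = 0.1, 0.0   # discriminator parameters: d(x) = sigmoid(w_d*x + b_d)
lr = 0.05

for _ in range(2000):
    z = rng.normal()                 # noise fed to the generator
    x_real = rng.normal(4.0, 0.5)    # a "real" sample, drawn from N(4, 0.5)
    x_fake = w_g * z + b_g           # the generator's current fake

    # Discriminator step: push d(real) toward 1 and d(fake) toward 0.
    for x, label in ((x_real, 1.0), (x_fake, 0.0)):
        p = sigmoid(w_d * x + b_d)
        g = p - label                # gradient of log-loss w.r.t. the logit
        w_d -= lr * g * x
        b_d -= lr * g

    # Generator step: nudge the fake toward whatever fools the discriminator.
    x_fake = w_g * z + b_g
    g = (sigmoid(w_d * x_fake + b_d) - 1.0) * w_d
    w_g -= lr * g * z
    b_g -= lr * g

# Generated samples should have drifted toward the real data's range.
samples = w_g * rng.normal(size=1000) + b_g
print(round(float(samples.mean()), 1))
```

The real system plays the same game with video clips in place of numbers and deep convolutional networks in place of these two-parameter models.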
Since the Large Hadron Collider (LHC) needs to be in tip-top shape to discover new particles, it has two inspectors making sure everything’s in working order. Both are called TIM, short not for Timothy but for Train Inspection Monorail. These mini autonomous monorails, which keep an eye on the world’s largest particle collider, follow a pre-defined route and get around on tracks suspended from the ceiling. According to CERN’s post introducing the machines, the tracks are remnants from the time the tunnel housed the Large Electron–Positron (LEP) collider instead of the LHC. The LEP’s monorail was bigger but not quite as high-tech: it was mainly used to transport materials and workers.
As for what the machines can do, the answer is “quite a few things.” They can monitor the tunnel’s structure, oxygen percentage, temperature and communication bandwidth in real time. Both TIMs can also take visual and infrared images, as well as pull small wagons behind them if needed. You can watch them in action below — as you can see, they’re not much to look at with their boxy silver appearance. But without them, it would be tough to monitor a massive circular tunnel with a 17-mile circumference.
We took the technology out of the studio and into a car – making Holoportation truly mobile. To accomplish this, we reduced the bandwidth requirements by 97%, while still maintaining quality. This new mobile Holoportation system greatly increases the potential applications of real-time 3D capture and transmission.
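The announcement gives only the relative figure, a 97% reduction. Plugging in a hypothetical studio-grade stream shows the scale of the saving (the 2 Gbit/s figure below is an assumption for illustration, not a number from the source):

```python
# Hypothetical baseline: assume the studio system needed 2 Gbit/s.
studio_mbps = 2000.0                     # assumed, not from the source
mobile_mbps = studio_mbps * (1 - 0.97)   # the stated 97% reduction
print(round(mobile_mbps))                # roughly 60 Mbit/s
```

At that scale, a stream that once demanded dedicated studio links would fit within a strong LTE connection, which is what makes an in-car demo plausible.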
Nice update, and glad the author mentioned Airbus and Gooch & Housego, as I often see these two contributors missed in QC roadmaps and lists of companies engaged in QC activities. Airbus has been heavily involved with QC research and development for a few years now.
A lot has happened since we first heard about the AirMule, a prototype VTOL (Vertical Takeoff and Landing) aircraft that features internal rotor blades – these work along with the horizontal-thrust ducted fans visible at the rear. First, it made some tethered autonomous test flights. Then, it flew untethered for a short distance. Now, known as the Cormorant UAV, it’s made its first full untethered autonomous flight … although there were a couple of hiccups.
Built by Israeli firm Tactical Robotics, the Cormorant is designed to deliver troops, civilian passengers or other cargo within tight quarters where helicopters with exposed rotor blades just can’t go. With the UAV in its name standing for Unmanned Aerial Vehicle, the idea is that it will perform these tasks either autonomously or by remote control.
The latest test took place in Israel on Nov. 3rd, lasting only about two minutes and involving low flight over uneven terrain. While the aircraft did successfully demonstrate autonomous flight modes such as takeoff, climb, acceleration, cruise, deceleration, descent, turns, hover and touchdown, it is hoped that subsequent flights will be able to smooth out the transitions between those modes.
NRG-X is the world’s first fully automatic charging solution, providing efficient, high-power energy transfer with a wide range of parking tolerance, and it can be easily retrofitted to almost any electric vehicle. This makes NRG-X the ideal solution for convenient everyday charging of your electric vehicle, primarily at home. The system comprises two basic components:
Artificial intelligence (AI) is all the rage these days. However, people often overlook the fact that it’s a truly ancient vogue. I can’t think of another current high-tech mania whose hype curve got going during the days when Ike was in the White House, “I Love Lucy” was on the small screen, and programming in assembly language was state of the art.
As AI’s adoption grows, we run the risk of belittling the technology’s potential if we continue to fixate on the notion that it’s “artificial.” When you think about it, all technologies are artificial, pretty much by definition. Cars are artificial transportation, houses are artificial shelters, and so on.
SALT LAKE CITY, Nov. 14, 2016 /PRNewswire/ -- IBM (NYSE: IBM) and NVIDIA (NASDAQ: NVDA) today announced a collaboration on a new deep learning tool optimized for the latest IBM and NVIDIA technologies to help train computers to think and learn in more human-like ways at a faster pace.
Deep learning is a fast-growing machine learning method that extracts information by crunching through millions of pieces of data to detect and rank the most important aspects of that data. Already popular among leading consumer web and mobile application companies, it is quickly being adopted by more traditional business enterprises.
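“Detecting and ranking the most important aspects of the data” can be shown in miniature with a model far simpler than a production deep-learning stack. In this toy (entirely illustrative, not IBM’s or NVIDIA’s tool), only the first of three synthetic features actually predicts the label, and training recovers that ranking:

```python
import numpy as np

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 3))       # three candidate features
y = (X[:, 0] > 0).astype(float)     # the label depends on feature 0 alone

# Tiny logistic regression trained by gradient descent on log-loss.
w = np.zeros(3)
for _ in range(300):
    p = 1.0 / (1.0 + np.exp(-X @ w))     # sigmoid predictions
    w -= 0.1 * X.T @ (p - y) / len(y)    # gradient step

# Rank features by the magnitude of their learned weights.
ranking = np.argsort(-np.abs(w))
print(int(ranking[0]))                   # index of the most influential feature
```

Deep networks do the same thing at vastly larger scale, with many nonlinear layers learning which aspects of millions of examples matter most.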
Deep learning and other artificial intelligence capabilities are being used across a wide range of industry sectors: in banking, to advance fraud detection through facial recognition; in automotive, for self-driving automobiles; and in retail, for fully automated call centers with computers that can better understand speech and answer questions.
No autonomous cars, planes, ships, weapons (not sure I would even still want these), and other robots for me until we have our Net and other infrastructure replaced with QC.
It seems that all of Silicon Valley is designing artificial intelligence for driverless cars. But before we hand over our driving to computers, Charlie Miller, a well-known computer security researcher, would like car companies to pay attention to security.
Miller, who is a security engineer at Uber’s advanced technology center, spent a few years looking into the security of automobiles. And what he found didn’t impress him. He and his friend Chris Valasek hacked a Jeep remotely in 2015, and, after a series of denials from the car company, Chrysler had to announce a recall of 1.4 million vehicles. Miller gave a scary and hilarious talk at the recent ARM TechCon event in Santa Clara, Calif.