
Autonomous vehicles might someday be able to navigate bustling city streets to deliver groceries, pizzas, and other packages without a human behind the wheel. But that doesn’t solve what Ford Motor CTO Ken Washington describes as the last 50-foot problem.

Ford and startup Agility Robotics are partnering in a research project that will test how two-legged robots and self-driving vehicles can work together to solve that curb-to-door problem. Agility’s Digit, a two-legged robot with a lidar where its head should be, will be used in the project. The robot, which is capable of lifting 40 pounds, can ride along in a self-driving vehicle and be deployed when needed to deliver packages.

“We’re looking at the opportunity of autonomous vehicles through the lens of the consumer and we know from some early experimentation that there are challenges with the last 50 feet,” Washington told TechCrunch in a recent interview. Finding a solution could be an important differentiator for Ford’s commercial robotaxi service, which it plans to launch in 2021.

Read more

SpotMini autonomously navigates a specified route through an office and lab facility. Before the test, the robot is manually driven through the space so it can build a map using visual data from cameras mounted on its front, back and sides. During the autonomous run, SpotMini uses data from the cameras to localize itself in the map and to detect and avoid obstacles. Once the operator presses ‘GO’ at the beginning of the video, the robot is on its own. Total walk time for this route is just over 6 minutes. (The QR codes visible in the video are used to measure performance, not for navigation.)
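As a rough illustration of that two-pass pattern (build a map during the manual drive-through, then localize against it on the autonomous run), here is a toy Python sketch. Every function name and simplification in it is hypothetical; it reduces landmarks to 2D points and is in no way Boston Dynamics’ actual software.

```python
# Toy sketch of "map first, then localize and avoid obstacles".
# Landmarks are 2D points; the "camera" just reports which landmarks are visible.

import math

def build_map(drive_through_observations):
    """Pass 1: manual drive-through. Record where each landmark was seen."""
    landmark_map = {}
    for landmark_id, position in drive_through_observations:
        landmark_map[landmark_id] = position
    return landmark_map

def localize(visible_landmark_ids, landmark_map):
    """Pass 2: estimate pose as the centroid of currently visible, known landmarks."""
    points = [landmark_map[i] for i in visible_landmark_ids if i in landmark_map]
    if not points:
        return None
    xs, ys = zip(*points)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def step_toward(pose, waypoint, obstacles, step=0.5):
    """Move a small step toward the waypoint, detouring if the step is blocked."""
    dx, dy = waypoint[0] - pose[0], waypoint[1] - pose[1]
    dist = math.hypot(dx, dy) or 1.0
    nxt = (pose[0] + step * dx / dist, pose[1] + step * dy / dist)
    if any(math.hypot(nxt[0] - ox, nxt[1] - oy) < 0.5 for ox, oy in obstacles):
        nxt = (pose[0], pose[1] + step)  # crude sidestep around the obstacle
    return nxt

# Usage: build the map from a "manual drive", then walk a route on its own.
landmark_map = build_map([("door", (0, 0)), ("desk", (2, 1)), ("rack", (4, 3))])
pose = localize(["door", "desk"], landmark_map)
for _ in range(10):
    if math.hypot(pose[0] - 4, pose[1] - 3) < 0.5:
        break
    pose = step_toward(pose, waypoint=(4, 3), obstacles=[(2, 2)])
print(pose)
```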

Read more

The work should lead to the ability to control from one to a few hundred atoms at microsecond timescales using AI-guided electron beams. The computational and analytical framework developed in this work is general, and can further help develop techniques for controlling single-atom dynamics in 3D materials and, ultimately, scaling up the manipulation of multiple atoms to assemble 1 to 1,000 atoms with high speed and efficacy.

Scientists at MIT, the University of Vienna, and several other institutions have taken a step toward developing a method that can reposition atoms with a highly focused electron beam and control their exact location and bonding orientation. The finding could ultimately lead to new ways of making quantum computing devices or sensors, and usher in a new age of “atomic engineering,” they say.

This could help make quantum sensors and computers.

Read more

Robot bees are no replacement for our vital pollinators here on Earth. Up on the International Space Station, however, robots bearing the bee name could help spacefaring humans save precious time.

On Friday, NASA astronaut Anne McClain took one of the trio of Astrobees out for a spin. Bumble and its companion Honey both arrived on the ISS a month ago and are currently going through a series of checks. Bumble passed the first hurdle when McClain manually flew it around the Japanese Experiment Module. Bumble took photos of the module, which will be used to build a map for all the Astrobees, guiding them as they begin their tests there.

The three cube-shaped robots (Queen will arrive from Earth in the SpaceX resupply mission this July) don’t look anything like their namesakes, but they are non-threatening by design, says Astrobee project manager Maria Bualat. Since they’re built to fly around autonomously, doing tasks for the crew of the International Space Station, “one of our hardest problems is actually dealing with safety concerns,” she says.

Read more

Putting their own twist on robots that amble through complicated landscapes, the Stanford Student Robotics club’s Extreme Mobility team has developed a four-legged robot that is not only capable of performing acrobatic tricks and traversing challenging terrain but is also designed with reproducibility in mind. Anyone who wants their own version of the robot, dubbed Stanford Doggo, can consult comprehensive plans, code and a supply list that the students have made freely available online.

“We had seen these other quadruped robots used in research, but they weren’t something that you could bring into your own lab and use for your own projects,” said Nathan Kau, ’20, a mechanical engineering major and lead for Extreme Mobility. “We wanted Stanford Doggo to be this open source robot that you could build yourself on a relatively small budget.”

Whereas other similar robots can cost tens or hundreds of thousands of dollars and require customized parts, the Extreme Mobility students estimate the cost of Stanford Doggo at less than $3,000—including manufacturing and shipping costs—and nearly all the components can be bought as-is online. They hope the accessibility of these resources inspires a community of Stanford Doggo makers and researchers who develop innovative and meaningful spinoffs from their work.

Read more

An algorithm developed by Brown University computer scientists enables robots to put pen to paper, writing words using stroke patterns similar to human handwriting. It’s a step, the researchers say, toward robots that are able to communicate more fluently with human co-workers and collaborators.

“Just by looking at a target image of a word or sketch, the robot can reproduce each stroke as one continuous action,” said Atsunobu Kotani, an undergraduate student at Brown who led the algorithm’s development. “That makes it hard for people to distinguish if it was written by the robot or actually written by a human.”

The algorithm makes use of deep learning networks that analyze images of handwritten words or sketches and can deduce the likely series of pen strokes that created them. The robot can then reproduce the words or sketches using the pen strokes it learned. In a paper to be presented at this month’s International Conference on Robotics and Automation, the researchers demonstrate a robot that was able to write “hello” in 10 languages that employ different character sets. The robot was also able to reproduce rough sketches, including one of the Mona Lisa.
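To make the image-to-strokes idea concrete, here is a minimal PyTorch sketch of one plausible wiring: a small convolutional encoder summarizes the handwriting image, and a recurrent decoder rolls out pen moves, each step being a (dx, dy) displacement plus a pen-down probability. The architecture, layer sizes, and names are assumptions for illustration, not the Brown researchers’ published model.

```python
# Illustrative sketch (not the Brown system): an image of a handwritten word in,
# a sequence of pen-stroke steps (dx, dy, pen-down probability) out.

import torch
import torch.nn as nn

class ImageToStrokes(nn.Module):
    def __init__(self, hidden=256, max_steps=100):
        super().__init__()
        self.max_steps = max_steps
        # Encoder: compress the grayscale handwriting image into a feature vector.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, hidden),
        )
        # Decoder: roll out one stroke step at a time.
        self.decoder = nn.GRUCell(3, hidden)   # input: previous (dx, dy, pen)
        self.head = nn.Linear(hidden, 3)       # output: next (dx, dy, pen logit)

    def forward(self, image):
        h = self.encoder(image)                      # (batch, hidden)
        step = torch.zeros(image.size(0), 3)         # start token: no motion, pen up
        strokes = []
        for _ in range(self.max_steps):
            h = self.decoder(step, h)
            out = self.head(h)
            dxdy = out[:, :2]
            pen = torch.sigmoid(out[:, 2:])          # probability the pen is down
            step = torch.cat([dxdy, pen], dim=1)
            strokes.append(step)
        return torch.stack(strokes, dim=1)           # (batch, max_steps, 3)

# Usage: a 64x64 grayscale crop of a word in, a stroke sequence out.
model = ImageToStrokes()
fake_word_image = torch.randn(1, 1, 64, 64)
stroke_sequence = model(fake_word_image)
print(stroke_sequence.shape)  # torch.Size([1, 100, 3])
```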

Read more

There are about half a dozen technological approaches to quantum computing vying for preeminence these days. The ion trap method differs from the most popular approach—the silicon chip-based “superconducting qubit”—preferred by the likes of IBM, Google, Intel, and other tech giants. Honeywell, the industrial conglomerate, is one of the few companies pursuing the ion trap approach along with IonQ.

“Quantum computers can potentially solve many of the problems we have today,” IonQ CEO Peter Chapman told Fortune on a call. He listed potential areas of impact, such as drug discovery, energy, logistics, materials science, and A.I. techniques. “How would you not want to be part of that?”

“This is a once-in-a-generation type opportunity,” said Andrew Schoen, a principal at New Enterprise Associates, IonQ’s first backer. “We view this as a chance to build the next Intel.”

Read more