
Robots activated by water may be the next frontier

New research from the laboratory of Ozgur Sahin, associate professor of biological sciences and physics at Columbia University, shows that materials can be fabricated to create soft actuators—devices that convert energy into physical motion—that are strong and flexible, and, most important, resistant to water damage.

“There’s a growing trend of making anything we interact with and touch from materials that are dynamic and responsive to the environment,” Sahin says. “We found a way to develop a material that is water-resistant yet, at the same time, equipped to harness water to deliver the force and motion needed to actuate.”

The research was published online May 21 in Advanced Materials Technologies.

Read more

In Ford’s future, two-legged robots and self-driving cars could team up on deliveries

Autonomous vehicles might someday be able to navigate bustling city streets to deliver groceries, pizzas, and other packages without a human behind the wheel. But that doesn’t solve what Ford Motor CTO Ken Washington describes as the last 50-foot problem.

Ford and startup Agility Robotics are partnering in a research project that will test how two-legged robots and self-driving vehicles can work together to solve that curb-to-door problem. Agility’s Digit, a two-legged robot that has a lidar where its head should be, will be used in the project. The robot, which is capable of lifting 40 pounds, can ride along in a self-driving vehicle and be deployed when needed to deliver packages.

“We’re looking at the opportunity of autonomous vehicles through the lens of the consumer and we know from some early experimentation that there are challenges with the last 50 feet,” Washington told TechCrunch in a recent interview. Finding a solution could be an important differentiator for Ford’s commercial robotaxi service, which it plans to launch in 2021.

Read more

SpotMini Autonomous Navigation

SpotMini autonomously navigates a specified route through an office and lab facility. Before the test, the robot is manually driven through the space so it can build a map of the space using visual data from cameras mounted on the front, back and sides of the robot. During the autonomous run, SpotMini uses data from the cameras to localize itself in the map and to detect and avoid obstacles. Once the operator presses ‘GO’ at the beginning of the video, the robot is on its own. Total walk time for this route is just over 6 minutes. (The QR codes visible in the video are used to measure performance, not for navigation.)
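The map-then-localize loop described here is a standard pattern in mobile robotics. As a rough illustration only (the landmark names, helper functions, and geometry below are made-up stand-ins, not Boston Dynamics’ actual software), a minimal sketch of the two phases might look like this:

```python
# Phase 1: build a map from landmarks recorded on the manually driven pass.
# Phase 2: localize against that map and sidestep obstacles on the autonomous run.
# Simple named landmarks stand in for the camera features a real system would use.
import math

def build_map(manual_drive_observations):
    """Record the position of each landmark seen during the manual drive."""
    return {name: pos for name, pos in manual_drive_observations}

def localize(observations, prior_map):
    """Estimate the robot's position by averaging (map position - observed offset)."""
    estimates = [
        (prior_map[name][0] - dx, prior_map[name][1] - dy)
        for name, (dx, dy) in observations
        if name in prior_map
    ]
    xs, ys = zip(*estimates)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def plan_step(pose, waypoint, obstacles, step=0.5):
    """Step toward the waypoint, turning aside if an obstacle is within 1 m."""
    dx, dy = waypoint[0] - pose[0], waypoint[1] - pose[1]
    dist = math.hypot(dx, dy) or 1e-9
    ux, uy = dx / dist, dy / dist
    for ox, oy in obstacles:
        if math.hypot(ox - pose[0], oy - pose[1]) < 1.0:
            ux, uy = -uy, ux      # rotate 90 degrees to skirt the obstacle
            break
    return (pose[0] + step * ux, pose[1] + step * uy)

# Toy run: two mapped landmarks, one obstacle on the way to the waypoint.
prior_map = build_map([("door", (0.0, 0.0)), ("desk", (4.0, 0.0))])
pose = localize([("door", (-2.0, 0.0)), ("desk", (2.0, 0.0))], prior_map)
print(plan_step(pose, waypoint=(6.0, 0.0), obstacles=[(2.5, 0.0)]))
```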

Read more

An advance toward controlling one to a few hundred atoms at microsecond timescales using AI control of electron beams

The work should lead to the ability to control one to a few hundred atoms at microsecond timescales using AI control of electron beams. The computational and analytical framework developed in this work is general and can help develop techniques for controlling single-atom dynamics in 3D materials and, ultimately, scale up the manipulation of multiple atoms to assemble 1 to 1,000 atoms with high speed and efficacy.

Scientists at MIT, the University of Vienna, and several other institutions have taken a step toward developing a method that can reposition atoms with a highly focused electron beam and control their exact location and bonding orientation. The finding could ultimately lead to new ways of making quantum computing devices or sensors, and usher in a new age of “atomic engineering,” they say.

This could help make quantum sensors and computers.

Read more

All the buzz about NASA’s new fleet of space bees

Robot bees are no replacement for our vital pollinators here on Earth. Up on the International Space Station, however, robots bearing the bee name could help spacefaring humans save precious time.

On Friday, NASA astronaut Anne McClain took one of the trio of Astrobees out for a spin. Bumble and its companion Honey both arrived on the ISS a month ago, and are currently going through a series of checks. Bumble passed the first hurdle when McClain manually flew it around the Japanese Experiment Module. Bumble took photos of the module which will be used to make a map for all the Astrobees, guiding them as they begin their tests there.

The three cube-shaped robots (Queen will arrive from Earth in the SpaceX resupply mission this July) don’t look anything like their namesakes, but they are non-threatening by design, says Astrobee project manager Maria Bualat. Since they’re built to fly around autonomously, doing tasks for the crew of the International Space Station, “one of our hardest problems is actually dealing with safety concerns,” she says.

Read more

Dog-like robot jumps, flips and trots

Putting their own twist on robots that amble through complicated landscapes, the Stanford Student Robotics club’s Extreme Mobility team has developed a four-legged robot that is not only capable of performing acrobatic tricks and traversing challenging terrain but is also designed with reproducibility in mind. Anyone who wants their own version of the robot, dubbed Stanford Doggo, can consult comprehensive plans, code and a supply list that the students have made freely available online.

“We had seen these other quadruped robots used in research, but they weren’t something that you could bring into your own lab and use for your own projects,” said Nathan Kau, ’20, a mechanical engineering major and lead for Extreme Mobility. “We wanted Stanford Doggo to be this open source project that you could build yourself on a relatively small budget.”

Whereas other similar robots can cost tens or hundreds of thousands of dollars and require customized parts, the Extreme Mobility students estimate the cost of Stanford Doggo at less than $3,000—including manufacturing and shipping costs—and nearly all the components can be bought as-is online. They hope the accessibility of these resources inspires a community of Stanford Doggo makers and researchers who develop innovative and meaningful spinoffs from their work.

Read more

Researchers teach robots handwriting and drawing

An algorithm developed by Brown University computer scientists enables robots to put pen to paper, writing words using stroke patterns similar to human handwriting. It’s a step, the researchers say, toward robots that are able to communicate more fluently with human co-workers and collaborators.

“Just by looking at a target image of a word or sketch, the robot can reproduce each stroke as one continuous action,” said Atsunobu Kotani, an undergraduate student at Brown who led the algorithm’s development. “That makes it hard for people to distinguish if it was written by the robot or actually written by a human.”

The algorithm makes use of deep learning networks that analyze images of handwritten words or sketches and can deduce the likely series of pen strokes that created them. The robot can then reproduce the words or sketches using the pen strokes it learned. In a paper to be presented at this month’s International Conference on Robotics and Automation, the researchers demonstrate a robot that was able to write “hello” in 10 languages that employ different character sets. The robot was also able to reproduce rough sketches, including one of the Mona Lisa.
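As a rough illustration of the image-to-stroke idea, a minimal sketch might pair a convolutional encoder that summarizes the target image with a recurrent decoder that emits one pen move per step (dx, dy, pen-down). The architecture, layer sizes, and stroke format below are illustrative assumptions of this sketch, not the Brown group’s published model:

```python
# A small image-to-stroke-sequence model: CNN encoder over the target image,
# GRU decoder that outputs a fixed-length sequence of pen moves.
import torch
import torch.nn as nn

class Image2Strokes(nn.Module):
    def __init__(self, hidden=128, steps=64):
        super().__init__()
        self.steps = steps
        self.encoder = nn.Sequential(                    # 1x64x64 grayscale image in
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, hidden),
        )
        self.decoder = nn.GRU(input_size=hidden, hidden_size=hidden,
                              batch_first=True)
        self.head = nn.Linear(hidden, 3)                 # dx, dy, pen-down logit

    def forward(self, image):
        code = self.encoder(image)                       # (B, hidden)
        seq = code.unsqueeze(1).repeat(1, self.steps, 1) # feed the code at each step
        out, _ = self.decoder(seq)
        return self.head(out)                            # (B, steps, 3)

# Forward pass on a dummy 64x64 target image of a word or sketch.
model = Image2Strokes()
strokes = model(torch.randn(1, 1, 64, 64))
print(strokes.shape)   # torch.Size([1, 64, 3])
```

In a trained system, the predicted (dx, dy, pen-down) sequence would be executed stroke by stroke by the robot arm, which is what lets it reproduce each word or sketch as a continuous series of pen motions.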

Read more
