
EPFL scientists are developing new approaches to the control of robotic hands, in particular for amputees, that combine individual finger control with automation for improved grasping and manipulation. This interdisciplinary proof of concept between neuroengineering and robotics was successfully tested on three amputees and seven healthy subjects. The results are published in today’s issue of Nature Machine Intelligence.

The technology merges two concepts from two different fields. Implementing them together had never been done before in robotic hand control, and the work contributes to the emerging field of shared control in neuroprosthetics.

One concept, from neuroengineering, deciphers intended finger movement from muscular activity on the amputee’s stump, enabling individual finger control of the prosthetic hand, something that had never been done before. The other, from robotics, lets the robotic hand help take hold of objects and maintain contact with them for robust grasping.
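The two concepts above can be sketched as a simple shared-control loop: a decoder turns muscle signals into per-finger commands, and an automated grip controller takes over part of each command once contact with an object is sensed. Everything below (the linear decoder, channel counts, blending weights) is hypothetical, chosen only to illustrate the pipeline shape, not the authors' actual method.

```python
import numpy as np

def decode_fingers(emg_features, weights):
    """Linear decoder sketch: EMG feature vector -> per-finger flexion in [0, 1]."""
    raw = emg_features @ weights
    return np.clip(raw, 0.0, 1.0)

def shared_control(user_cmd, contact, grip_target=0.8, alpha=0.7):
    """Blend decoded user intent with an automated grasp-maintenance command.

    For fingers in contact with the object, the robot contributes a fraction
    `alpha` of the command to keep the grasp stable; for the rest, the user's
    decoded intent passes through unchanged.
    """
    auto_cmd = np.full_like(user_cmd, grip_target)
    return np.where(contact, alpha * auto_cmd + (1 - alpha) * user_cmd, user_cmd)

rng = np.random.default_rng(0)
weights = rng.uniform(0, 0.2, size=(8, 5))   # 8 EMG channels -> 5 fingers (made up)
emg = rng.uniform(0, 1, size=8)              # fabricated EMG features

user_cmd = decode_fingers(emg, weights)
contact = np.array([True, True, False, False, False])  # thumb and index touching
cmd = shared_control(user_cmd, contact)
print(cmd)
```

The design point this illustrates is that shared control is a blend, not a handoff: the user keeps individual finger authority, and automation only nudges fingers that are already gripping.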

Would you consent to a surveillance system that watches without video and listens without sound?

If your knee-jerk reaction is “no!”, followed by “huh?”, I’m with you. In a new paper in Applied Physics Letters, a Chinese team is wading into the complicated balance between privacy and safety with computers that can echolocate. By training AI to sift through signals from arrays of acoustic sensors, the system can gradually learn to parse your movements—standing, sitting, falling—using only ultrasonic sound.

According to study author Dr. Xinhua Guo at the Wuhan University of Technology, the system may be more palatable to privacy advocates than security cameras. Because it relies on ultrasonic waves—the type that bats use to navigate dark spaces—it doesn’t capture video or audio. It’ll track your body position, but not you per se.
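The sense-then-classify pipeline described above can be illustrated with a toy stand-in: each "frame" is a vector of echo energies from a sensor array, and a nearest-centroid classifier labels it as standing, sitting, or falling. The real system trains a neural network on actual ultrasonic signals; the sensor count, noise model, and classifier here are all fabricated for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
POSTURES = ["standing", "sitting", "falling"]

# Fabricated training data: 30 frames per posture from a 16-sensor array,
# each posture having its own characteristic mean echo profile.
profiles = {p: rng.uniform(0.2, 1.0, size=16) for p in POSTURES}
train = {p: profiles[p] + rng.normal(0, 0.05, size=(30, 16)) for p in POSTURES}

# "Training" here is just averaging frames into one centroid per posture.
centroids = {p: frames.mean(axis=0) for p, frames in train.items()}

def classify(frame):
    """Return the posture whose centroid is closest to this echo frame."""
    return min(POSTURES, key=lambda p: np.linalg.norm(frame - centroids[p]))

# A noisy new frame drawn from the "falling" profile should be recovered.
test_frame = profiles["falling"] + rng.normal(0, 0.05, size=16)
print(classify(test_frame))
```

The privacy argument maps directly onto this shape: only low-dimensional echo features ever exist in the pipeline, so there is no video or audio to leak.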

A bio-inspired bot uses water from the environment to create a gas and launch itself from the water’s surface.

The robot, which can travel 26 metres through the air after take-off, could be used to collect water samples in hazardous and cluttered environments, such as during flooding or when monitoring ocean pollution.

Robots that can transition from water to air are desirable in these situations, but the launch requires a lot of power, which has been difficult to achieve in small robots.

ARLINGTON, Va. — The Army successfully tested its ability to redirect munitions in flight Aug. 28 in an experiment over the Mojave Desert involving an unmanned aircraft, smart sensors and artificial intelligence.

It was the “signature experiment for FY19,” said Brig. Gen. Walter T. Rugen, director of the Future Vertical Lift Cross-Functional Team, speaking Thursday at the Association of the U.S. Army’s “Hot Topic” forum on aviation.

The experiment at Naval Air Weapons Station China Lake, California, tested a capability developed by his CFT called A3I, standing for Architecture, Automation, Autonomy and Interfaces.

Sept. 10 (UPI) — McDonald’s on Tuesday announced the acquisition of a company that will assist in automating its drive-through process.

The fast-food chain agreed to a deal to acquire Apprente, a California-based company that was founded in 2017 with a focus on creating voice-based platforms for “complex, multilingual, multi-accent and multi-item conversational ordering.”

McDonald’s said Apprente’s technology will be used to allow for faster, simpler and more accurate order taking at its drive-throughs and may later be incorporated into mobile ordering and kiosks.
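"Multi-item conversational ordering" of the kind described above can be sketched with a tiny rule-based parser that pulls (item, quantity) pairs out of a spoken-style utterance. The menu, number words, and regex below are all hypothetical and have nothing to do with Apprente's actual technology, which handles far messier speech.

```python
import re

MENU = {"big mac": 3.99, "fries": 1.89, "mcflurry": 2.49}       # made-up prices
NUMBER_WORDS = {"a": 1, "an": 1, "one": 1, "two": 2, "three": 3}

def parse_order(utterance):
    """Extract (item, quantity) pairs from a spoken-style order string."""
    text = utterance.lower()
    order = []
    for item in MENU:
        # Look for an optional quantity word right before the item name.
        m = re.search(r"(?:(\w+)\s+)?" + re.escape(item), text)
        if m:
            qty = NUMBER_WORDS.get(m.group(1), 1)
            order.append((item, qty))
    return order

order = parse_order("I'd like two big macs and a McFlurry")
total = sum(MENU[item] * qty for item, qty in order)
print(order, round(total, 2))
```

Even this toy version shows why the problem is hard: quantities, plurals, and multiple items all interact in one utterance, and a production system must also handle accents, corrections, and noise.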