
If you’ve ever played the claw game at an arcade, you know how hard it is to grab and hold onto objects using robotic grippers. Imagine how much more nerve-wracking that game would be if, instead of plush stuffed animals, you were trying to grab a fragile piece of endangered coral or a priceless artifact from a sunken ship.

Most of today’s robotic grippers rely on embedded sensors, complex feedback loops, or advanced machine learning algorithms, combined with the skill of the operator, to grasp fragile or irregularly shaped objects. But researchers from the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) have demonstrated an easier way.

Taking inspiration from nature, they designed a new type of soft robotic gripper that uses a collection of thin tentacles to entangle and ensnare objects, similar to how jellyfish collect stunned prey. Alone, individual tentacles, or filaments, are weak. But together, the collection of filaments can grasp and securely hold heavy and oddly shaped objects. The gripper relies on simple inflation to wrap around objects and doesn’t require sensing, planning, or feedback control.


Retail AI is everywhere this holiday season — even if you don’t realize it.

Say you’re a fashion retailer. You’ve always had to try to predict trends — but now with a slowed supply chain, you have to look 12 months out instead of six.

Aside from the open-source nature of the project, the possible widespread applications of the technology also make it noteworthy. It could be a plausible alternative to mechanical traps, as well as to chemicals that often damage the environment and kill non-pest insect species. Not to mention, it’s cheaper (the paper notes that all devices cost no more than $250) and more compact than other current pest-control technologies.


That being said, although the prototype is suitable for academic research, there’s a lot more to be done before it can be deployed on a larger scale. For example, the paper notes that a smaller laser point would be more effective at killing the roaches but is difficult to implement experimentally. The ability to precisely control which parts of the cockroaches’ bodies were hit would also be helpful, the paper says.

“We got a thousand times improvement [in training performance per chip] over the last 10 years, and a lot of it has been due to number representation,” Bill Dally, chief scientist and senior vice president of research at Nvidia, said at the recent IEEE Symposium on Computer Arithmetic.
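Dally’s point about number representation refers to training hardware moving from 32-bit floating point toward narrower 16- and 8-bit formats, trading precision for speed and chip area. As a minimal sketch (not Nvidia’s implementation), Python’s standard `struct` module can show how much precision a value loses when stored at half the width:

```python
import struct

def round_trip(value: float, fmt: str) -> float:
    """Pack a float into the given binary format and unpack it back,
    mimicking the precision loss of storing it at that bit width."""
    return struct.unpack(fmt, struct.pack(fmt, value))[0]

x = 0.1                      # a value that is inexact in binary
fp32 = round_trip(x, "<f")   # 32-bit single precision (4 bytes)
fp16 = round_trip(x, "<e")   # 16-bit half precision  (2 bytes)

print(f"fp32 error: {abs(fp32 - x):.2e}")  # small rounding error
print(f"fp16 error: {abs(fp16 - x):.2e}")  # much larger rounding error
```

Halving the storage width roughly doubles how many operands fit in the same memory bandwidth and silicon, which is one reason narrower formats have driven much of the per-chip training speedup Dally describes.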

Pure water is an almost perfect insulator.

Yes, water found in nature conducts electricity – but that’s because of the impurities therein, which dissolve into free ions that allow an electric current to flow. Pure water only becomes “metallic” – electronically conductive – at extremely high pressures, beyond our current ability to produce in a lab.

But, as researchers demonstrated for the first time back in 2021, it’s not only high pressures that can induce this metallicity in pure water.

For proper operation, drones usually use accelerometers to determine the direction of gravity. In a new study published in Nature on October 19, 2022, a team of scientists from Delft University of Technology, the CNRS and Aix-Marseille University has shown that drones can estimate the direction of gravity by combining visual detection of movement with a model of how they move. These results may explain how flying insects determine the direction of gravity and are a major step toward the creation of tiny autonomous drones.

While drones typically use accelerometers to estimate the direction of gravity, the way flying insects achieve this has been shrouded in mystery until now, as they have no specific sense of acceleration. In this study, a European team of scientists led by the Delft University of Technology in the Netherlands and involving a CNRS researcher has shown that drones can assess gravity using visual motion detection and motion modeling together.

To develop this new principle, the scientists investigated optical flow, that is, how an individual perceives movement relative to its environment. It is the visual movement that sweeps across our retina when we move: when we are on a train, for example, trees next to the tracks pass by faster than distant mountains. Optical flow alone, however, is not enough for an insect to know the direction of gravity.
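The train example can be made concrete with the standard small-angle approximation for translational optical flow, in which the angular rate of a point in view scales as the observer’s speed divided by the point’s distance. This is a hypothetical illustration of the geometry, not the study’s actual model:

```python
def translational_flow(speed: float, distance: float) -> float:
    """Approximate angular optical-flow rate (rad/s) of a point at
    `distance` metres, viewed perpendicular to travel at `speed` m/s."""
    return speed / distance

train_speed = 30.0  # m/s, a hypothetical train speed

tree_flow = translational_flow(train_speed, 20.0)      # trackside tree
mountain_flow = translational_flow(train_speed, 5000.0)  # distant mountain

print(f"tree: {tree_flow:.3f} rad/s, mountain: {mountain_flow:.3f} rad/s")
```

Because the same angular rate can arise from a fast, distant object or a slow, nearby one, optical flow by itself is ambiguous about scale, which is why the drones must combine it with a model of their own motion to recover the direction of gravity.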