
Over the past few decades, roboticists and computer scientists have developed artificial systems that replicate biological functions and human abilities in increasingly realistic ways. This includes artificial intelligence systems, as well as sensors that can capture various types of sensory data.

When trying to understand the properties of objects and how to grasp or handle them, humans often rely on their sense of touch. Artificial sensing systems that replicate human touch can thus be of great value, as they could enable the development of better-performing and more responsive robots or prosthetic limbs.

Researchers at Sungkyunkwan University and Hanyang University in South Korea have recently created an artificial tactile sensing system that mimics the way in which humans recognize objects in their surroundings via their sense of touch. This system, presented in a paper published in Nature Electronics, uses sensors to capture data associated with the tactile properties of objects.
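The excerpt does not detail the paper’s processing pipeline, but the general idea, turning raw tactile signals into object-level judgments, can be sketched. Below is a minimal, hypothetical Python example that classifies grasped objects by stiffness from simulated pressure traces; the sensor layout, features, and classes are illustrative assumptions, not the authors’ method.

```python
# Hypothetical sketch: classify objects from tactile pressure time-series.
# The sensor layout (16 taxels, 100 time steps) and the two-class setup are
# illustrative assumptions, not the configuration used in the paper.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def synthetic_grasp(stiffness: float, n_taxels: int = 16, n_steps: int = 100) -> np.ndarray:
    """Fake pressure traces: stiffer objects ramp up faster and plateau higher."""
    t = np.linspace(0.0, 1.0, n_steps)
    ramp = 1.0 - np.exp(-stiffness * t)              # contact transient
    noise = 0.05 * rng.standard_normal((n_taxels, n_steps))
    return stiffness * ramp[None, :] + noise         # shape: (taxels, time)

# Two object classes: "soft" (low stiffness) vs. "rigid" (high stiffness).
X, y = [], []
for label, stiffness in [(0, 2.0), (1, 8.0)]:
    for _ in range(200):
        grasp = synthetic_grasp(stiffness + rng.normal(0, 0.5))
        # Simple per-taxel summary features: mean, peak, and final pressure.
        feats = np.concatenate([grasp.mean(1), grasp.max(1), grasp[:, -1]])
        X.append(feats)
        y.append(label)

X_train, X_test, y_train, y_test = train_test_split(
    np.array(X), np.array(y), random_state=0
)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```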

Imagine clothing that can warm or cool you, depending on how you’re feeling. Or artificial skin that responds to touch and temperature, and wicks away moisture automatically. Or cyborg hands controlled by DNA motors that can adjust based on signals from the outside world.

Welcome to the era of intelligent matter: an unconventional AI computing idea woven directly into the fabric of synthetic matter. Powered by brain-based computing, these materials can weave the skins of soft robots or form microswarms of drug-delivering nanobots, all while conserving power as they learn and adapt.

Sound like sci-fi? It gets weirder. The key that will guide us toward intelligent matter, said Dr. W.H.P. Pernice at the University of Münster and colleagues, is a “brain” distributed across the material’s “body”, a structure far more alien than that of our own minds.

In the very last moments of the movie, however, you would also see something unusual: the sprouting of clouds of satellites, and the wrapping of the land and seas with wires made of metal and glass. You would see the sudden appearance of an intricate artificial planetary crust capable of tremendous feats of communication and calculation, enabling planetary self-awareness — indeed, planetary sapience.

The emergence of planetary-scale computation thus appears as both a geological and geophilosophical fact. In addition to evolving countless animal, vegetal and microbial species, Earth has also very recently evolved a smart exoskeleton, a distributed sensory organ and cognitive layer capable of calculating things like: How old is the planet? Is the planet getting warmer? The knowledge of “climate change” is an epistemological accomplishment of planetary-scale computation.

Over the past few centuries, humans have chaotically and in many cases accidentally transformed Earth’s ecosystems. Now, in response, the emergent intelligence represented by planetary-scale computation makes it possible, and indeed necessary, to conceive an intentional, directed and worthwhile planetary-scale terraforming. The vision for this is not to be found in computing infrastructure itself, but in the purposes to which we put it.

Stimulation of the nervous system with neurotechnology has opened up new avenues for treating human disorders, such as prosthetic arms and legs that restore the sense of touch in amputees, prosthetic fingertips that provide detailed sensory feedback with varying touch resolution, and intraneural stimulation to help the blind by giving sensations of sight.

Scientists in a European collaboration have shown that optic nerve stimulation is a promising neurotechnology to help the blind, with the constraint that current technology can provide only simple visual signals.

Nevertheless, the scientists’ vision (no pun intended) is to design these simple visual signals to be meaningful in assisting the blind with daily living. Optic nerve stimulation also avoids invasive procedures like directly stimulating the brain’s visual cortex. But how does one go about optimizing stimulation of the optic nerve to produce consistent and meaningful visual sensations?

Now, the results of a collaboration between EPFL, Scuola Superiore Sant’Anna and Scuola Internazionale Superiore di Studi Avanzati, published today in Patterns, show that a new stimulation protocol of the optic nerve is a promising way to develop personalized visual signals to help the blind, one that also takes into account signals from the visual cortex. For the moment, the protocol has been tested on convolutional neural networks (CNNs), models commonly used in computer vision for detecting and classifying objects, which here simulate the entire visual system. The scientists also performed psychophysical tests on ten healthy subjects, imitating what one would see from optic nerve stimulation, and showed that successful object identification is consistent with the results obtained from the CNN.

“We are not just trying to stimulate the optic nerve to elicit a visual perception,” explains Simone Romeni, EPFL scientist and first author of the study. “We are developing a way to optimize stimulation protocols that takes into account how the entire visual system responds to optic nerve stimulation.”
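The press release does not spell out the optimization itself, but the core loop, tuning stimulation parameters against a simulated visual system’s response, can be sketched. Here is a minimal, hypothetical PyTorch version: a frozen CNN stands in for the visual system, a fixed linear phosphene model maps electrode amplitudes to a percept image, and gradient descent adjusts the amplitudes so the percept is classified as an intended target. All names, shapes, and the phosphene model are illustrative assumptions, not the protocol from the Patterns paper.

```python
# Hypothetical sketch: optimize a stimulation pattern against a frozen CNN
# that stands in for the visual system. The phosphene model (a fixed linear
# map from electrode amplitudes to an image) is an illustrative assumption.
import torch
import torch.nn as nn

torch.manual_seed(0)

N_ELECTRODES = 60          # assumed electrode count on an optic-nerve cuff
IMG = 28                   # assumed "percept" resolution
TARGET_CLASS = 3           # object class the percept should evoke

# Frozen stand-in for the visual system: a tiny CNN classifier.
visual_system = nn.Sequential(
    nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(), nn.Linear(8 * 14 * 14, 10),
)
for p in visual_system.parameters():
    p.requires_grad_(False)

# Fixed phosphene model: each electrode lights up a random spatial pattern.
phosphene_map = torch.rand(N_ELECTRODES, IMG * IMG)

# Free parameters: per-electrode stimulation amplitudes, kept in [0, 1].
logits_amp = torch.zeros(N_ELECTRODES, requires_grad=True)
opt = torch.optim.Adam([logits_amp], lr=0.05)
loss_fn = nn.CrossEntropyLoss()

for step in range(200):
    amplitudes = torch.sigmoid(logits_amp)                   # bound to [0, 1]
    percept = (amplitudes @ phosphene_map).view(1, 1, IMG, IMG)
    scores = visual_system(percept)
    target = torch.tensor([TARGET_CLASS])
    # Encourage the target percept while penalizing total injected charge.
    loss = loss_fn(scores, target) + 0.01 * amplitudes.sum()
    opt.zero_grad()
    loss.backward()
    opt.step()

final_percept = (torch.sigmoid(logits_amp) @ phosphene_map).view(1, 1, IMG, IMG)
print("predicted class:", visual_system(final_percept).argmax().item())
```

The small penalty on total amplitude is one plausible way to bias the optimizer toward sparse, low-charge stimulation; in a real protocol such a regularizer would encode safety limits on injected current.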

Hugh Herr is building the next generation of bionic limbs, robotic prosthetics inspired by nature’s own designs. Herr lost both legs in a climbing accident 30 years ago; now, as the head of the MIT Media Lab’s Biomechatronics group, he shows his incredible technology with the help of ballroom dancer Adrianne Haslet-Davis, who lost her left leg in the 2013 Boston Marathon bombing.


You are on the Pro Robot channel, and today we are going to talk about the soldiers of the future. Exoskeletons, ballistic helmets, military suits, chips and more are already entering the arsenals of different countries. In this episode we will find out what the super-soldier of the future will be like and what developments are underway in the military industry. Watch the video to the end and share your opinion in the comments: will robots replace humans in military service?

0:00 In this video.
0:30 Combat glasses.
2:26 Devtac Ronin Kevlar ballistic helmet.
3:00 STILE smart fabric.
3:42 Stealth Cloak.
4:10 Future Soldier System Full Suit.
5:15 Sotnik Suit.
5:55 Military exoskeletons.
6:32 PowerWalk current generator exoskeletons.
7:00 Human Universal Load Carrier exoskeleton with hydraulic drive.
7:24 A flying suit for the military.
7:48 Jetpack.
8:09 Invasive chips and genetic engineering.
9:02 Man-Made Lightning.


“I didn’t think I would be emotional about this.”


It’s not just humans that use prosthetic limbs; wounded or disabled animals can benefit from them, too. In the past, we’ve reported on cats, dogs, and even an elephant that have been fitted with prostheses. The latest creature learning to walk on an artificial foot is an adorable duck named Waddles.

Waddles was born with a deformed leg, but his adoptive owner Ben Weinman wanted to help him live a better life. He contacted Derrick Campana, a certified pet prostheticist at Bionic Pets, who made a 3D-printed prosthetic leg and foot.

A clip from the NatGeo Wild series, The Wizard of Paws, was recently shared online, revealing the heartwarming moment when Waddles was fitted with his new leg. At first, he’s not quite sure what to make of it, but after a little encouragement from Weinman and Campana, he starts happily toddling along on both feet.




You’re on the PRO Robotics channel, and this is an issue of High Tech News: the latest news from Mars, the first flight of Elon Musk’s Starship around the Earth, artificial muscles, a desktop bioprinter, and why IBM is teaching artificial intelligence to code. All the most interesting technology news in one issue!

UCL researchers have created a strange robotic “third thumb” that attaches to the hand and adds a large extra digit on the opposite side of the hand from the thumb. Researchers found that using the robotic thumb can impact how the hand is represented in the brain. For the research, scientists trained people to use an extra robotic thumb and found they could effectively carry out dexterous tasks such as building a tower of blocks using a single hand with two thumbs.

Researchers said that participants trained to use the extra thumb increasingly felt like it was part of their body. Initially, the Third Thumb was part of a project seeking to reframe the way people view prosthetics from replacing a lost function to becoming an extension of the human body. UCL Professor Tamar Makin says body augmentation is a growing field aimed at extending the physical abilities of humans.

Using a robotic ‘Third Thumb’ can impact how the hand is represented in the brain, finds a new study led by UCL researchers.

The team trained people to use a robotic extra thumb and found they could effectively carry out dextrous tasks, like building a tower of blocks, with one hand (now with two thumbs). The researchers report in the journal Science Robotics that participants trained to use the thumb also increasingly felt like it was a part of their body.

Designer Dani Clode began developing the device, called the Third Thumb, as part of an award-winning graduate project at the Royal College of Art, seeking to reframe the way we view prosthetics, from replacing a lost function, to an extension of the human body. She was later invited to join Professor Tamar Makin’s team of neuroscientists at UCL who were investigating how the can adapt to body augmentation.