
A synthetic skin for prosthetic limbs that can generate its own energy from solar power has been developed by engineers at Glasgow University.

Researchers had already created an ‘electronic skin’ for prosthetic hands made with the new super-material graphene.

The new skin was much more sensitive to touch but needed a power source to operate its sensors.

Exoskeleton for real-world adoption.

Researchers have developed a super smart, or “learned,” controller that leverages data-intensive artificial intelligence (AI) and computer simulations to train portable, robotic exoskeletons.

This new controller provides smooth, continuous torque assistance for walking, running, or climbing…
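As a rough illustration of what a learned torque controller could look like, the sketch below maps a short history of hip kinematics through a small neural-network policy to a continuous assistance torque. The network shape, sensor inputs, and torque limit are assumptions for illustration, not the published design, and the weights are random placeholders standing in for simulation-trained parameters.

```python
import numpy as np

# Hypothetical learned torque policy: maps recent hip angle and angular
# velocity samples to a continuous assistance torque. Real weights would come
# from training against large-scale gait simulations; these are placeholders.
rng = np.random.default_rng(0)

HISTORY = 10                               # samples of recent kinematics fed to the policy
W1 = rng.normal(scale=0.1, size=(2 * HISTORY, 32))
b1 = np.zeros(32)
W2 = rng.normal(scale=0.1, size=(32, 1))
b2 = np.zeros(1)
MAX_TORQUE = 15.0                          # N·m, illustrative actuator limit

def assist_torque(angle_hist, velocity_hist):
    """Continuous torque command from recent hip kinematics."""
    x = np.concatenate([angle_hist, velocity_hist])
    h = np.tanh(x @ W1 + b1)               # small MLP policy
    torque = np.tanh(h @ W2 + b2)[0]       # bounded output in [-1, 1]
    return MAX_TORQUE * torque

# Control-loop sketch: stream sensor data, command a torque every cycle.
angle = np.zeros(HISTORY)
vel = np.zeros(HISTORY)
for t in range(200):                                   # e.g. 200 controller ticks
    new_angle = 0.4 * np.sin(2 * np.pi * t / 200)      # fake gait signal
    vel = np.roll(vel, -1); vel[-1] = new_angle - angle[-1]
    angle = np.roll(angle, -1); angle[-1] = new_angle
    tau = assist_torque(angle, vel)
    # here `tau` would be sent to the exoskeleton's hip actuator
```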


Researchers at Cambridge have shown that the Third Thumb, a robotic prosthetic, can be quickly mastered by the public, enhancing manual dexterity. The study stresses the importance of inclusive design to ensure technologies benefit everyone, with significant findings on performance across different demographics.

Cambridge researchers demonstrated that people can rapidly learn to control a prosthetic extra thumb, known as a “third thumb,” and use it effectively to grasp and handle objects.

The team tested the robotic device on a diverse range of participants, which they say is essential for ensuring new technologies are inclusive and can work for everyone.

Current AI training methods burn colossal amounts of energy to learn, but the human brain sips just 20 W. Swiss startup FinalSpark is now selling access to cyborg biocomputers, running up to four living human brain organoids wired into silicon chips.

The human brain communicates within itself and with the rest of the body mainly through electrical signals; sights, sounds and sensations are all converted into electrical pulses before our brains can perceive them. This makes brain tissue highly compatible with silicon chips, at least for as long as you can keep it alive.

For FinalSpark’s Neuroplatform, brain organoids comprising about 10,000 living neurons are grown from stem cells. These little balls, about 0.5 mm (0.02 in) in diameter, are kept in incubators at around body temperature, supplied with water and nutrients and protected from bacterial or viral contamination, and they’re wired into an electrical circuit with a series of tiny electrodes.
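The wiring described above is essentially a closed loop between an electrode array and software: pulse a few electrodes, record how the organoid responds. The sketch below is purely hypothetical; none of the names are FinalSpark's actual interface, and the organoid response is simulated with a fixed random coupling so the loop runs end to end.

```python
import numpy as np

# Hypothetical stimulate/record cycle on an organoid wired to a small
# electrode array. A real setup would talk to lab hardware instead of
# the simulated coupling matrix used here.
rng = np.random.default_rng(1)
N_ELECTRODES = 8                                  # illustrative electrode count
coupling = rng.normal(size=(N_ELECTRODES, N_ELECTRODES))

def stimulate_and_record(pulse_pattern):
    """Apply a binary pulse pattern, return simulated spike counts per electrode."""
    drive = coupling @ pulse_pattern
    rates = np.clip(drive, 0, None)               # rectified stand-in for firing rates
    return rng.poisson(rates)                     # spike counts for this window

# Probe each electrode in turn and log how the rest of the array responds.
for e in range(N_ELECTRODES):
    pattern = np.zeros(N_ELECTRODES)
    pattern[e] = 1.0
    spikes = stimulate_and_record(pattern)
    print(f"stim electrode {e}: spikes per electrode = {spikes}")
```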

Researchers have tested a range of neuroprosthetic devices, from wheelchairs to robots to advanced limbs, that work with their users to intelligently perform tasks.

They work by decoding brain signals to determine the actions their users want to take, and then use advanced robotics to do the work of the spinal cord in orchestrating the movements. The use of shared control — new to neuroprostheses — “empowers users to perform complex tasks,” says José del R. Millán, who presented the new work at the Cognitive Neuroscience Society (CNS) conference in San Francisco today.
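Shared control in this setting typically means blending a noisy decoded user command with an autonomous controller that handles the low-level details of safe motion. The sketch below uses a simple linear blend and a toy obstacle-avoidance term; the decoder, the blending rule, and the confidence weighting are assumptions for illustration, not the presented system.

```python
import numpy as np

def decode_intent(eeg_features, decoder_weights):
    """Toy linear decoder: BCI features -> desired 2-D velocity."""
    return decoder_weights @ eeg_features

def autonomous_command(position, goal, obstacles):
    """Attraction-to-goal / repulsion-from-obstacle controller playing the role
    the spinal cord would: turning intent into safe, coordinated motion."""
    v = goal - position
    for obs in obstacles:
        d = position - obs
        dist = np.linalg.norm(d) + 1e-6
        if dist < 1.0:                            # repel only when close
            v += d / dist**2
    return v

def shared_control(v_user, v_auto, confidence):
    """Blend decoded intent with the autonomous command.
    `confidence` in [0, 1]: how much the decoder is trusted this instant."""
    return confidence * v_user + (1.0 - confidence) * v_auto

# Example step for a wheelchair-like platform.
rng = np.random.default_rng(2)
decoder = rng.normal(scale=0.1, size=(2, 16))
features = rng.normal(size=16)                    # stand-in for decoded EEG features
pos, goal = np.zeros(2), np.array([5.0, 2.0])
obstacles = [np.array([2.0, 1.0])]

v = shared_control(decode_intent(features, decoder),
                   autonomous_command(pos, goal, obstacles),
                   confidence=0.6)
print("commanded velocity:", v)
```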

Millán, of the Swiss Federal Institute of Technology in Lausanne, Switzerland, began working on “brain-computer interfaces” (BCIs), designing devices that use people’s own brain activity to restore hand grasping and locomotion, or to provide mobility via wheelchairs or telepresence robots.

While wearable technologies with embedded sensors, such as smartwatches, are widely available, these devices can be uncomfortable and obtrusive, and can inhibit the skin’s intrinsic sensations.

“If you want to accurately sense anything on a biological surface like skin or a leaf, the interface between the device and the surface is vital,” said Professor Yan Yan Shery Huang from Cambridge’s Department of Engineering, who led the research. “We also want bioelectronics that are completely imperceptible to the user, so they don’t in any way interfere with how the user interacts with the world, and we want them to be sustainable and low waste.”

There are multiple methods for making wearable sensors, but these all have drawbacks. Flexible electronics, for example, are normally printed on plastic films that don’t allow gas or moisture to pass through, so it would be like wrapping your skin in plastic film. Other researchers have recently developed flexible electronics that are gas-permeable, like artificial skins, but these still interfere with normal sensation, and rely on energy- and waste-intensive manufacturing techniques.

Retinitis pigmentosa and macular degeneration lead to photoreceptor death and loss of visual perception. Despite recent progress, restorative technologies for photoreceptor degeneration remain largely unavailable. Here, we describe a novel optogenetic visual prosthesis (FlexLED) based on a combination of a thin-film retinal display and optogenetic activation of retinal ganglion cells (RGCs). The FlexLED implant is a 30 µm thin, flexible, wireless µLED display with 8,192 pixels, each with an emission area of 66 µm². The display is affixed to the retinal surface, and the electronics package is mounted under the conjunctiva in the form factor of a conventional glaucoma drainage implant. In a rabbit model of photoreceptor degeneration, optical stimulation of the retina using the FlexLED elicits activity in visual cortex. This technology is readily scalable to hundreds of thousands of pixels, providing a route towards an implantable optogenetic visual prosthesis capable of generating vision by stimulating RGCs at near-cellular resolution.

### Competing Interest Statement

All authors have a financial interest in Science Corporation.
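The abstract above does not give the display geometry, but 8,192 pixels factors naturally into something like a 128 × 64 grid. The sketch below shows how a camera frame might be reduced to an on/off stimulation pattern for a display of that size; the grid shape, pooling, and thresholding are assumptions for illustration only, not the FlexLED pipeline.

```python
import numpy as np

GRID_H, GRID_W = 64, 128          # assumed layout: 64 x 128 = 8,192 µLED pixels

def frame_to_stimulation(frame, threshold=0.5):
    """Downsample a grayscale camera frame (values in [0, 1]) to the µLED grid
    and threshold it into an on/off optogenetic stimulation pattern."""
    h, w = frame.shape
    bh, bw = h // GRID_H, w // GRID_W
    # average-pool each block of camera pixels onto one display pixel
    pooled = (frame[:bh * GRID_H, :bw * GRID_W]
              .reshape(GRID_H, bh, GRID_W, bw)
              .mean(axis=(1, 3)))
    return pooled > threshold      # boolean array: which µLEDs to light

# Example with a synthetic frame (bright diagonal band on a dark background).
y, x = np.mgrid[0:480, 0:640]
frame = (np.abs(y - 0.75 * x) < 40).astype(float)
pattern = frame_to_stimulation(frame)
print("lit pixels:", int(pattern.sum()), "of", GRID_H * GRID_W)
```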

Creating robots to safely aid disaster victims is one challenge; executing flexible robot control that takes advantage of the material’s softness is another. The use of pliable soft materials to collaborate with humans and work in disaster areas has drawn much recent attention. However, controlling soft dynamics for practical applications has remained a significant challenge.

In collaboration with the University of Tokyo and Bridgestone Corporation, Kyoto University has now developed a method to control pneumatic artificial muscles, which are soft robotic actuators. The rich dynamics of these drive components can be exploited as a computational resource.

Artificial muscles are controlled by exploiting the rich dynamics of their soft components as a computational resource. (Image: MEDICAL FIG.)
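Using the actuators' rich dynamics as a computational resource is the idea behind physical reservoir computing: the soft body's sensor readings act as a nonlinear expansion of the input signal, and only a linear readout is trained. The minimal sketch below uses a simulated reservoir standing in for pressure-sensor readings; nothing in it reproduces the Kyoto/Tokyo/Bridgestone implementation.

```python
import numpy as np

rng = np.random.default_rng(3)
N_STATE = 50                        # stand-in for pressure/length sensor channels

# Fixed random dynamics playing the role of the pneumatic muscles' physics.
W_res = rng.normal(scale=0.1, size=(N_STATE, N_STATE))
W_in = rng.normal(scale=1.0, size=(N_STATE, 1))

def step(state, u):
    """One tick of the 'soft body': leaky nonlinear response to input pressure u."""
    return 0.7 * state + 0.3 * np.tanh(W_res @ state + W_in @ np.atleast_1d(u))

# Drive the reservoir with an input signal and collect its states.
T = 500
inputs = np.sin(np.linspace(0, 20, T))
states = np.zeros((T, N_STATE))
s = np.zeros(N_STATE)
for t in range(T):
    s = step(s, inputs[t])
    states[t] = s

# Train only a linear readout (here: recall the input delayed by 10 steps),
# which is all the learning the physical-reservoir approach requires.
target = np.roll(inputs, 10)
w_out, *_ = np.linalg.lstsq(states[50:], target[50:], rcond=None)
prediction = states @ w_out
print("readout error:", float(np.mean((prediction[50:] - target[50:]) ** 2)))
```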

In organisms, fluid is what binds the organs and the musculoskeletal system into a whole. For example, hemolymph, a blood-like fluid in a spider’s body, enables muscle activation and exoskeleton flexibility. It was the cucumber spider inhabiting Estonia that inspired scientists to create a complex robotic leg, in which soft and rigid parts work together and are connected by a liquid.

According to Indrek Must, Associate Professor of Soft Robotics, the soft robot’s design is grounded in a real organism. “Broadly speaking, our goal is to build systems from both natural and artificial materials that are as effective as those found in wildlife. The robotic leg could touch delicate objects and move in the same complex environments as a living spider,” he explains.

In a study published in the journal Advanced Functional Materials, the researchers show how a robotic foot touches a primrose stamen, a web, and a pollen grain. This demonstrates the soft robot’s ability to interact with very small and delicate structures without damaging them.