
Imagine this: A smooth touchscreen display placed on top of a thin silicone polymer film suddenly generates the feeling of a tiny raised button under the user’s finger. Or how about the idea of wearing that same polymer film like a second skin? If used to line an industrial glove, the film can recognize gestures and provide valuable feedback by sending tactile signals, such as pulses or vibrations, to the wearer. The research team led by Professor Stefan Seelecke of Saarland University will be at this year’s Hannover Messe, the industrial trade fair running from 30 May to 2 June, where the team will be demonstrating how smart tactile surfaces are now being used as novel human-machine interfaces.

Seelecke’s research team at Saarland University is using thin silicone films to give surfaces some very novel capabilities. The technology, which can create the sensation of a tactile “button” or “slider” on a flat glass display screen, is literally bringing a new dimension to touchscreen interactions. The film is able to change shape on demand to create the feeling of a raised button or key on the surface of the display, which the user can then press, for example, to navigate around a page or to enter data.

“Using this technology, we can make the user interfaces of smartphones, information screens or household devices more user friendly,” said Seelecke, who heads the Intelligent Material Systems Lab at Saarland University. If users feel a pulse or vibration under their fingertips, they can respond by tapping the screen. And because they also experience the slight resistance felt when pressing a ‘real’ button or switch, they know that their input has registered. For the blind and partially sighted, this sort of physical feedback is not a gimmick, but hugely valuable in their day-to-day lives.

Researchers at Meta’s Artificial Intelligence Research Lab (Facebook) in the U.S. and at the University of Twente’s Neuromechanical Modelling and Engineering Lab in the Netherlands (led by Prof.dr.ir Massimo Sartori) have co-developed the open-source framework MyoSuite, which combines advanced musculoskeletal models with artificial intelligence (AI). The AI-powered digital models in MyoSuite can learn to execute complex movements and interactions with assistive robots that would otherwise require long experiments on real human subjects.

Modeling and simulation are now as important to human health technologies as they have been for the advancement of the modern automotive industry. Prof. Massimo Sartori: “If we could predict the outcome of a robotic therapy beforehand, then we could optimize it for a patient and deliver a truly personalized and cost-effective treatment.”

MyoSuite supports the co-simulation of AI-powered musculoskeletal systems physically interacting with assistive devices such as exoskeletons. With MyoSuite you can simulate biological phenomena, e.g., muscle fatigue, muscle sarcopenia, tendon tear and tendon reattachment. Moreover, you can simulate how assistive robots could be designed and controlled to restore movement following impairment.
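In practice, frameworks like MyoSuite drive their musculoskeletal models through a standard reset/step control loop, in which a policy outputs muscle activations and the simulator returns the resulting state. The sketch below imitates only the shape of that loop with a made-up single-muscle toy environment; it is not the real MyoSuite API, and the dynamics, controller, and target angle are all illustrative assumptions.

```python
class StubMuscleEnv:
    """Toy stand-in for a musculoskeletal simulation environment.

    Real MyoSuite tasks are exposed as Gym-style environments; this stub
    only mimics the reset()/step() control-loop shape, with a single
    'joint angle' state driven by one muscle activation in [0, 1].
    """

    def __init__(self, target_angle=0.8):
        self.target = target_angle
        self.angle = 0.0

    def reset(self):
        self.angle = 0.0
        return self.angle

    def step(self, activation):
        # Clamp the command like a muscle excitation signal.
        activation = max(0.0, min(1.0, activation))
        # First-order dynamics: the angle relaxes toward the activation.
        self.angle += 0.2 * (activation - self.angle)
        reward = -abs(self.target - self.angle)
        done = abs(self.target - self.angle) < 0.01
        return self.angle, reward, done


def rollout(env, controller, max_steps=200):
    """Run one episode and return the trajectory of joint angles."""
    obs = env.reset()
    trajectory = [obs]
    for _ in range(max_steps):
        obs, reward, done = env.step(controller(obs))
        trajectory.append(obs)
        if done:
            break
    return trajectory


env = StubMuscleEnv(target_angle=0.8)
# A simple proportional controller stands in for a learned policy.
traj = rollout(env, controller=lambda angle: angle + 2.0 * (0.8 - angle))
print(f"final angle: {traj[-1]:.3f}")
```

In a real MyoSuite experiment, the stub environment would be replaced by one of the registered musculoskeletal tasks and the lambda by a trained reinforcement-learning policy; the loop itself is the part that carries over.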

Incredible and somewhat frightening visions of the future could become reality in the coming decades. According to futurologists, people of the future will gain immortality by living in the body of a machine. Dr. Ian Pearson predicts that a person will be able to transfer his mind into a computer and will one day attend a funeral at which his previous biological body is buried. Cyborgization has some good sides. Let us take into account that we will be able to exchange each of…

This AI-powered prosthetic arm understands what you think. The muscle-controlled prosthetic limbs that amputees across the globe currently use have various limitations and challenges. Good-quality prosthetics are cumbersome, come with a complex setup, and require patients to undergo several months of training to learn their use. Interestingly, a new technology proposed by a team of researchers at the University of Minnesota (UMN) promises to overcome all of these challenges.

It may sound like science-fiction, but the researchers claim that the new technology would allow patients to control robotic body parts using their thoughts. By employing artificial intelligence and machine learning, the researchers at UMN have developed a portable neuroprosthetic hand. The robotic hand comes equipped with a nerve implant linked to the peripheral nerve in a patient’s arm.

Explaining the significance of their neuroprosthetic innovation, project collaborator and UMN neuroscientist Edward Keefer said, “We are well along the way toward allowing upper limb amputees at least, and other people in the future, to have totally natural and intuitive control of their prosthetic devices.”

## THE NEUROPROSTHETIC HAND IS DIFFERENT FROM YOUR REGULAR PROSTHETIC LIMBS

The prosthetic body parts currently available on the market detect shoulder, chest, or muscle movement. They have sensors that recognize signals in specific regions of the human body, so every time patients want to move the hand, they must deliberately trigger those muscles. Adapting to such muscle-driven limb movement is not easy, and many of these devices are unsuitable for physically weak individuals.

Some advanced and efficient muscle-sensitive prosthetics come with complex wiring and other arrangements that make them difficult to use. Amputees have to go through a lot of training to adjust to such devices, which often increases frustration and stress. Now imagine a device that starts working immediately, is less invasive, and requires no training, no muscle activation, and no complex setup.

The neuroprosthetic arm enables patients to move the arm simply by thinking. It is an efficient, easy-to-use, and far more intuitive alternative to the commercial prosthetic systems currently available.
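At its core, such a system is a signal-decoding pipeline: features extracted from peripheral-nerve recordings are classified into intended hand commands. The article does not describe UMN's actual models, electrodes, or signal features, so the sketch below is a deliberately simplified stand-in: a nearest-centroid classifier over made-up three-dimensional feature vectors with invented gesture labels.

```python
import math

# Hypothetical gesture "centroids" in a made-up 3-D feature space.
# A real decoder would learn its parameters from recorded nerve data.
GESTURE_CENTROIDS = {
    "open_hand":  [0.9, 0.1, 0.2],
    "close_fist": [0.1, 0.8, 0.3],
    "pinch":      [0.2, 0.3, 0.9],
}


def decode(features):
    """Classify one feature vector as the gesture with the nearest centroid."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(GESTURE_CENTROIDS, key=lambda g: dist(features, GESTURE_CENTROIDS[g]))


print(decode([0.85, 0.15, 0.25]))  # closest to the "open_hand" centroid
```

The point of the sketch is the pipeline shape, not the classifier: whatever model the real implant uses, it ultimately maps a window of neural activity to a discrete or continuous hand command in the same way.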

The concept of Transhumanism has been around for a long time, but it actually looks like it’s starting to happen. In today’s video, we will look at how humans are already merging with machines and what will come next.

Modern life can be full of baffling encounters with artificial intelligence—think misunderstandings with customer service chatbots or algorithmically misplaced hair metal in your Spotify playlist. These AI systems can’t effectively work with people because they have no idea that humans can behave in seemingly irrational ways, says Mustafa Mert Çelikok. He’s a Ph.D. student studying human-AI interaction, with the idea of taking the strengths and weaknesses of both sides and blending them into a superior decision-maker.

In the AI world, one example of such a hybrid is a “centaur.” It’s not the mythological horse–human creature, but a human-AI team. Centaurs appeared in chess in the late 1990s, when systems became advanced enough to beat human champions. In place of a “human versus machine” matchup, centaur or cyborg chess involves one or more computer chess programs and human players on each side.

“This is the Formula 1 of chess,” says Çelikok. “Grandmasters have been defeated. Super AIs have been defeated. And grandmasters playing with powerful AIs have also lost.” As it turns out, novice players paired with AIs are the most successful. “Novices don’t have strong opinions” and can form effective decision-making partnerships with their AI teammates, while “grandmasters think they know better than AIs and override them when they disagree—that’s their downfall,” observes Çelikok.
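Çelikok's observation can be illustrated with a toy probability model; the accuracy numbers below are invented for illustration, not taken from the article. On a stream of binary decisions, a human who frequently overrides a stronger AI drags the team down, while a human who mostly defers preserves the AI's edge.

```python
import random

random.seed(0)


def team_accuracy(ai_acc, human_acc, override_prob, trials=100_000):
    """Toy model of a centaur team on binary decisions.

    Each round has one correct answer; the AI and the human are each
    independently right with their own probability. When they disagree,
    the human overrides the AI with probability `override_prob`.
    """
    correct = 0
    for _ in range(trials):
        ai_right = random.random() < ai_acc
        human_right = random.random() < human_acc
        if ai_right == human_right:
            final_right = ai_right          # agreement: shared answer
        elif random.random() < override_prob:
            final_right = human_right       # human overrides the AI
        else:
            final_right = ai_right          # human defers to the AI
        correct += final_right
    return correct / trials


# A weaker human who rarely overrides vs. a stronger human who usually does.
novice = team_accuracy(ai_acc=0.9, human_acc=0.6, override_prob=0.1)
grandmaster = team_accuracy(ai_acc=0.9, human_acc=0.8, override_prob=0.9)
print(f"deferring novice: {novice:.3f}, overriding expert: {grandmaster:.3f}")
```

With these illustrative numbers the deferring novice's team scores about 0.87 while the overriding expert's team scores about 0.81, even though the expert is individually the stronger human, mirroring the "that's their downfall" effect described above.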