New ultrafast, energy-efficient, light-based communication networks could make remote robots a reality, with huge implications for surgery, construction and logistics.
Researchers at ETH Zurich have developed artificial muscles that contain microbubbles and can be controlled with ultrasound. In the future, these muscles could serve in technical and medical settings as gripper arms, tissue patches, targeted drug-delivery vehicles, or components of robots.
At first glance it might look like a simple materials experiment: a brief ultrasound stimulation induces a thin strip of silicone to bend and arch. But that's just the beginning. A team led by Daniel Ahmed, Professor of Acoustic Robotics for Life Sciences and Healthcare, has developed a new class of artificial muscles: flexible membranes that respond to ultrasound with the help of thousands of microbubbles.
The work is published in the journal Nature.
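As background on why ultrasound couples so strongly to microbubbles, classical bubble acoustics gives the Minnaert resonance frequency of a gas bubble in a liquid. The sketch below is illustrative only: it is not taken from the ETH Zurich study, and the air-in-water parameter values are assumptions.

```python
import math

def minnaert_resonance_hz(radius_m, p0_pa=101_325.0, gamma=1.4, rho=1000.0):
    """Minnaert resonance frequency of a gas bubble in a liquid:
    f = (1 / (2*pi*R)) * sqrt(3 * gamma * p0 / rho)
    """
    return (1.0 / (2.0 * math.pi * radius_m)) * math.sqrt(3.0 * gamma * p0_pa / rho)

# Micrometre-scale bubbles resonate at MHz frequencies, the range used by
# medical ultrasound, which is why ultrasound couples to them so efficiently.
for radius_um in (1.0, 3.0, 10.0, 50.0):
    f = minnaert_resonance_hz(radius_um * 1e-6)
    print(f"radius {radius_um:>5.1f} um -> resonance ~ {f / 1e6:.2f} MHz")
```

Driving bubbles near resonance maximizes their oscillation for a given acoustic pressure, which is one plausible reason a membrane seeded with thousands of them can be actuated by a brief, low-intensity pulse.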
In an ideal world, an AI model looking for new materials to build better batteries would be trained on millions or even hundreds of millions of data points.
But for emerging next-generation battery chemistries that don’t have decades of research behind them, waiting for new studies takes time the world doesn’t have.
“Each experiment takes up to weeks, months to get data points,” said University of Chicago Pritzker School of Molecular Engineering (UChicago PME) Schmidt AI in Science Postdoctoral Fellow Ritesh Kumar. “It’s just infeasible to wait until we have millions of data to train these models.”
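The article describes the small-data regime rather than a specific algorithm, but a common way to cope with a handful of expensive measurements is an uncertainty-aware surrogate model such as Gaussian process regression. The sketch below, with entirely synthetic data, is one minimal illustration of that idea; it is not the UChicago PME group's actual method.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Hypothetical tiny dataset: a handful of (composition feature -> capacity)
# points, standing in for the "weeks per data point" regime described above.
rng = np.random.default_rng(0)
X_train = rng.uniform(0.0, 1.0, size=(8, 1))           # only 8 experiments
y_train = np.sin(3.0 * X_train[:, 0]) + 0.05 * rng.standard_normal(8)

# A Gaussian process returns calibrated uncertainty, so the model can say
# "I don't know" far from the few measured points -- useful when each new
# experiment is expensive and must be chosen carefully.
gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(X_train, y_train)

X_query = np.linspace(0.0, 1.0, 5).reshape(-1, 1)
mean, std = gp.predict(X_query, return_std=True)
for x, m, s in zip(X_query[:, 0], mean, std):
    print(f"x={x:.2f}: predicted {m:+.2f} +/- {s:.2f}")
```

In practice such a surrogate is usually paired with active learning, so each slow experiment is spent where the model is least certain.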
In this latest edition of the Security & Tech Insights newsletter, vulnerabilities of digital connectivity are analyzed, with special regard to IoT, Smart Cities, and Space. Also included are articles reviewing cybersecurity awareness and preparedness, and new threats to contend with from AI-enabled ransomware. Thanks for reading and sharing! Chuck Brooks.
#cybersecurity #internetofthings #smartcities #space #ai #ransomware
The images show the difference between the brains of children who read books (left) and children with more than an hour of daily screen time (right). In early childhood, more than 60 minutes of daily screen time is associated with vulnerability to emotional and attention disorders. Increasing screen time reduces brain connectivity in the language, visual and intelligence centres compared with reading books.
The school bell rang long ago, but Danny is still sitting in his chair, trying to finish copying from the board. "Why is this process so hard? Why does it take me so much longer to read than it takes my friends?" Danny is frustrated. The more he tries to read faster, the harder it is for him to understand what he is reading. Around the time he finally finishes copying, his friends come back to class from the break.

Like 10-15% of children in the world, Danny has dyslexia. Dyslexia is defined as difficulty in reading accurately or quickly, and most of the time it affects a person's ability to understand what is read and to spell words correctly. The reading difficulty continues into adulthood and does not disappear, even though most adults with dyslexia find ways to "bypass" it, sometimes using text-to-speech software.

Children and adults with dyslexia show different brain activity than good readers do. They have lower activity in the brain area responsible for vision and identification of words [1, 2] and in another brain area responsible for attention and recognition of errors during reading [3]. A question could then be asked: is this reading difficulty strange, or is it actually the ability to read that is magical? How did the human brain learn to read? And does the daily use of technology, which sometimes "bypasses" the need to make an effort to read, help us learn to read or make it more difficult? This article will discuss these subjects.
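The text-to-speech "bypass" mentioned above is easy to prototype. Below is a minimal sketch using pyttsx3, an offline text-to-speech library for Python; the specific rate value is an arbitrary assumption, not a recommendation from the article.

```python
import pyttsx3

# Minimal offline text-to-speech: the kind of "bypass" tool the article
# says adults with dyslexia sometimes rely on.
engine = pyttsx3.init()
engine.setProperty("rate", 150)  # a slower speech rate can aid comprehension

text = "The school bell rang long ago, but Danny is still copying from the board."
engine.say(text)
engine.runAndWait()  # blocks until the utterance finishes
```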
Reading is a relatively new human ability, about 5,000 years old. The Egyptians were among the first to use symbols to represent words within a spoken language, and they used drawings to transmit ideas via writing. As difficult as it is to draw each word in a language, it is still much easier to understand Egyptian hieroglyphs than to figure out what is written in an unfamiliar language. Today, 5,000 years later, we expect each child in first grade to immediately understand that the lines and circles that form letters have a unique sound corresponding to them. To do that, the brain has to rely on neural networks that were designed to perform other tasks, such as seeing, hearing, language comprehension, speech, attention, and concentration [4] (see Figure 1).
In recent years, neuroengineers have devised a number of new modalities for interfacing with the nervous system. Among these are optical stimulation, vibrational stimulation, and optogenetics. A newer and perhaps more promising technology is sonogenetics.
Sonogenetics, the use of focused ultrasound to control cells that have been made ultrasound-responsive via gene delivery, is moving from compelling papers to a potential platform strategy. From a neurotech commercialization standpoint, the significance of sonogenetics is less about a single lab trick and more about the emerging convergence of three capabilities: precise genetic targeting, durable and safe delivery, and field-robust ultrasound systems that work the first time outside the origin lab.
One commercial firm that may be exploiting this technology is Merge Labs. The startup recently made a big splash with a $250 million investment from OpenAI and Sam Altman. While the company has not yet launched its website and the technical personnel behind it have not been identified, it is rumored to be working on focused-ultrasound implants and sonogenetics-based gene therapy. If Merge and its peers can validate durable expression, predictable dose–response, and reliable outside-the-lab bring-up, a first wave of indications will likely sit at the intersection of neurology, psychiatry, and rehabilitation, with longer-term spillover into human-machine interaction.
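"Predictable dose–response" is the kind of claim that is typically validated by fitting a standard sigmoidal model to stimulation data. As a hedged illustration, the sketch below fits a four-parameter logistic (Hill) curve to entirely hypothetical ultrasound-intensity data with SciPy; none of the numbers come from Merge Labs or any published sonogenetics result.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(dose, bottom, top, ec50, n):
    """Four-parameter logistic (Hill) curve, a standard dose-response model."""
    return bottom + (top - bottom) / (1.0 + (ec50 / dose) ** n)

# Hypothetical data: ultrasound intensity (W/cm^2) vs. fraction of cells responding.
dose = np.array([0.05, 0.1, 0.2, 0.4, 0.8, 1.6, 3.2])
resp = np.array([0.02, 0.05, 0.15, 0.40, 0.70, 0.88, 0.95])

params, _ = curve_fit(hill, dose, resp, p0=[0.0, 1.0, 0.5, 1.5])
bottom, top, ec50, n = params
print(f"EC50 ~ {ec50:.2f} W/cm^2, Hill slope ~ {n:.2f}")
```

A tight, reproducible EC50 across subjects and sessions is what "predictable dose–response" would look like in practice; wide scatter would signal that delivery or expression is not yet under control.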