
Underwater robots are widely used as tools in a variety of marine tasks. The RobDact is one such bionic underwater vehicle, inspired by the Dactylopteridae, a family of fish known for their enlarged pectoral fins. A research team combined computational fluid dynamics (CFD) with force measurement experiments to study the RobDact, creating an accurate hydrodynamic model that allows them to better control the vehicle.
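
The modeling step lends itself to a short illustration. Below is a minimal sketch of fitting force measurements to a quadratic drag model, a common form for hydrodynamic models; the data, model form, and fitted coefficient are hypothetical and not taken from the RobDact paper.

```python
# Minimal sketch: fitting a quadratic drag model F = k * v**2 to measured
# forces, the kind of step a CFD-plus-experiment study might involve.
# All numbers are illustrative, not from the RobDact study.
import numpy as np

speeds = np.array([0.1, 0.2, 0.3, 0.4, 0.5])       # m/s (hypothetical)
forces = np.array([0.04, 0.17, 0.36, 0.65, 1.01])  # N (hypothetical)

# Closed-form least-squares estimate of k in F = k * v**2.
k = np.sum(forces * speeds**2) / np.sum(speeds**4)
print(f"fitted drag coefficient k ~ {k:.2f} N*s^2/m^2")
```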

The team published their findings in Cyborg and Bionic Systems on May 31, 2022.

Underwater robots are now used for many marine tasks, including in the fishery industry, underwater exploration, and mapping. Most traditional underwater robots are driven by a propeller, which is effective for cruising at a stable speed. However, underwater robots often need to move or hover at low speeds in turbulent waters while performing a specific task, and it is difficult for a propeller to maneuver the robot under these conditions. Another issue when an underwater robot is moving at low speeds in unstable, flowing water is the propeller’s “twitching” movement, which generates unpredictable fluid pulses that reduce the robot’s efficiency.

A Brain-Computer Interface (BCI) is a promising technology that has received increased attention in recent years. BCIs create a direct link from your brain to a computer. The technology has applications across many industries and areas of our lives: BCIs are redefining how we approach medical treatment and communication for individuals with various conditions or injuries, and they also have applications in entertainment, particularly video games and VR. From controlling a prosthetic limb with your mind to playing a video game with your mind, the potential of BCIs is endless.

The chip is an artificial neuron, but nothing like previous chips built to mimic the brain’s electrical signals. Rather, it adopts and adapts the brain’s other communication channel: chemicals.

Called neurotransmitters, these chemicals are the brain’s “natural language,” said Dr. Benhui Hu at Nanjing Medical University in China. An artificial neuron using a chemical language could, in theory, easily tap into neural circuits—to pilot a mouse’s leg, for example, or build an entirely new family of brain-controlled prosthetics or neural implants.

A new study led by Hu and Dr. Xiaodong Chen at Nanyang Technological University, Singapore, took a major step toward seamlessly connecting artificial and biological neurons into a semi-living circuit. Powered by dopamine, the setup wasn’t a simple one-way call in which one component activated another. Rather, the artificial neuron formed a loop with multiple biological counterparts, pulsing out dopamine while receiving feedback that changed its own behavior.
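
The closed loop is the key architectural point, and it can be illustrated with a toy simulation. The sketch below is purely illustrative: the response function, gain, and target are invented stand-ins, not the study’s actual chemistry or dynamics.

```python
# Toy sketch of a closed loop: an artificial unit emits a "dopamine" level,
# a stand-in for the biological neurons responds, and the unit adapts its
# release based on feedback. Purely illustrative; not the study's model.

def biological_response(dopamine: float) -> float:
    """Stand-in for downstream neurons: a saturating response curve."""
    return dopamine / (1.0 + dopamine)

release = 0.2          # initial dopamine release (arbitrary units)
TARGET = 0.5           # desired downstream activity (assumed)
GAIN = 0.5             # adaptation rate (assumed)

for step in range(8):
    response = biological_response(release)
    release += GAIN * (TARGET - response)  # feedback adjusts future release
    print(f"step {step}: release={release:.3f}, response={response:.3f}")
```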

Wearable human-machine interface (HMI) devices can be used to control machines, computers, music players, and other systems. A challenge for conventional HMIs is the presence of sweat on human skin.

In Applied Physics Reviews, scientists at UCLA describe their development of a type of HMI that is stretchable, inexpensive, and waterproof. The device is based on a soft magnetoelastic sensor array that converts mechanical pressure from the press of a finger into an electrical signal.

The device involves two main components. The first is a layer that translates mechanical movement into a magnetic response: a set of micromagnets in a porous silicone matrix that converts gentle fingertip pressure into a magnetic field variation.
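
The text above covers only the magnetomechanical layer; a natural way to complete the chain is an inductive pickup, so the sketch below assumes the field variation is read out by a coil via Faraday’s law. Every value here (turn count, coil area, field profile) is hypothetical and not taken from the paper.

```python
# Minimal sketch of a magnetoelastic sensing chain, assuming an inductive
# readout (Faraday's law: V = -N * dPhi/dt). All values are hypothetical.
import numpy as np

N_TURNS = 200                    # pickup coil turns (assumed)
AREA = 1e-4                      # coil cross-section, m^2 (assumed)

t = np.linspace(0.0, 0.5, 500)   # s; one half-second finger press
# Pressure-driven field change: a smooth 5 mT pulse centered at t = 0.25 s.
b_field = 5e-3 * np.exp(-((t - 0.25) / 0.05) ** 2)

flux = b_field * AREA                        # magnetic flux, Wb
voltage = -N_TURNS * np.gradient(flux, t)    # induced signal, V
print(f"peak induced voltage ~ {1e3 * np.abs(voltage).max():.2f} mV")
```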

A new AI supercomputer from Graphcore will support models with 500 trillion parameters (5x that of the human brain) and compute at a speed of 10 exaflops (10x that of the human brain), at a cost of $120 million USD. A new AI-powered exoskeleton uses machine learning to help patients walk. AI detects diabetes and prediabetes by using machine learning to identify ECG signals indicative of the disease. AI identifies cancerous lesions in IBD patients.

AI News Timestamps:
0:00 New AI Supercomputer To Beat Human Brain
3:06 AI Powered Exoskeleton
4:35 AI Predicts Diabetes
6:55 AI Detects Cancerous Lesions For IBD


A year after Tesla announced its humanoid robot, the Tesla Bot, the conceptual general-purpose robot is up against some Chinese competition. On the sidelines of Xiaomi’s Autumn launch event in Beijing, the company announced its first full-size humanoid bionic robot. The rather unimaginatively named Xiaomi CyberOne is the second robotic product from Xiaomi, coming a year after the Xiaomi Cyberdog was showcased at the company’s 2021 Autumn launch event.

[Image: CyberOne. Credit: Xiaomi]

Like most other humanoid robots, the Xiaomi CyberOne is in most respects still a “work in progress.” Xiaomi claims that future, evolved variants of the robot will not only have a high degree of emotional intelligence but will also gain the ability to perceive human emotions. Although the first-generation CyberOne demoed on stage seemed to have trouble walking, work is underway to improve its mastery of bipedal movement.

A pair of UCLA bioengineers and a former postdoctoral scholar have developed a new class of bionic 3D camera systems that can mimic flies’ multiview vision and bats’ natural sonar sensing, resulting in multidimensional imaging with extraordinary depth range that can also scan through blind spots.

Powered by computational image processing, the camera can decipher the size and shape of objects hidden around corners or behind other items. The technology could be incorporated into autonomous vehicles or medical imaging tools with sensing capabilities far beyond what is considered state of the art today. This research has been published in Nature Communications.

In the dark, bats can visualize a vibrant picture of their surroundings by using a form of echolocation, or sonar. Their high-frequency squeaks bounce off their surroundings and are picked back up by their ears. The minuscule differences in how long it takes for the echo to reach the nocturnal animals and the intensity of the sound tell them in real time where things are, what’s in the way and the proximity of potential prey.
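
The distance arithmetic behind that sonar picture is simple enough to write down. The sketch below shows basic time-of-flight ranging; the speed of sound and the echo delays are illustrative values, not measurements from the study.

```python
# Minimal sketch of sonar time-of-flight ranging: a reflector's distance is
# the round-trip echo delay, halved, times the speed of sound.
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 C

def echo_distance(delay_s: float) -> float:
    """Distance to a reflector given the round-trip echo delay in seconds."""
    return SPEED_OF_SOUND * delay_s / 2.0

# Hypothetical delays: ~5.8 ms corresponds to a reflector about 1 m away.
for delay_s in (0.0058, 0.0292):
    print(f"{1e3 * delay_s:.1f} ms echo -> {echo_distance(delay_s):.2f} m")
```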

In recent years, materials scientists have designed a wide range of innovative materials that could be used to create new technologies, including soft robots, controllers, and smart textiles. These materials include artificial muscles: structures that resemble biological muscles in shape and that could improve the movements of robots or enable clothing that adapts to different environmental conditions.

As part of an ongoing project focused on textile-based actuators, a team of researchers at Jiangnan University in China recently developed new artificial muscles based on free-standing, single-helical woolen yarns. Their artificial muscles, introduced in a paper published in Smart Materials and Structures, could be used to easily and affordably produce twisted actuators that can detect and respond to humidity in their environment.

“We are trying to design flexible and versatile actuators by leveraging the hierarchical structure design of textiles, ranging from microscales (e.g., molecular chains and aggregation structures) to macroscales (e.g., fiber morphology and textile architectures),” Fengxin Sun, one of the researchers who carried out the study, told Tech Xplore. “Realizing a yarn-based artificial muscle with free-standing and single-helical architecture via eco-friendly and easy-fabrication manufacturing process is still challenging.”