
Scientists Just Merged Human Brain Cells With AI – Here’s What Happened!
What happens when human brain cells merge with artificial intelligence? Scientists have just achieved something straight out of science fiction—combining living neurons with AI to create a hybrid intelligence system. The results are mind-blowing, and they could redefine the future of computing. But how does it work, and what does this mean for humanity?

In a groundbreaking experiment, researchers successfully integrated human brain cells with AI, creating a system that learns faster and more efficiently than traditional silicon-based computers. These “biocomputers” use lab-grown brain organoids to process information, mimicking human thought patterns while leveraging AI’s speed and scalability. The implications? Smarter, more adaptive machines that think like us.

Why is this such a big deal? Unlike conventional AI, which relies on brute-force data crunching, this hybrid system operates more like a biological brain—learning with less energy, recognizing patterns intuitively, and even showing early signs of creativity. Potential applications include ultra-fast medical diagnostics, self-improving robots, and brain-controlled prosthetics that feel truly natural.

But with great power comes big questions. Could this lead to conscious machines? Will AI eventually surpass human intelligence? And what are the ethical risks of blending biology with technology? This video breaks down the science, the possibilities, and the controversies—watch to the end for the full story.

How did scientists merge brain cells with AI? What are biocomputers? Can AI become human-like? What is hybrid intelligence? Will AI replace human brains? This video will answer all these questions. Make sure you watch all the way through so you don't miss anything.


How does a robotic arm or a prosthetic hand learn a complex task like grasping and rotating a ball? The challenge for any hand—human, prosthetic, or robotic—has always been learning to control the fingers to exert the right forces on an object.

The nerve endings that cover our hands have been credited with helping us learn and adapt our manipulation skills, so roboticists have insisted on incorporating sensors into robotic hands. But, given that you can still learn to handle objects with gloves on, there must be something else at play.

This mystery is what inspired researchers in the ValeroLab in the Viterbi School of Engineering to explore whether tactile sensation is truly necessary for learning to control the fingers.

Naturalistic communication is an aim for neuroprostheses. Here the authors present a neuroprosthesis that restores the voice of a paralyzed person simultaneously with their speaking attempts, enabling naturalistic communication.

Imagine navigating a virtual reality with contact lenses or operating your smartphone underwater: This and more could soon be a reality thanks to innovative e-skins.

A research team led by the Helmholtz-Zentrum Dresden-Rossendorf (HZDR) has developed an electronic skin that detects and precisely tracks magnetic fields with a single global sensor. This artificial skin is not only light, transparent and permeable, but also mimics the interactions of real skin and the brain, as the team reports in the journal Nature Communications.

Originally developed for robotics, e-skins imitate the properties of real skin. They can give robots a sense of touch or replace lost senses in humans. Some can even detect chemical substances or magnetic fields. But the technology also has its limits: highly functional e-skins are often impractical because they rely on extensive electronics and large batteries.


Can you implant lab-grown brain tissue to heal brain damage? Kind of. What if you also implant an electrical stimulation device? The next generation of brain implants may be the Organoid Brain-Computer Interface (OBCI).

Learn about: brain organoids, dendritic spines, synapses, presynaptic and postsynaptic neurons, neurotransmitters.

Story of Einstein’s Brain: https://www.npr.org/2005/04/18/4602913/the-long-strange-jour…eins-brain

Fusion is inching closer to reality as the United States Domestic Agency for the International Thermonuclear Experimental Reactor (ITER) recently completed the delivery of critical components for the support structure of the central solenoid.

Described as an exoskeleton, or a cage, the support structure surrounds the central solenoid, which is a 60-foot-tall superconducting magnet at the heart of the ITER fusion machine.

It’s difficult to build devices that replicate the fluid, precise motion of humans, but that might change if we could pull a few (literal) strings. At least, that’s the idea behind “cable-driven” mechanisms in which running a string through an object generates streamlined movement across an object’s different parts. Take a robotic finger, for example: You could embed a cable through the palm to the fingertip of this object and then pull it to create a curling motion.
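The curling motion described above can be sketched numerically. The following is a minimal, hypothetical model (not the Xstrings implementation): a planar three-joint finger where a single cable runs over a pulley at each joint, so pulling the cable by some displacement rotates each joint and curls the fingertip toward the palm. The joint count, pulley radius, and link length are illustrative assumptions.

```python
import math

def joint_angles(cable_pull, n_joints=3, pulley_radius=0.01):
    """Angle (rad) of each joint for a cable displacement (m).

    Assumes the pull is shared evenly: each joint's pulley (radius r)
    takes up cable_pull / n_joints of cable, rotating by that arc / r.
    """
    theta = cable_pull / (n_joints * pulley_radius)
    return [theta] * n_joints

def fingertip_position(angles, link_length=0.03):
    """Planar forward kinematics: chain the links, accumulating heading."""
    x = y = heading = 0.0
    for a in angles:
        heading += a
        x += link_length * math.cos(heading)
        y += link_length * math.sin(heading)
    return x, y

# Pulling the cable curls the finger: the tip lifts and sweeps
# back toward the palm instead of pointing straight out.
straight = fingertip_position(joint_angles(0.0))   # tip at (0.09, 0.0)
curled = fingertip_position(joint_angles(0.02))
```

The key design point this toy model captures is why cable drives are attractive: one actuator (the cable pull) coordinates motion across many joints at once, which is exactly what makes routing the cables correctly—the part Xstrings automates—so fiddly to do by hand.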

While cable-driven mechanisms can create real-time motion to make an object bend, twist, or fold, they can be complicated and time-consuming to assemble by hand. To automate the process, researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have developed an all-in-one 3D printing approach called “Xstrings.” Part design tool, part fabrication method, Xstrings can embed all the pieces together and produce a cable-driven device, saving time when assembling bionic robots, creating art installations, or working on dynamic fashion designs.

In a paper to be presented at the 2025 Conference on Human Factors in Computing Systems (CHI2025), the researchers used Xstrings to print a range of colorful and unique objects that included a red walking lizard robot, a purple wall sculpture that can open and close like a peacock’s tail, a white tentacle that curls around items, and a white claw that can ball up into a fist to grab objects.

We move thanks to coordination among many skeletal muscle fibers, all twitching and pulling in sync. While some muscles align in one direction, others form intricate patterns, helping parts of the body move in multiple ways.

In recent years, scientists and engineers have looked to muscles as potential actuators for “biohybrid” robots—machines powered by soft, artificially grown muscle tissue. Such bio-bots could squirm and wiggle through spaces where traditional machines cannot. For the most part, however, researchers have only been able to fabricate artificial muscle that pulls in one direction, limiting any robot’s range of motion.

Now MIT engineers have developed a method to grow artificial muscle tissue that twitches and flexes in multiple coordinated directions. As a demonstration, they grew an artificial, muscle-powered structure that pulls both concentrically and radially, much like how the iris in the human eye acts to dilate and constrict the pupil.

The device provides a range of sensations, such as vibrations, pressure, and twisting. A team of engineers led by Northwestern University has developed a new wearable device that stimulates the skin to deliver a range of complex sensations. This thin, flexible device gently adheres to the skin, offering more realistic and immersive sensory experiences. While it is well-suited for gaming and virtual reality (VR), the researchers also see potential applications in healthcare. For instance, the device could help individuals with visual impairments “feel” their surroundings or provide feedback to those with prosthetic limbs.