Brain cells learn faster than machine learning, research reveals

Researchers have demonstrated that brain cells learn faster and carry out complex networking more effectively than machine-learning systems. They did so by comparing how a Synthetic Biological Intelligence (SBI) system known as “DishBrain” and state-of-the-art reinforcement learning (RL) algorithms respond to the same stimuli.

The study, “Dynamic Network Plasticity and Sample Efficiency in Biological Neural Cultures: A Comparative Study with Deep Reinforcement Learning,” published in Cyborg and Bionic Systems, is the first known of its kind.

The research was led by Cortical Labs, the Melbourne-based startup that created the world’s first commercial biological computer, the CL1. The CL1, on which the research was conducted, fuses lab-cultivated neurons derived from human stem cells with silicon hardware to create a more advanced and sustainable form of AI, known as SBI.

AI-Engineered Hydrogels Achieve Instant and Powerful Underwater Adhesion

Underwater adhesives have long posed a challenge to materials scientists, with few solutions capable of delivering instant, strong, and repeatable adhesion in challenging marine and biomedical environments. Now, a team of researchers, inspired by natural adhesive proteins, has leveraged machine learning (ML) and data mining (DM) to engineer next-generation super-adhesive hydrogels that work instantly underwater.

Published in Nature, the study introduces an end-to-end data-driven framework that starts with protein sequence extraction and ends with a scalable hydrogel synthesis method. The results are materials that can seal high-pressure leaks, attach securely to rough, wet surfaces, and even function in living tissue.

Robots learn human-like movement adjustments to prevent object slipping

To effectively tackle a variety of real-world tasks, robots should be able to reliably grasp objects of different shapes, textures and sizes, without dropping them in undesired locations. Conventional approaches to enhancing the ability of robots to grasp objects work by tightening the grip of a robotic hand to prevent objects from slipping.

Researchers at the University of Lincoln, Toshiba Europe’s Cambridge Research Laboratory, the University of Surrey, Arizona State University and KAIST recently introduced an alternative computational strategy for preventing the slip of objects grasped by a robotic hand, which works by modulating the trajectories that the hand follows while performing manipulative movements. Their approach, consisting of a robotic controller and a new bio-inspired predictive trajectory modulation strategy, was presented in a paper published in Nature Machine Intelligence.
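The paper itself specifies the controller; purely as a loose illustration of the idea of modulating motion rather than grip, here is a minimal sketch in which a toy slip predictor reshapes a planned trajectory. All function names, parameters, and the predictor itself are hypothetical stand-ins, not taken from the study:

```python
# Hypothetical sketch of predictive trajectory modulation: when a
# forecast slip probability exceeds a threshold, the planned hand
# speed at that waypoint is scaled down instead of squeezing harder.

def predict_slip_probability(tactile_reading, planned_speed):
    """Toy stand-in for a learned slip predictor: faster motion
    combined with a weaker tactile signal means more predicted slip."""
    return max(0.0, min(1.0, planned_speed * (1.0 - tactile_reading)))

def modulate_trajectory(waypoints, tactile_reading, threshold=0.5):
    """Scale down per-waypoint speeds wherever predicted slip exceeds
    the threshold; the rest of the plan is left unchanged."""
    adjusted = []
    for position, speed in waypoints:
        if predict_slip_probability(tactile_reading, speed) > threshold:
            speed *= 0.5  # slow the hand rather than tighten the grip
        adjusted.append((position, speed))
    return adjusted

# Two waypoints: a slow move (kept) and a fast move (slowed down).
plan = [((0.0, 0.1), 0.2), ((0.0, 0.3), 0.9)]
print(modulate_trajectory(plan, tactile_reading=0.4))
```

The design point the sketch captures is the contrast the article draws: conventional controllers respond to slip by increasing grip force, whereas trajectory modulation changes how the hand moves.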

“The inspiration for this paper came from a very human experience,” Amir Ghalamzan, senior author of the paper, told Tech Xplore.

Routine AI assistance may lead to loss of skills in health professionals who perform colonoscopies

The introduction of artificial intelligence (AI) to assist colonoscopies is linked to a reduction in the ability of endoscopists (health professionals who perform colonoscopies) to detect precancerous growths (adenomas) in the colon without AI assistance, according to a paper published in The Lancet Gastroenterology & Hepatology.

Colonoscopy enables detection and removal of adenomas, leading to prevention of bowel cancer. Numerous trials have shown the use of AI to assist colonoscopies increases the detection of adenomas, generating much enthusiasm for the technology. However, there is a lack of research into how continuous use of AI affects endoscopist skills, with suggestions it could be either positive, by training clinicians, or negative, leading to a reduction in skills.

Author Dr. Marcin Romańczyk of the Academy of Silesia (Poland) says, “To our knowledge, this is the first study to suggest a negative impact of regular AI use on health care professionals’ ability to complete a patient-relevant task in medicine of any kind.”

Tiny robots use sound to self-organize into intelligent groups

Animals like bats, whales and insects have long used acoustic signals for communication and navigation. Now, an international team of scientists has taken a page from nature’s playbook to model micro-sized robots that use sound waves to coordinate into large swarms that exhibit intelligent-like behavior.

The robot groups could one day carry out complex tasks like exploring disaster zones, cleaning up pollution, or performing tasks from inside the body, according to team lead Igor Aronson, Huck Chair Professor of Biomedical Engineering, Chemistry, and Mathematics at Penn State.

“Picture swarms of bees or midges,” Aronson said. “They move, that creates sound, and the sound keeps them cohesive, many individuals acting as one.”

CrowdStrike, Uber, Zoom Among Industry Pioneers Building Smarter Agents With NVIDIA Nemotron and Cosmos Reasoning Models for Enterprise and Physical AI Applications

As enterprises develop AI agents to tackle complex, multistep tasks, models that can provide strong reasoning accuracy with efficient token generation enable intelligent, autonomous decision-making at scale.

NVIDIA Nemotron is a family of advanced open reasoning models that use leading models, NVIDIA-curated open datasets and advanced AI techniques to provide an accurate and efficient starting point for AI agents.

Soft robots go right to the site of kidney stones

An international research team led by the University of Waterloo is developing technology to dissolve painful kidney stones in the urinary tract using tiny robots. The research is published in the journal Advanced Healthcare Materials.

The new technique, tested in a life-size, 3D-printed model, features thin, spaghetti-like strips fitted with magnets, which doctors can magnetically maneuver into place near uric acid stones.

The soft, flexible robot strips are about a centimeter long and contain an enzyme called urease. Once in place, the urease reduces the acidity of the surrounding urine, dissolving the stones over just a few days until they are small enough to pass naturally.
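Why does reducing acidity dissolve the stones? Uric acid is a weak acid (first pKa around 5.4), so as urine pH rises its ionized, far more soluble form dominates. As a back-of-the-envelope illustration using the standard Henderson-Hasselbalch relation (this calculation is not from the paper), total solubility relative to the undissociated acid grows roughly tenfold per pH unit above the pKa:

```python
# Back-of-the-envelope model: total uric acid solubility relative to
# its intrinsic (undissociated) solubility, via Henderson-Hasselbalch:
#   S_total / S_0 = 1 + 10**(pH - pKa)

PKA = 5.4  # approximate first pKa of uric acid

def relative_solubility(ph):
    """Total solubility as a multiple of the undissociated form's."""
    return 1.0 + 10.0 ** (ph - PKA)

for ph in (5.0, 6.0, 7.0):
    print(f"pH {ph}: {relative_solubility(ph):.1f}x intrinsic solubility")
```

This is consistent with the mechanism described above: urease-driven reduction of urinary acidity shifts the equilibrium toward the soluble ionized form, so the stone gradually dissolves.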