
Biomimetic multimodal tactile sensing enables human-like robotic perception

Robots That Feel: A New Multimodal Touch System Closes the Gap with Human Perception

In a major advance for robotic sensing, researchers have engineered a biomimetic tactile system that brings robots closer than ever to human-like touch. Unlike traditional tactile sensors that detect only force or pressure, this new platform integrates multiple sensing modalities into a single ultra-thin skin and combines it with large-scale AI for data interpretation.

At the heart of the system is SuperTac, a 1-millimeter-thick multimodal tactile layer inspired by the multispectral structure of pigeon vision. SuperTac compresses several physical sensing modalities — including multispectral optical imaging (from ultraviolet to mid-infrared), triboelectric contact sensing, and inertial measurements — into a compact, flexible skin. This enables simultaneous detection of force, contact position, texture, material, temperature, proximity and vibration with micrometer-level spatial precision. The sensor achieves better than 94% accuracy in classifying complex tactile features such as texture, material type, and slip dynamics.
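The core idea of fusing several sensing channels into one feature representation before classification can be sketched in a few lines. This is a hypothetical illustration only: the channel names, dimensions, and nearest-centroid rule are stand-ins, not the SuperTac architecture or its actual classifier.

```python
import numpy as np

def fuse_modalities(optical, triboelectric, inertial):
    """Concatenate per-modality features after per-channel normalization."""
    feats = []
    for channel in (optical, triboelectric, inertial):
        channel = np.asarray(channel, dtype=float)
        norm = np.linalg.norm(channel)
        feats.append(channel / norm if norm > 0 else channel)
    return np.concatenate(feats)

def nearest_centroid(sample, centroids):
    """Return the label of the class centroid closest to the sample."""
    return min(centroids, key=lambda k: np.linalg.norm(sample - centroids[k]))

# Toy "training" signatures for two texture classes (illustrative values).
centroids = {
    "smooth": fuse_modalities([1.0, 0.1], [0.2], [0.05, 0.05]),
    "rough":  fuse_modalities([0.3, 0.9], [0.9], [0.6, 0.7]),
}

# A probe reading whose channels resemble the "rough" signature.
probe = fuse_modalities([0.35, 0.85], [0.8], [0.5, 0.6])
print(nearest_centroid(probe, centroids))  # prints "rough"
```

Per-channel normalization before concatenation keeps any single modality from dominating the fused vector simply because its raw units are larger, a common first step in multimodal fusion.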

However, the hardware alone isn’t enough: rich, multimodal tactile data need interpretation. To address this, the team developed DOVE, an 8.5-billion-parameter tactile language model that functions as a computational interpreter of touch. By learning patterns in the high-dimensional sensor outputs, DOVE provides semantic understanding of tactile interactions — a form of “touch reasoning” that goes beyond raw signal acquisition.

From a neurotech-inspired perspective, this work mirrors principles of biological somatosensation: multiple receptor types working in parallel, dense spatial encoding, and higher-order processing for perceptual meaning. Integrating rich physical sensing with model-based interpretation is akin to how the somatosensory cortex integrates mechanoreceptor inputs into coherent percepts of texture, shape and motion. Such hardware-software co-design — where advanced materials, optics, electronics and AI converge — offers a pathway toward embodied intelligence in machines that feel and interpret touch much like biological organisms do.



NASA supercomputer just predicted Earth’s hard limit for life

Scientists have used a NASA-grade supercomputer to push our planet to its limits, virtually fast‑forwarding the clock until complex organisms can no longer survive. The result is a hard upper bound on how long Earth can sustain breathable air and liquid oceans, and it is far less about sudden catastrophe than a slow suffocation driven by the Sun itself. The work turns a hazy, far‑future question into a specific timeline for the end of life as we know it.

Instead of fireballs or rogue asteroids, the simulations point to a world that quietly runs out of oxygen, with only hardy microbes clinging on before even they disappear. It is a stark reminder that Earth’s habitability is not permanent, yet it also stretches over such vast spans of time that our immediate crises still depend on choices made this century, not on the Sun’s distant evolution.

The new modeling effort starts from a simple premise: if we know how the Sun brightens over time and how Earth’s atmosphere responds, we can calculate when conditions for complex life will finally fail. Researchers fed a high‑performance system with detailed physics of the atmosphere, oceans and carbon cycle, then let it run through hundreds of thousands of scenarios until the planet’s chemistry tipped past a critical point. One study describes a supercomputer simulation that projects life on Earth ending in roughly 1 billion years, once rising solar heat strips away most atmospheric oxygen.
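The solar brightening that drives the simulated deoxygenation can be estimated on the back of an envelope with Gough’s (1981) standard approximation for main-sequence solar luminosity. The roughly 10%-per-billion-years figure it yields is a textbook consequence of stellar evolution, not a result extracted from the study’s supercomputer runs.

```python
# Gough (1981): L(t)/L_now = 1 / (1 + (2/5) * (1 - t / t_sun)),
# where t is the Sun's age and t_sun its current age (~4.6 Gyr).

T_SUN = 4.6  # current solar age in Gyr (approximate)

def relative_luminosity(age_gyr):
    """Solar luminosity relative to today for a Sun of age `age_gyr` (Gyr)."""
    return 1.0 / (1.0 + 0.4 * (1.0 - age_gyr / T_SUN))

for dt in (0.0, 0.5, 1.0):
    lum = relative_luminosity(T_SUN + dt)
    print(f"+{dt:.1f} Gyr from now: L/L_now = {lum:.3f}")
# +1.0 Gyr gives roughly a 10% brighter Sun, the slow heating that the
# simulations translate into atmospheric oxygen loss.
```

A steady ~1% brightening per hundred million years sounds negligible, but over a billion years it is enough to accelerate silicate weathering, draw down CO2, and starve the photosynthesis that maintains atmospheric oxygen.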

Moonshots with Peter Diamandis

Ray, you’ve made two predictions that I think are important. The first one, as you said, was the one you announced back in 1989: that we would reach human-level AI by 2029. And as you said, people laughed at it.

But there’s another prediction you’ve made: that we will reach the Singularity by 2045. There’s a lot of confusion here. In other words, if we reach human-level AI by 2029 and it then grows exponentially, why do we have to wait until 2045 for the Singularity? Could you explain the difference between these two?

It’s because that’s the point at which our intelligence will become a thousand times greater. One of the ways my view differs from others is that I don’t see it as us having our own biological intelligence while AI exists somewhere else and we merely interact with it, as if human intelligence and AI were separate things to be compared.


Founder of XPRIZE and pioneer in exponential technologies. Building a world of Abundance through innovation, longevity, and breakthrough ventures.

Tree bark microbes for climate management

In a new Science study, researchers report that bark microbes process methane, hydrogen, and carbon monoxide, showing that bark is an important component of global trace gas dynamics.

Learn more in a new Science Perspective.


Microbes living in bark can process the greenhouse gases methane, hydrogen, and carbon monoxide.

Vincent Gauci Authors Info & Affiliations

Science

Vol 391, Issue 6781

Interpretation, extrapolation and perturbation of single cells

Causal and mechanistic modelling strategies, which aim to infer cause–effect relationships, provide insights into cellular responses to perturbations. The authors review computational approaches that harness machine learning and single-cell data to advance our understanding of cellular heterogeneity and causal mechanisms in biological systems.

Scientists Discover the Body’s Natural “Off Switch” for Inflammation

A human study reveals how naturally occurring fat-derived molecules help switch off inflammation. Researchers at University College London (UCL) have identified an important biological process that helps the body bring inflammation to an end, a finding that may eventually support new treatments for inflammatory diseases.

The regeneration model of aging and its practical implications

Aging is a primary risk factor for multi-morbidity and declining quality of life. The geroscience hypothesis states that targeting biological aging mechanisms may prevent or delay morbidity; however, translating theory into practice remains challenging. Unknown long-term risks and a lack of well-validated, responsive, and practical surrogate endpoints especially hinder the field’s preventive aspirations. This review addresses these obstacles by introducing the regeneration model of aging—a novel framework that integrates biological aging processes and distills the complexity of aging into a series of fundamental steps. The model provides insights into potential trade-offs of anti-aging interventions and can guide strategies to slow aging across diverse populations.

Engineering the Future: John Cumbers on Synthetic Biology and Sustainability


In this episode of the New Earth Entrepreneurs podcast, we sit down with John Cumbers, founder of SynBioBeta, to discuss how synthetic biology is reshaping industries and creating sustainable solutions.

John shares insights into the role of bio-manufacturing in decarbonizing supply chains, government initiatives supporting bio-innovation, and the potential for space applications of synthetic biology.

Learn how SynBioBeta is building a passionate community of changemakers to engineer a better, more sustainable world.

Learn more about SynBioBeta and their upcoming events at: www.synbiobeta.com.
Connect with John on LinkedIn: www.linkedin.com/in/john-cumbers-542220

The New Earth Entrepreneurs Podcast explores social entrepreneurship and corporate sustainability through engaging conversations with visionary leaders.

The Intelligence Revolution: Coupling AI and the Human Brain | Ed Boyden | Big Think


Edward Boyden is a Hertz Foundation Fellow and recipient of the prestigious Hertz Foundation Grant for graduate study in the applications of the physical, biological and engineering sciences. A professor of Biological Engineering and Brain and Cognitive Sciences at MIT, Boyden explains how humanity is only in its infancy when it comes to merging with machines. His work is leading him toward the development of a “brain co-processor”, a device that interacts intimately with the brain to upload and download information to and from it, augmenting human capabilities in memory storage, decision making, and cognition. The first step, however, is understanding the brain on a much deeper level. With the support of the Fannie and John Hertz Foundation, Ed Boyden pursued a PhD in neuroscience at Stanford University.

EDWARD BOYDEN:

Edward Boyden is a professor of Biological Engineering and Brain and Cognitive Sciences at the MIT Media Lab and the McGovern Institute for Brain Research at MIT. He leads the Media Lab’s Synthetic Neurobiology group, which develops tools for analyzing and repairing complex biological systems, such as the brain, and applies them systematically both to reveal ground truth principles of biological function and to repair these systems.

These technologies, often created in interdisciplinary collaborations, include expansion microscopy (which enables complex biological systems to be imaged with nanoscale precision), optogenetic tools (which enable the activation and silencing of neural activity with light), and optical, nanofabricated, and robotic interfaces (which enable recording and control of neural dynamics).

Boyden has launched an award-winning series of classes at MIT, which teach principles of neuroengineering, starting with the basic principles of how to control and observe neural functions, and culminating with strategies for launching companies in the nascent neurotechnology space. He also co-directs the MIT Center for Neurobiological Engineering, which aims to develop new tools to accelerate neuroscience progress.
