
Microscopic robots that sense, think, act, and compute

Extremely cool paper describing optically programmable ~0.3 mm robots with onboard computation and autonomous locomotion! These tiny rectangular machines carry solar cells, optical receivers, electrokinetic actuators, and more. As demonstrations, the authors programmed them (i) to report local temperature by performing a coded dance and (ii) to swim toward warmth, then stop and rotate once they reach a location above a set temperature. This is amazing and I hope such devices are further improved so they can be used in biological applications! Love it!

(https://www.science.org/doi/10.1126/scirobotics.adu8009)


Autonomous submillimeter robots are built with onboard sensing, computation, memory, communication, and locomotion.
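
To make the thermotactic demo concrete, here is a minimal sense-think-act sketch of that kind of behavior. It is not the authors' firmware: the sensor and actuator callables and the stop threshold are hypothetical placeholders used only for illustration.

```python
# Minimal sketch of the thermotactic behavior described above, NOT the
# authors' firmware: sensor/actuator callables and the threshold are
# hypothetical placeholders.

STOP_THRESHOLD_C = 37.0  # hypothetical temperature at which the robot stops and rotates

def control_step(read_temperature, move_forward, rotate, previous_temp):
    """One sense-think-act cycle: climb the thermal gradient, then spin in place."""
    temp = read_temperature()
    if temp >= STOP_THRESHOLD_C:
        rotate()                      # arrived: stop translating and rotate
    elif previous_temp is not None and temp < previous_temp:
        rotate()                      # getting colder: reorient before moving again
    else:
        move_forward()                # getting warmer: keep swimming up the gradient
    return temp

# Example run with stand-in sensor/actuator functions:
last = None
for reading in (30.0, 31.5, 31.0, 37.2):
    last = control_step(lambda r=reading: r,
                        lambda: print("swim forward"),
                        lambda: print("rotate"),
                        last)
```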

Hologram processing method boosts 3D image depth of focus fivefold

Researchers from the University of Tartu Institute of Physics have developed a novel method for enhancing the quality of three-dimensional images by increasing the depth of focus in holograms fivefold after recording, using computational imaging techniques. The technology enables improved performance of 3D holographic microscopy under challenging imaging conditions and facilitates the study of complex biological structures.

The research results were published in the Journal of Physics: Photonics in the article “Axial resolution post-processing engineering in Fresnel incoherent correlation holography.”

One of the main limitations of conventional microscopes and 3D imaging systems is that, once an image or hologram has been recorded, its imaging properties cannot be altered. To overcome this limitation, Shivasubramanian Gopinath, a Junior Research Fellow at the University of Tartu Institute of Physics, and his colleagues have developed a new method that makes it possible to capture a set of holograms with different focal distances at the time of acquisition, instead of a single image. These can then be computationally combined to produce a synthetic hologram that offers a much greater depth of focus than conventional approaches and allows for post-processing of the recorded image.
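
As an illustration of this kind of synthesis (a sketch only, not the authors' published algorithm), the snippet below numerically propagates a stack of complex holograms recorded at different focal offsets to a common plane and sums them into one synthetic hologram. The wavelength, pixel pitch, and offsets are assumed values.

```python
# A minimal sketch of the idea described above (not the authors' algorithm):
# combine holograms recorded at different focal settings into one synthetic
# hologram. Wavelength, pixel pitch, and distances are illustrative assumptions.
import numpy as np

def angular_spectrum_propagate(field, wavelength, pixel_pitch, distance):
    """Numerically propagate a complex field by `distance` (angular spectrum method)."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pixel_pitch)
    fy = np.fft.fftfreq(ny, d=pixel_pitch)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))   # evanescent components dropped
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * distance))

def synthesize_hologram(holograms, focal_offsets, wavelength=633e-9, pixel_pitch=5e-6):
    """Propagate each hologram to a common plane and average into a synthetic hologram."""
    synthetic = np.zeros_like(holograms[0], dtype=complex)
    for holo, dz in zip(holograms, focal_offsets):
        synthetic += angular_spectrum_propagate(holo, wavelength, pixel_pitch, dz)
    return synthetic / len(holograms)
```

The point of the sketch is that the extended depth of focus comes from combining several focal settings computationally, rather than from any single recording.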

The insect-inspired bionic eye that sees, smells and guides robots

The compound eyes of the humble fruit fly are a marvel of nature. They offer a wide-angle view and can process visual information several times faster than the human eye. Inspired by this biological masterpiece, researchers at the Chinese Academy of Sciences have developed an insect-scale compound eye that can both see and smell, potentially improving how drones and robots navigate complex environments and avoid obstacles.

Traditional cameras on robots and drones may excel at capturing high-definition photos, but they suffer from a narrow field of view and limited peripheral vision. They also tend to be bulky and power-hungry.

New perspectives on how physical instabilities drive embryonic development

Multicellularity is one of the most profound phenomena in biology and relies on the ability of a single cell to reorganize itself into a complex organism. It underpins the diversity of the animal kingdom, from insects to frogs to humans. But how do cells establish and maintain their individuality with such precision? A team led by Jan Brugués at the Cluster of Excellence Physics of Life (PoL) at TUD Dresden University of Technology has uncovered fundamental mechanisms that shed light on this question.

The findings, published in Nature, reveal how cells establish physical boundaries through an inherently unstable process, and how different species have evolved distinct strategies to circumvent this process.

During early development, embryos divide rapidly and with remarkable precision, while reorganizing into many individual units. This requires the cell material (known as cytoplasm) to be partitioned into compartments in a highly orchestrated manner.

Driven electrolytes are agile and active at the nanoscale

Technologies for energy storage, as well as biological systems such as the network of neurons in the brain, depend on driven electrolytes: ions that travel through an electric field because of their electrical charges. This concept has also recently been used to engineer synthetic motors and molecular sensors at the nanoscale and to explain biological processes in nanopores. In this context, the background medium, namely the solvent, and the hydrodynamic fluctuations it generates play an important role. Particles in such a system are influenced by these stochastic fluctuations, which effectively control their movements.

“When we imagine the environment inside a driven electrolyte at the nanoscale, we might think of a calm viscous medium in which ions move due to the electric field and slowly diffuse around. This new study reveals that this picture is wrong: the environment resembles a turbulent sea, which is highly nontrivial given the small scale,” explains Ramin Golestanian, who is director of the Department of Living Matter Physics at MPI-DS, and author of the study published in Physical Review Letters.

The research uncovers how the movement of the ions creates large-scale fluctuating fluid currents that stir up the environment and lead to fast motion of all the particles that are immersed in the environment, even if they are not charged.
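
As a toy illustration of the naive baseline picture that the study overturns, the sketch below runs overdamped Langevin dynamics for ions drifting in a uniform field with thermal noise. It deliberately omits the long-range hydrodynamic coupling the paper shows dominates, and all parameter values are illustrative assumptions.

```python
# A toy baseline sketch, not the paper's model: overdamped Langevin dynamics of
# ions drifting in a uniform field with thermal noise. The long-range
# hydrodynamic coupling emphasized by the study is deliberately omitted, and
# all parameter values below are illustrative assumptions.
import numpy as np

def simulate_driven_ions(n_ions=1_000, n_steps=10_000, dt=1e-9,
                         charge=1.6e-19, field=1e5, gamma=2e-12, diffusion=2e-9):
    """Return final 1D ion positions: deterministic drift q*E/gamma plus diffusion."""
    rng = np.random.default_rng(seed=0)
    positions = np.zeros(n_ions)
    drift_velocity = charge * field / gamma          # drift speed in m/s
    for _ in range(n_steps):
        kicks = rng.standard_normal(n_ions)
        positions += drift_velocity * dt + np.sqrt(2 * diffusion * dt) * kicks
    return positions

x = simulate_driven_ions()
print(f"mean drift: {x.mean():.3e} m, thermal spread: {x.std():.3e} m")
```

In this simplified picture each ion feels only its own drift and uncorrelated thermal kicks; the study's point is that, in reality, ion motion generates large-scale fluctuating flows that also sweep along uncharged particles.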

Introducing GPT-5.3-Codex-Spark

Codex-Spark is rolling out today as a research preview for ChatGPT Pro users in the latest versions of the Codex app, CLI, and VS Code extension. Because it runs on specialized low-latency hardware, usage is governed by a separate rate limit that may adjust based on demand during the research preview. In addition, we are making Codex-Spark available in the API for a small set of design partners to understand how developers want to integrate Codex-Spark into their products. We’ll expand access over the coming weeks as we continue tuning our integration under real workloads.

Codex-Spark is currently text-only with a 128k context window and is the first in a family of ultra-fast models. As we learn more with the developer community about where fast models shine for coding, we’ll introduce even more capabilities, including larger models, longer context lengths, and multimodal input.
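
For developers experimenting with the API preview, a call might look like the hedged sketch below, written with the standard OpenAI Python SDK; the model identifier "gpt-5.3-codex-spark" is a hypothetical placeholder, since the announcement does not specify the exact name exposed to design partners.

```python
# A hedged sketch only: the actual model identifier and integration surface for
# Codex-Spark design partners are not specified in the announcement.
# "gpt-5.3-codex-spark" below is a hypothetical placeholder.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-5.3-codex-spark",  # hypothetical model name
    messages=[
        {"role": "system", "content": "You are a fast coding assistant."},
        {"role": "user", "content": "Write a Python one-liner that reverses a string."},
    ],
)
print(response.choices[0].message.content)
```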

Codex-Spark includes the same safety training as our mainline models, including cyber-relevant training. We evaluated Codex-Spark as part of our standard deployment process, which includes baseline evaluations for cyber and other capabilities, and determined that it does not have a plausible chance of reaching our Preparedness Framework threshold for high capability in cybersecurity or biology.

Space mining without heavy machines? Microbes harvest metals from meteorites aboard space station

If humankind is to explore deep space, one small passenger should not be left behind: microbes. In fact, it would be impossible to leave them behind, since they live on and in our bodies, surfaces and food. Learning how they react to space conditions is critical, but they could also be invaluable fellows in our endeavor to explore space.

Microorganisms such as bacteria and fungi can harvest crucial minerals from rocks and could provide a sustainable alternative to transporting much-needed resources from Earth.

Researchers from Cornell and the University of Edinburgh collaborated to study how those microbes extract platinum group elements from a meteorite in microgravity, with an experiment conducted aboard the International Space Station. They found that “biomining” fungi are particularly adept at extracting the valuable metal palladium, while nonbiological leaching, without the fungus, was negatively affected in microgravity.

SpaceX Starthink: Building Earth’s Planetary Neocortex with Orbital AI

In a bold fusion of SpaceX’s satellite expertise and Tesla’s AI prowess, the Starthink Synthetic Brain emerges as a revolutionary orbital data center.

Proposed in a February 2026 Digital Habitats document, this next-gen satellite leverages the Starlink V3 platform to create a distributed synthetic intelligence wrapping the planet.

Following SpaceX’s FCC filing for up to one million orbital data centers and its acquisition of xAI, Starthink signals humanity’s leap toward a Kardashev II civilization.

As Elon Musk noted in February 2026:

“In 36 months, but probably closer to 30, the most economically compelling place to put AI will be space.”

The Biological Analogy

Starthink draws from neuroscience:

* Neural Cluster: A single Tesla AI5 chip, processing AI inference at ~250W, like a neuron group.
* Synthetic Brain: One Starthink satellite, a 2.5-tonne self-contained node with 500 neural clusters, solar power, storage, and comms.
* Planetary Neocortex: One million interconnected Brains forming a global mesh intelligence, linked by laser and microwave “synapses.”
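
Taking the article's own figures at face value, a quick back-of-envelope tally of the constellation's chip power looks like this (illustrative arithmetic only, not an engineering estimate):

```python
# Back-of-envelope arithmetic using only the figures quoted above;
# an illustrative aggregate, not an engineering estimate.
chips_per_satellite = 500        # "500 neural clusters" per Starthink satellite
power_per_chip_w = 250           # ~250 W per Tesla AI5 chip, as stated above
satellites = 1_000_000           # "one million interconnected Brains"

power_per_satellite_kw = chips_per_satellite * power_per_chip_w / 1_000
total_power_gw = satellites * chips_per_satellite * power_per_chip_w / 1e9

print(f"Per satellite: {power_per_satellite_kw:.0f} kW of chip power")   # 125 kW
print(f"Constellation total: {total_power_gw:.0f} GW of chip power")     # 125 GW
```

By the article's own numbers, that is roughly 125 kW of chip power per satellite and on the order of 125 GW across the full million-satellite mesh.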

This Brain Experiment Made People Choose Others Over Themselves

Scientists found that synchronizing activity between two brain regions made people more generous.

A new study suggests that synchronizing activity in specific parts of the brain can make people more likely to act generously. Research published today (February 10) in the open-access journal PLOS Biology reports that stimulating two brain regions in a coordinated way increased altruistic behavior. The study was led by Jie Hu of East China Normal University in China, working with colleagues from the University of Zurich in Switzerland.

Why some people are more altruistic than others.
