
Microsoft-backed start-up raises $40 million for helium atom beam lithography that could print chips at atomic resolution — 0.1nm beam is 135 times narrower than ASML’s EUV light

Lace Lithography, a Norwegian start-up backed by Microsoft, raised $40 million in Series A funding on Monday to develop a chipmaking tool that uses a helium atom beam instead of light to pattern silicon wafers, Reuters reported. The company claims its technology can create chip features 10 times smaller than current lithography systems, with a beam width of just 0.1 nanometers compared to the 13.5nm wavelength used by ASML’s EUV scanners. Lace aims to have a test tool running in a pilot fab by 2029.

The advantage of Lace’s system is that atoms don’t have a diffraction limit, whereas photon-based lithography, including ASML’s EUV systems, is constrained by the wavelength of the light it uses. As chipmakers push features smaller, they rely on increasingly complex multi-patterning techniques to work around that limit, but Lace sidesteps the problem entirely by replacing photons with neutral helium atoms and a beam measuring roughly the width of a single hydrogen atom.
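To put rough numbers on that: the Rayleigh criterion ties the resolution of photon optics to the wavelength of the light, while for a beam of neutral atoms the relevant length scale is the much smaller de Broglie wavelength. The back-of-envelope sketch below shows why a thermal helium beam lands near 0.1 nm; the 0.33 numerical aperture and 300 K beam temperature are illustrative assumptions, not figures from the article.

```python
import math

H = 6.62607015e-34    # Planck constant (J*s)
K_B = 1.380649e-23    # Boltzmann constant (J/K)
M_HE = 6.6464731e-27  # mass of a helium-4 atom (kg)

# Photon-based lithography: resolution is diffraction-limited,
# roughly lambda / (2 * NA) by the Rayleigh criterion.
wavelength_euv = 13.5e-9
na = 0.33  # numerical aperture, typical of current EUV optics (assumption)
euv_limit = wavelength_euv / (2 * na)
print(f"EUV diffraction limit ~ {euv_limit * 1e9:.1f} nm")

# Neutral atoms: the relevant scale is the de Broglie wavelength
# lambda = h / (m * v), far below any optical wavelength.
v = math.sqrt(2 * K_B * 300 / M_HE)  # most probable thermal speed at 300 K
lam = H / (M_HE * v)
print(f"He de Broglie wavelength ~ {lam * 1e9:.2f} nm")
```

The helium wavelength comes out around 0.09 nm, consistent with the ~0.1 nm beam width the company cites.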

Teleportation is no longer just science fiction—at the quantum level


Inspired by science fiction, physicists landed on the name "quantum teleportation." Since then, the idea has gone from theoretical concept to experimentally verified reality. The first experiments in the late 1990s showed that quantum states could be transmitted across short distances, while subsequent research proved it works across ever greater distances, even to and from low Earth orbit, as Chinese scientists demonstrated in 2017. They achieved quantum teleportation by taking advantage of quantum entanglement, a natural phenomenon in which tiny particles become linked with each other no matter how far apart they are.

Quantum teleportation is very different from the teleportation of matter we see in fiction. It involves transferring a quantum state without moving any matter. And while experts say it won’t lead to Star Trek-esque beaming, it could help bring about a new era of computing that revolutionizes our understanding of the subatomic world—and by extension, of the nature of the universe and everything within it.
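For a single qubit, the protocol those experiments implement can be simulated exactly on a classical computer. The NumPy sketch below walks through the standard textbook teleportation circuit: the sender entangles the message qubit with half of a shared Bell pair, measures, and sends two classical bits; the receiver applies Pauli corrections and recovers the state. The specific state `psi` is an arbitrary example, not anything from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Single-qubit gates and projectors
I2 = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])
P0 = np.array([[1, 0], [0, 0]])  # projector onto |0>
P1 = np.array([[0, 0], [0, 1]])  # projector onto |1>

def kron3(a, b, c):
    return np.kron(np.kron(a, b), c)

# Arbitrary normalized state to teleport
psi = np.array([0.6, 0.8j])

# Qubit 0: message; qubits 1 and 2: Bell pair shared by sender and receiver
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)  # (|00> + |11>)/sqrt(2)
state = np.kron(psi, bell)                  # full 3-qubit state vector

# Sender: CNOT (qubit 0 controls qubit 1), then Hadamard on qubit 0
cnot01 = kron3(P0, I2, I2) + kron3(P1, X, I2)
state = kron3(H, I2, I2) @ cnot01 @ state

# Sender measures qubits 0 and 1, obtaining two classical bits
probs = np.abs(state) ** 2
outcome = rng.choice(8, p=probs / probs.sum())
m0, m1 = (outcome >> 2) & 1, (outcome >> 1) & 1
proj = kron3(P1 if m0 else P0, P1 if m1 else P0, I2)
state = proj @ state
state /= np.linalg.norm(state)

# Receiver applies Pauli corrections based on the two classical bits
if m1:
    state = kron3(I2, I2, X) @ state
if m0:
    state = kron3(I2, I2, Z) @ state

# Read off qubit 2: only amplitudes consistent with (m0, m1) survive
base = (m0 << 2) | (m1 << 1)
received = state[[base, base + 1]]
fidelity = abs(np.vdot(psi, received)) ** 2
print(f"measured bits: {m0}{m1}, fidelity: {fidelity:.6f}")
```

Note what moved: only two classical bits travel from sender to receiver, yet the receiver's qubit ends up in the original state, which no longer exists on the sender's side.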

New computational biology for genome sequencing analysis

To improve the ability of metapipeline-DNA to determine where changes in the genome have occurred, the scientists worked with the Genome in a Bottle Consortium led by the U.S. Department of Commerce’s National Institute of Standards and Technology. By incorporating this public-private-academic consortium’s meticulously validated resources, the researchers reduced the rate of false positives without reducing the tool’s precision in finding true genetic variants.

The researchers also produced two case studies demonstrating the pipeline's capabilities for cancer research. The investigators used metapipeline-DNA to analyze sequencing data from five patients who donated both normal tissue and tumor samples, as well as another five from The Cancer Genome Atlas.

The next step is to get metapipeline-DNA into more labs to accelerate discoveries, and to continue improving the resource with more user feedback.


In a single experiment, scientists can decipher the entire genomes of many patient samples, animal models or cultured cells. To fully realize the potential to study biology at this unprecedented scale, researchers must be equipped to analyze the titanic troves of data generated by these new methods.

Scientists published findings in Cell Reports Methods discussing building and testing a new computational tool for tackling massive and complex sequencing datasets. The new resource, named metapipeline-DNA, may also make sequencing data analysis more standardized across different research labs.

The sequence of a single human genome represents about 100 gigabytes of raw data, the rough equivalent of 20,000 smartphone photos. The sheer scale of experimental data increases significantly as tens or hundreds of genomes are added into the mix.
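The article's figures make for a quick back-of-envelope calculation. A minimal sketch, assuming roughly 5 MB per smartphone photo (the assumption that makes 100 GB come out to about 20,000 photos):

```python
# Illustrative data-volume arithmetic from the figures in the text:
# ~100 GB of raw data per genome; ~5 MB per smartphone photo (assumed).
GENOME_GB = 100
PHOTO_MB = 5

photos_equiv = GENOME_GB * 1000 // PHOTO_MB
print(f"1 genome ≈ {photos_equiv:,} photos")

# The scale problem: cohorts of tens to hundreds of genomes
for n in (1, 10, 100):
    print(f"{n:>3} genomes ≈ {n * GENOME_GB / 1000:.1f} TB of raw data")
```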

Ultrastructural preservation of a whole large mammal brain with a protocol compatible with human physician-assisted death

Ultrastructural Preservation of a Whole Large Mammal Brain (bioRxiv, 2026) ⚠️ Preprint – not yet peer-reviewed.

A 2026 preprint builds on over a decade of brain preservation research, demonstrating that whole mammalian brains (pigs) can be preserved with remarkable structural fidelity under near–real-world, end-of-life conditions.

The study refines aldehyde-stabilized cryopreservation (ASC), a technique previously recognized by the Brain Preservation Foundation. This method combines chemical fixation (aldehydes), cryoprotectants, and controlled cooling to prevent ice damage and preserve neural structure at the nanoscale.

What the study shows:

Whole pig brains preserved with intact cellular and synaptic architecture.

Preservation remains viable even with delayed postmortem intervals (~10 minutes).

Tissue remains perfusable and structurally stable after fixation.

Protocol moves toward clinically realistic implementation, not just lab conditions.

Physicists just turned glass into a powerful quantum security device

Scientists have turned simple glass into a powerful quantum communication device that could safeguard data against future quantum attacks. The chip combines stability, speed, and versatility—handling both ultra-secure encryption and record-breaking random number generation in one compact system.

Electric current stabilizes spins at unstable points for new types of computing

A research team has discovered a new way to control tiny magnetic properties inside materials using electric current, which could possibly pave the way for new types of computing technologies. The work is based on spintronics, a field that uses not only the electric charge of electrons but also their “spin,” a quantum property that can be thought of as a tiny magnet.

Spintronics is already used in magnetic random access memory (MRAM), a type of memory that keeps data even when the power is turned off. This is different from conventional memory, which loses information without electricity.

In MRAM, data is stored depending on whether spins point "up" or "down." These two stable states are separated by an energy barrier, which keeps the stored data from flipping spontaneously. However, the same stability also makes it harder to switch between states, requiring strong electric currents.
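The trade-off between retention and switchability can be made concrete with the Néel–Arrhenius relation, which estimates how long a spin stays put as a function of barrier height. A minimal sketch; the 1 ns attempt time is a commonly assumed order of magnitude, not a figure from the article.

```python
import math

TAU0 = 1e-9  # attempt time, ~1 ns (typical order-of-magnitude assumption)

def retention_time_s(delta):
    """Néel–Arrhenius estimate of the mean time before a thermal flip,
    where delta = energy barrier / (k_B * T)."""
    return TAU0 * math.exp(delta)

# A taller barrier means exponentially longer retention, but also a
# harder (more current-hungry) switch between the two states.
for delta in (40, 60, 80):
    years = retention_time_s(delta) / (3600 * 24 * 365)
    print(f"barrier Δ = {delta} k_B·T: retention ~ {years:.2e} years")
```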

A Hall ‘rectenna’ can detect signals over a 100 GHz frequency range

Many current wireless communication, imaging and sensing technologies rely on components that convert oscillating electric and magnetic fields (i.e., electromagnetic waves) into electrical signals. Among the most widely used components are so-called p-n diodes, semiconductor devices that combine two types of materials with distinct electrical properties.

In conventional diode designs, the conversion of electromagnetic waves into electrical signals relies on the nonlinear transport of electrons. This means that the electric current in the devices does not change proportionally with the voltage applied, which allows them to rectify signals (i.e., convert alternating current into direct current) and combine signals with different frequencies.
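That nonlinearity is captured by the Shockley diode equation, I = I_s(exp(V/nV_T) − 1). The sketch below drives such a diode with a zero-mean sine wave and shows that the time-averaged current is nonetheless positive, i.e., a DC component appears from an AC input. The saturation current and thermal voltage are illustrative room-temperature values, not figures from the article.

```python
import math

I_S = 1e-12    # saturation current (A), illustrative value
N_VT = 0.0259  # n * thermal voltage at room temperature (V), with n = 1

def diode_current(v):
    """Shockley diode equation: current responds nonlinearly to voltage."""
    return I_S * (math.exp(v / N_VT) - 1.0)

# Average the current over one cycle of a zero-mean sinusoidal drive.
# A linear device would average to zero; the exponential nonlinearity
# leaves a net positive (rectified) DC term.
samples = 10_000
avg = sum(diode_current(0.05 * math.sin(2 * math.pi * k / samples))
          for k in range(samples)) / samples
print(f"mean rectified current: {avg:.3e} A")
```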

A key limitation of traditional diodes is that thermal effects introduce noise, causing electrons to move randomly and making weak signals harder to detect. Moreover, electrons typically take a finite time to travel across the device, also known as the transit time, which limits the performance of the diodes at very high frequencies.

Superconducting quantum processor performs well with significantly less wiring

Quantum computers, computing systems that process information using quantum mechanical effects, could outperform classical computers on some computational tasks. These computers rely on qubits, the basic units of quantum information, which can exist in multiple states (0, 1 or both simultaneously), due to quantum effects known as superposition and entanglement.

Many of the quantum computers developed in recent years are based on conventional superconductors, materials that exhibit an electrical resistance of zero at extremely low temperatures. To operate reliably and exhibit superconductivity, circuits based on these materials need to be cooled down to millikelvin temperatures.

In quantum computers, each qubit typically requires its own control line. This means that engineers need to introduce several wires that carry electrical pulses (i.e., signal lines), and the number of necessary wires increases with the number of qubits. As quantum computers grow larger, this can be problematic, as processors become harder to build and reliably operate.

Quantum computers could have a fundamental limit after all

The performance of quantum computers could cap out after around 1,000 qubits, according to a new analysis published in the Proceedings of the National Academy of Sciences. Through new calculations, Tim Palmer at the University of Oxford has reconsidered the mathematical foundations underlying the quantum principles behind the technology, concluding that restrictions on the information-carrying capacity of large quantum systems could make their computing power far more limited than many researchers predict.

For some time, quantum physicists have been growing increasingly excited—and concerned—about the seemingly limitless potential of quantum computers. In a classical computer, information content generally grows linearly as the number of bits increases. But in a quantum computer, each extra qubit doubles the number of quantum states the system can occupy.

Since these states can encode multiple possibilities at the same time, the overall system appears to become exponentially more powerful with each added qubit—at least according to our current understanding of quantum mechanics.
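That exponential growth is easy to see by counting the memory a classical machine would need just to store a quantum state vector. A minimal sketch, assuming one double-precision complex amplitude (16 bytes) per basis state:

```python
# Memory needed to store a full quantum state vector classically:
# each extra qubit doubles the number of complex amplitudes (2**n).
BYTES_PER_AMPLITUDE = 16  # one complex number at double precision (assumed)

for n in (10, 30, 50):
    amplitudes = 2 ** n
    gib = amplitudes * BYTES_PER_AMPLITUDE / 2**30
    print(f"{n} qubits: {amplitudes:,} amplitudes ≈ {gib:,.0f} GiB")
```

At 30 qubits the state vector already needs 16 GiB; at 50 qubits it needs millions, which is why each added qubit appears to multiply the system's capacity rather than add to it.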
