A new method for checking the reliability of a quantum computer can be scaled up to devices of any size.
When two black holes merge or two neutron stars collide, gravitational waves can be generated. They spread at the speed of light and cause tiny distortions in space-time. Albert Einstein predicted their existence, and the first direct experimental observation dates from 2015.
Now, Prof. Ralf Schützhold, theoretical physicist at the Helmholtz-Zentrum Dresden-Rossendorf (HZDR), is going one step further. He has conceived an experiment through which gravitational waves can not only be observed but even manipulated. Published in the journal Physical Review Letters, the idea could also deliver new insights into the hitherto only conjectured quantum nature of gravity.
“Gravity affects everything, including light,” says Schützhold. And this interaction also occurs when gravitational waves and light waves meet.
At the heart of every camera is a sensor, whether that sensor is a collection of light-detecting pixels or a strip of 35-millimeter film. But what happens when you want to take a picture of something so small that the sensor itself has to shrink down to sizes that cause the sensor’s performance to crater?
Now, Northeastern University researchers have made a breakthrough in sensing technology that allows them to detect objects as small as individual proteins or single cancer cells without having to scale down the sensor itself. Their approach uses guided acoustic waves and specialized states of matter to achieve high precision at extremely small scales.
The device, which is about the size of a belt buckle, opens up possibilities for sensing at both the nano and quantum scales, with repercussions for everything from quantum computing to precision medicine.
Jim Al-Khalili explores emerging technologies powering the future of quantum, and looks at how we got here.
This Discourse was recorded at the Ri on 7 November 2025, in partnership with the Institute of Physics.
Physicist and renowned broadcaster Jim Al-Khalili takes a look back at a century of quantum mechanics, the strangest yet most successful theory in all of science, and how it has shaped our world. He also looks forward to the exciting new world of Quantum 2.0 and how a deeper understanding of such counterintuitive concepts as quantum superposition and quantum entanglement is leading to the development of entirely new technologies, from quantum computers and quantum sensors to quantum cryptography and the quantum internet.
The United Nations has proclaimed 2025 as the International Year of Quantum Science and Technology, to celebrate the centenary of quantum mechanics and the revolutionary work of the likes of Werner Heisenberg and Erwin Schrödinger. Together with the Institute of Physics, join us to celebrate the culmination of the International Year of Quantum at the penultimate Discourse of our Discover200 year.
Researchers in the US have used quantum chemistry to explain why ozone catalysts degrade during water electrolysis.
Quantum technologies are systems that leverage quantum mechanical effects to perform computations, share information or carry out other tasks. These systems rely on quantum states, which need to be reliably transferred and protected against decoherence (i.e., a gradual loss of quantum information).
In recent years, quantum physicists and engineers have introduced so-called giant atoms, artificial structures that behave like enlarged atoms and could be used to develop quantum technologies. In a recent paper published in Physical Review Letters, researchers at Chalmers University of Technology built on this concept and introduced carefully engineered giant superatoms (GSAs), a new type of giant-atom-like structure that could generate entanglement and enable the reliable transfer of quantum states between devices.
“Over the past years, there has been growing interest in so-called ‘giant atoms,’ which are quantum emitters that couple to their environment at multiple, spatially separated points,” Lei Du, first author of the paper, told Phys.org.
A new computational approach developed at the University of Chicago promises to shed light on some of the world’s most puzzling materials—from high-temperature superconductors to solar cell semiconductors—by uniting two long-divided scientific perspectives.
“For decades, chemists and physicists have used very different lenses to look at materials. What we’ve done now is create a rigorous way to bring those perspectives together,” said senior author Laura Gagliardi, Richard and Kathy Leventhal Professor in the Department of Chemistry and the Pritzker School of Molecular Engineering. “This gives us a new toolkit to understand and eventually design materials with extraordinary properties.”
When it comes to solids, physicists usually think in terms of broad, repeating band structures, while chemists focus on the local behavior of electrons in specific molecules or fragments. But many important materials—such as organic semiconductors, metal–organic frameworks, and strongly correlated oxides—don’t fit neatly into either picture. In these materials, electrons are often thought of as hopping between repeating fragments rather than being distributed across the material.
University of Iowa scientists have identified a new way to “purify” photons, a development that could improve both the efficiency and security of optical quantum technologies.
The team focused on two persistent problems that stand in the way of producing a reliable stream of single photons, which are essential for photonic quantum computers and secure communication systems. The first issue, known as laser scatter, arises when a laser is aimed at an atom to trigger the release of a photon, the basic unit of light. Although this method successfully generates photons, it can also produce extra, unwanted ones. These additional photons reduce the efficiency of the optical system, similar to how stray electrical currents interfere with electronic circuits.
A second complication comes from the way atoms sometimes respond to laser light: occasionally, an atom releases more than one photon at the same time. When this happens, the precision of the optical circuit suffers because the extra photons disrupt the intended orderly flow of single photons.
Science has a rich tradition of physics by imagination. From the 16th century, scientists and philosophers have conjured ‘demons’ that test the limits of our strongest theories of reality.
Three stand out today: Laplace’s demon, capable of perfectly predicting the future; Loschmidt’s demon, which could reverse time and violate the second law of thermodynamics; and Maxwell’s demon, which could create a working heat engine at no cost.
Though imaginary, these paradoxical beings have pushed physicists towards sharper theories. From quantum theory to thermodynamics, these demons have legacies that we still feel today.
Three thought experiments involving “demons” have haunted physics for centuries. What should we make of them today?
Electron avalanche multiplication can enable an all-optical modulator controlled by single photons.
Light is an incredible tool for fast, efficient communication and futuristic quantum computers, but it is notoriously hard to control at such delicate, single-photon levels, which has long been a major hurdle in optics.