
Together with an international team of researchers from the Universities of Southern California, Central Florida, Pennsylvania State and Saint Louis, physicists from the University of Rostock have developed a novel mechanism to safeguard a key resource in quantum photonics: optical entanglement. Their discovery is published in Science.

Declared the International Year of Quantum Science and Technology by the United Nations, 2025 marks 100 years since the initial development of quantum mechanics. As this strange and beautiful description of nature on the smallest scales continues to fascinate and puzzle physicists, its quite tangible implications form the basis of modern technology and are currently revolutionizing information science and communications.

A key resource for quantum computation is so-called entanglement, which underpins the protocols and algorithms that make quantum computers exponentially more powerful than their classical predecessors. Moreover, entanglement allows for the secure distribution of encryption keys, and entangled photons provide sensitivity and noise resilience that dramatically exceed the classical limit.

Researchers have discovered a way to protect quantum information from environmental disruptions, offering hope for more reliable future technologies.

In their study published in Nature Communications, the scientists have shown how certain quantum states can maintain their critical information even when disturbed by noise. The team includes researchers from the University of the Witwatersrand (Wits University) in Johannesburg, South Africa, in collaboration with Huzhou University in China.

“What we’ve found is that topology is a powerful resource for information encoding in the presence of noise,” says Professor Andrew Forbes from the Wits School of Physics.

Qubit-based simulations of gauge theories are challenging as gauge fields require high-dimensional encoding. Now a quantum electrodynamics model has been demonstrated using trapped-ion qudits, which encode information in multiple states of ions.

For the first time, theoretical physicists from the Institute of Theoretical Physics (IPhT) in Paris-Saclay have completely determined the statistics that can be generated by a system using quantum entanglement. This achievement paves the way for exhaustive test procedures for quantum devices.

The study is published in the journal Nature Physics.

After the advent of transistors and lasers, the entanglement of quantum objects—as varied as photons, electrons and superconducting circuits—is at the heart of a second quantum revolution, with quantum computing in sight.

Researchers from the University of Science and Technology of China (USTC) of the Chinese Academy of Sciences revealed that not all forms of quantum nonlocality guarantee intrinsic randomness. They demonstrated that violating two-input Bell inequalities is both necessary and sufficient for certifying randomness, but this equivalence breaks down in scenarios involving multiple inputs. The study is published in Physical Review Letters.

Quantum mechanics is inherently probabilistic, and this intrinsic randomness has been leveraged for applications like random number generation. However, ensuring the security of these random numbers in real-world scenarios is challenging due to potential vulnerabilities in the devices used.

Bell nonlocality, where particles exhibit correlations that cannot be explained by classical physics, offers a way to certify randomness without trusting the devices. Previous studies have shown that violating Bell inequalities can certify randomness in simple two-input, two-output systems. However, the applicability of this principle to more complex, multiple-input, multiple-output (MIMO) systems has been unclear.
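To make the two-input, two-output case concrete: the canonical Bell test is the CHSH inequality, in which any local hidden-variable model satisfies |S| ≤ 2, while quantum measurements on an entangled pair can reach 2√2 ≈ 2.83 (the Tsirelson bound). The short sketch below is an illustration of that gap, not the USTC team's code; it evaluates the quantum CHSH value for a singlet state with the standard measurement angles.

```python
import math

def E(a, b):
    # Quantum correlation of spin measurements along directions a and b
    # (in radians) on the singlet state: E(a, b) = -cos(a - b).
    return -math.cos(a - b)

# Standard CHSH measurement settings
a1, a2 = 0.0, math.pi / 2           # Alice's two settings
b1, b2 = math.pi / 4, -math.pi / 4  # Bob's two settings

# CHSH combination of the four correlators
S = E(a1, b1) + E(a1, b2) + E(a2, b1) - E(a2, b2)

print(abs(S))  # ≈ 2.828, i.e. 2*sqrt(2): above the classical bound of 2
```

Observing |S| > 2 in an experiment rules out any classical explanation of the correlations, which is what makes device-independent randomness certification possible in this simplest scenario.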

In a new paper in Nature, a team of researchers from JPMorganChase, Quantinuum, Argonne National Laboratory, Oak Ridge National Laboratory and The University of Texas at Austin describe a milestone in the field of quantum computing, with potential applications in cryptography, fairness and privacy.

Using a 56-qubit quantum computer, they have for the first time experimentally demonstrated certified randomness, a way of generating random numbers from a quantum computer and then using a classical supercomputer to prove they are truly random and freshly generated. This could pave the way toward the use of quantum computers for a practical task unattainable through classical methods.

Scott Aaronson, Schlumberger Centennial Chair of Computer Science and director of the Quantum Information Center at UT Austin, invented the certified randomness protocol that was demonstrated. He and his former postdoctoral researcher, Shih-Han Hung, provided theoretical and analytical support to the experimentalists on this latest project.

For decades, researchers have explored how electrons behave in quantum materials. Under certain conditions, electrons interact strongly with each other instead of moving independently, leading to exotic quantum states. One such state, first proposed by Nobel laureate Eugene Wigner, is the Wigner crystal—a structured electron arrangement caused by their mutual repulsion. Although widely theorized, experimental proof has been rare.

Researchers at Yonsei University have now provided evidence of Wigner crystallization and the associated electronic rotons. In a study published in the journal Nature, Prof. Keun Su Kim and his team used angle-resolved photoemission spectroscopy (ARPES) to analyze black phosphorus doped with alkali metals. Their data revealed aperiodic energy variations, a hallmark of electronic rotons.

Crucially, as they decreased the dopant density within the material, the roton energy gap shrank to zero. This observation confirmed a transition from a fluid-like quantum state to a structured electron lattice, characteristic of Wigner crystallization.

Highly charged heavy ions form a very suitable experimental field for investigating quantum electrodynamics (QED), the best-tested theory in physics describing all electrical and magnetic interactions of light and matter. A crucial property of the electron within QED is the so-called g factor, which precisely characterizes how the particle behaves in a magnetic field.
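For context (standard textbook material, not a formula from the study): the g factor relates the electron's magnetic moment to its spin, and QED predicts small corrections to the Dirac value of exactly 2,

```latex
\vec{\mu} = -\, g \, \frac{e}{2 m_e} \, \vec{S},
\qquad
g = 2\left(1 + \frac{\alpha}{2\pi} + \cdots\right) \approx 2.00232 .
```

For an electron bound in a highly charged ion, additional binding corrections in powers of $Z\alpha$ become significant, which is precisely what these precision measurements put to the test.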

Recently, the ALPHATRAP group led by Sven Sturm in the division of Klaus Blaum at the Max-Planck-Institut für Kernphysik (MPIK) in Heidelberg measured the g factor of hydrogen-like tin ions on a precision level of 0.5 parts per billion, which is like measuring the distance from Cologne to Frankfurt with precision down to the thickness of a human hair. This is a stringent test of QED for the simplest atomic system, just like conventional hydrogen but with a much higher electric field experienced by the electron due to the charge of 50 protons inside the tin nucleus.
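The analogy can be sanity-checked with back-of-the-envelope arithmetic; the 190 km distance and the hair width used below are approximate assumptions, not figures from the study.

```python
# Sanity check of the precision analogy (approximate values assumed)
distance_m = 190e3   # Cologne to Frankfurt, roughly 190 km by road
fraction = 0.5e-9    # 0.5 parts per billion

uncertainty_m = distance_m * fraction
print(uncertainty_m * 1e3)  # ≈ 0.095 mm, about the thickness of a human hair
```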

In a new study published in Physical Review Letters, researchers have now tackled highly charged boron-like tin ions with only five remaining electrons, with the goal of studying inter-electronic effects in the boron-like configuration. So far, the boron-like g factor has been measured with high precision only for argon ions, with a proton number Z of 18. However, the nucleus is not a point charge like the electron, and its charge distribution leads to finite nuclear size corrections—another challenge for precision experiments.

Georgia Tech researchers recently proposed a method for generating quantum entanglement between photons, a breakthrough with potentially transformative consequences for the future of photonics-based quantum computing.

“Our results point to the possibility of building quantum computers using light by taking advantage of this entanglement,” said Chandra Raman, a professor in the School of Physics. The research is published in the journal Physical Review Letters.

Quantum computers have the potential to outperform their conventional counterparts, becoming the fastest programmable machines in existence. Entanglement is the key resource for building these quantum computers.