
This scientist is unlocking the potential of quantum technologies. Here’s how

Chemical biology professor Suyang Xu works to crack the secrets of new states of matter.


Throughout human history, most of our efforts to store information, from knots and oracle bones to bamboo markings and the written word, have boiled down to two techniques: using either characters or shapes to represent it. Today, huge amounts of information are stored as zeros and ones on silicon wafers, but a new class of materials at the border of quantum chemistry and quantum physics could enable vast improvements in storage.

Suyang Xu, assistant professor of chemical biology, is tying quantum mechanical “knots” in topological materials, which may be the key to unlocking the potential of quantum technologies to store and process vast arrays of information and bring game-changing advances in a variety of fields.

“Imagine a rope identified by a number of knots,” Xu said. “No matter how much the shape of the rope is changed, the number of knots — known as the topological number — cannot be changed without altering its fundamental identity by adding or undoing knots.” It is this robustness that potentially makes topological materials particularly useful.
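
To make the knot analogy concrete, here is a minimal numerical sketch, our illustration rather than anything from Xu's research: it computes a winding number, one of the simplest topological numbers, for the textbook two-band SSH chain with hopping parameters v and w (the names and parameters are ours). Smoothly nudging the parameters reshapes the band structure but leaves the integer unchanged, until the "knot" is undone at |v| = |w|.

import numpy as np

def winding_number(v, w, n_k=2000):
    # Winding of the vector (h_x, h_y) = (v + w*cos k, w*sin k) around the origin
    # as k sweeps the Brillouin zone; this integer is the topological number.
    k = np.linspace(-np.pi, np.pi, n_k, endpoint=False)
    theta = np.arctan2(w * np.sin(k), v + w * np.cos(k))
    dtheta = np.diff(np.append(theta, theta[0]))        # close the loop
    dtheta = (dtheta + np.pi) % (2 * np.pi) - np.pi     # wrap each step to (-pi, pi]
    return int(round(dtheta.sum() / (2 * np.pi)))

print(winding_number(v=0.5, w=1.0))   # 1: topological phase (|w| > |v|)
print(winding_number(v=0.7, w=1.0))   # still 1: deforming the rope leaves the knot intact
print(winding_number(v=1.5, w=1.0))   # 0: trivial phase, the knot has been undone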

Fermilab Says Particle Is Heavy Enough to Break the Standard Model

If the W boson’s excess heft relative to the standard theoretical prediction can be independently confirmed, the finding would imply the existence of undiscovered particles or forces and would bring about the first major rewriting of the laws of quantum physics in half a century.

“This would be a complete change in how we see the world,” potentially even rivaling the 2012 discovery of the Higgs boson in significance, said Sven Heinemeyer, a physicist at the Institute for Theoretical Physics in Madrid who is not part of CDF (the Collider Detector at Fermilab collaboration). “The Higgs fit well into the previously known picture. This one would be a completely new area to be entered.”

The finding comes at a time when the physics community hungers for flaws in the Standard Model of particle physics, the long-reigning set of equations capturing all known particles and forces. The Standard Model is known to be incomplete, leaving various grand mysteries unsolved, such as the nature of dark matter. The CDF collaboration’s strong track record makes its new result a credible threat to the Standard Model.

ATLAS observes pairs of tau particles in heavy-ion collisions

Today at the Quark Matter 2022 conference, the ATLAS Collaboration announced the observation of tau-lepton pairs created when particles of light – or photons – interact during lead-ion collisions. The result opens a new avenue for measuring how magnetic the tau lepton is – a property sensitive to new particles beyond the Standard Model.

In everyday life, two crossing beams of light follow the rules of classical electrodynamics and do not deflect, absorb or disrupt one another. But in quantum electrodynamics, things are different. Lead ions accelerated to high energy by the LHC are surrounded by an enormous flux of photons. For a short moment, these photons can interact and transform into a particle–antiparticle pair, such as a pair of tau leptons. Such photon–photon interactions take place in so-called ultra-peripheral collisions, which ATLAS physicists used to observe light-by-light scattering in 2019.

Rather than colliding head-on at the centre of the ATLAS detector, the accelerated lead ions pass by each other unscathed. This provides a uniquely clean environment for physicists to study collisions of photons into a pair of tau leptons. Further, the rate of tau-lepton creation scales as the fourth power of the number of protons in the ion, which is 82 for lead.
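
As a rough back-of-the-envelope illustration (ours, not part of the ATLAS release): the equivalent photon flux carried by a relativistic ion grows as the square of its charge Z, so a two-photon process such as tau-pair production picks up that factor from each ion, giving the fourth power of Z overall.

Z_LEAD = 82                              # number of protons in a lead ion
enhancement = Z_LEAD ** 4                # photon flux ~ Z**2 per ion, two ions -> Z**4
print(f"Z^4 for lead: {enhancement:,}")  # 45,212,176, roughly 4.5e7 versus a single proton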

The side effects of quantum error correction and how to cope with them

It is well established that quantum error correction can improve the performance of quantum sensors. But new theory work cautions that, unexpectedly, the approach can also give rise to inaccurate and misleading results, and shows how to rectify these shortcomings.

Quantum systems can interact with one another and with their surroundings in ways that are fundamentally different from those of their classical counterparts. In a quantum sensor, the particularities of these interactions are exploited to obtain characteristic information about the environment of the quantum system – for instance, the strength of a magnetic or electric field in which it is immersed. Crucially, when such a device suitably harnesses the laws of quantum mechanics, its sensitivity can surpass what is possible, even in principle, with conventional, classical technologies.
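
The standard way to quantify that advantage, included here as a generic illustration rather than a result from this particular study, is phase estimation with N probes: independent probes give an uncertainty scaling as 1/sqrt(N), the standard quantum limit, while maximally entangled probes can in principle reach 1/N, the Heisenberg limit.

import numpy as np

for N in (10, 100, 1000):
    sql = 1.0 / np.sqrt(N)   # standard quantum limit: N independent probes
    hl = 1.0 / N             # Heisenberg limit: N entangled probes
    print(f"N = {N:4d}: SQL ~ {sql:.4f}, Heisenberg ~ {hl:.4f}")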

Unfortunately, quantum sensors are exquisitely sensitive not only to the physical quantities of interest, but also to noise. One way to suppress these unwanted contributions is to apply schemes collectively known as quantum error correction (QEC). This approach is attracting considerable and increasing attention, as it might enable practical high-precision quantum sensors in a wider range of applications than is possible today. But the benefits of error-corrected quantum sensing come with major potential side effects, as a team led by Florentin Reiter, an Ambizione fellow of the Swiss National Science Foundation working in the group of Jonathan Home at the Institute for Quantum Electronics, has now found. Writing in Physical Review Letters, they report theoretical work in which they show that in realistic settings QEC can distort the output of quantum sensors and might even lead to unphysical results.
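
For readers new to QEC, the simplest scheme gives a feel for what "suppressing unwanted contributions" means. The sketch below is our toy example, not the trapped-ion scheme studied by the Zurich team: it simulates the three-qubit repetition code against bit flips, where majority voting turns a physical error probability p into a logical error probability of 3p^2 - 2p^3, an improvement whenever p < 1/2.

import numpy as np

rng = np.random.default_rng(0)

def logical_error_rate(p, shots=200_000):
    # Encode one logical bit into three physical bits; each flips independently
    # with probability p, and decoding is a simple majority vote.
    flips = rng.random((shots, 3)) < p
    return (flips.sum(axis=1) >= 2).mean()

for p in (0.01, 0.05, 0.10):
    predicted = 3 * p**2 - 2 * p**3
    print(f"p = {p:.2f}: simulated {logical_error_rate(p):.4f}, predicted {predicted:.4f}")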

Quantum Mereology: Factorizing Hilbert Space into Subsystems with Quasi-Classical Dynamics

We study the question of how to decompose Hilbert space into a preferred tensor-product factorization without any pre-existing structure other than a Hamiltonian operator, in particular the case of a bipartite decomposition into “system” and “environment.” Such a decomposition can be defined by looking for subsystems that exhibit quasi-classical behavior. The correct decomposition is one in which pointer states of the system are relatively robust against environmental monitoring (their entanglement with the environment does not continually and dramatically increase) and remain localized around approximately-classical trajectories. We present an in-principle algorithm for finding such a decomposition by minimizing a combination of entanglement growth and internal spreading of the system. Both of these properties are related to locality in different ways.
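
As a loose illustration of such an in-principle algorithm, here is a toy two-qubit sketch of our own, not the authors' procedure: candidate system/environment factorizations are generated by rotating a fixed tensor-product structure with global unitaries, and each candidate is scored by how much entanglement a short evolution generates from random product states. For a Hamiltonian that is nearly non-interacting in one particular split, that split should score lowest.

import numpy as np
from scipy.linalg import expm
from scipy.stats import unitary_group

rng = np.random.default_rng(1)

# A Hamiltonian that is almost non-interacting in the "natural" qubit-qubit split.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)
H = np.kron(sz, I2) + np.kron(I2, sx) + 0.05 * np.kron(sx, sx)

def entanglement_entropy(psi):
    # Von Neumann entropy (in bits) of the first factor of a two-qubit state.
    M = psi.reshape(2, 2)
    rho_A = M @ M.conj().T
    evals = np.linalg.eigvalsh(rho_A)
    evals = evals[evals > 1e-12]
    return float(-(evals * np.log2(evals)).sum())

def entanglement_growth(U, t=0.5, samples=20):
    # Score the factorization defined by the basis change U: average entanglement
    # generated from random product states after evolving for a short time t.
    prop = expm(-1j * (U.conj().T @ H @ U) * t)
    total = 0.0
    for _ in range(samples):
        a, b = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
        psi0 = np.kron(a / np.linalg.norm(a), b / np.linalg.norm(b))
        total += entanglement_entropy(prop @ psi0)
    return total / samples

candidates = [np.eye(4, dtype=complex)] + [unitary_group.rvs(4, random_state=k) for k in range(5)]
scores = [entanglement_growth(U) for U in candidates]
print("natural split score:", round(scores[0], 4))
print("random split scores:", [round(s, 4) for s in scores[1:]])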