
MIT physicists have developed a technique that lets them arrange atoms far closer together than previously possible, down to a mere 50 nanometers.


Proximity is key to many quantum phenomena, since interactions between atoms grow stronger as the particles draw closer. In many quantum simulators, scientists arrange atoms as close together as possible to explore exotic states of matter and build new quantum materials.
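To get a feel for why spacing matters so much, consider a dipole-dipole interaction, whose strength falls off as the cube of the distance between atoms. The sketch below is a back-of-the-envelope illustration under that assumption; the 500 nm reference spacing is a hypothetical comparison point, not a figure taken from the MIT work.

```python
# Illustrative only: dipole-dipole interaction strength scales as 1/r^3,
# so halving the atomic spacing makes the coupling eight times stronger.

def relative_dipole_coupling(r_nm: float, r_ref_nm: float = 500.0) -> float:
    """Interaction strength at spacing r_nm, relative to a reference spacing."""
    return (r_ref_nm / r_nm) ** 3

# Compare a hypothetical conventional spacing (~500 nm) with the 50 nm
# spacing reported in the MIT work.
for spacing in (500.0, 100.0, 50.0):
    print(f"{spacing:6.0f} nm -> {relative_dipole_coupling(spacing):8.1f}x")
```

Under this scaling, shrinking the spacing tenfold, from 500 nm to 50 nm, would strengthen such a coupling a thousandfold.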

Research published in Nature demonstrates high fidelity and uniformity in single-electron spin qubit control.

SANTA CLARA, Calif., May 1, 2024 (BUSINESS WIRE) -- Today, Nature published an Intel research paper, “Probing single electrons across 300-mm spin qubit wafers,” demonstrating state-of-the-art uniformity, fidelity, and measurement statistics of spin qubits. The industry-leading research opens the door to the mass production and continued scaling of silicon-based quantum processors, both of which are requirements for building a fault-tolerant quantum computer.

Quantum hardware researchers at Intel developed a 300-millimeter cryogenic probing process to collect high-volume data on the performance of spin qubit devices fabricated across whole wafers using complementary metal-oxide-semiconductor (CMOS) manufacturing techniques.

Scientists have adapted a device called a microwave circulator for use in quantum computers, allowing them for the first time to precisely tune the degree of nonreciprocity between a qubit, the fundamental unit of quantum computing, and a microwave-resonant cavity. The ability to tune nonreciprocity precisely is an important tool for quantum information processing. In doing so, the team derived a general and widely applicable theory that simplifies and expands upon older understandings of nonreciprocity, so that future work on similar topics can take advantage of the team’s model even when using different components and platforms.
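To make “degree of nonreciprocity” concrete, here is a minimal sketch, my own illustration rather than the team’s model: a two-port element is nonreciprocal when its scattering matrix is asymmetric (transmission forward differs from transmission backward), and the asymmetry can be parameterized by a tunable phase.

```python
import numpy as np

def two_port_scattering(phi: float) -> np.ndarray:
    """Lossless two-port element with a tunable nonreciprocal phase phi.

    For phi = 0 the element is reciprocal (S12 == S21); for phi != 0 the
    forward and backward paths pick up opposite phases, as in a
    gyrator-like element.
    """
    return np.array([[0.0, np.exp(-1j * phi)],
                     [np.exp(1j * phi), 0.0]])

def nonreciprocity(S: np.ndarray) -> float:
    """A simple measure of asymmetry between forward and backward paths."""
    return abs(S[1, 0] - S[0, 1])

# Sweep the phase from reciprocal (0) to maximally nonreciprocal (pi/2).
for phi in (0.0, np.pi / 4, np.pi / 2):
    S = two_port_scattering(phi)
    print(f"phi = {phi:.2f} rad -> nonreciprocity = {nonreciprocity(S):.3f}")
```

In this toy model the single knob phi plays the role of the tunable degree of nonreciprocity; the actual experiment involves a circulator coupling a qubit to a cavity, with its own control parameters.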

Bartosz Regula of the RIKEN Center for Quantum Computing and Ludovico Lami of the University of Amsterdam have shown, through probabilistic calculations, that there is indeed, as had been hypothesized, a rule of “entropy” for the phenomenon of quantum entanglement. The finding could drive a better understanding of quantum entanglement, a key resource underlying much of the power of future quantum computers. Despite decades of research in quantum information science, little is understood about the optimal ways to put it to use.

The second law of thermodynamics, which says that an isolated system can never move to a state of lower “entropy”, or disorder, is one of the most fundamental laws of nature and lies at the very heart of physics. It is what creates the “arrow of time,” and it tells us the remarkable fact that the dynamics of general physical systems, even extremely complex ones such as gases or black holes, are encapsulated by a single function, their “entropy.”
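As background (these are standard textbook definitions, not results from the Regula-Lami paper): for a quantum state the relevant function is the von Neumann entropy, and the entanglement entropy of a pure two-part state is the entropy of either half taken on its own.

```latex
% Von Neumann entropy of a quantum state \rho (standard definition):
\[
  S(\rho) = -\operatorname{Tr}\bigl(\rho \ln \rho\bigr)
\]
% Entanglement entropy of a pure bipartite state |\psi\rangle_{AB}:
% trace out subsystem B, then take the entropy of what remains.
\[
  \rho_A = \operatorname{Tr}_B\bigl(|\psi\rangle\!\langle\psi|_{AB}\bigr),
  \qquad
  E\bigl(|\psi\rangle_{AB}\bigr) = S(\rho_A)
\]
```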

There is a complication, however. The principle of entropy is known to apply to all classical systems, but today we are increasingly exploring the quantum world. We are now going through a quantum revolution, and it becomes crucially important to understand how we can extract and transform expensive and fragile quantum resources such as entanglement.

Quantum mechanics, the most potent theory physicists have developed, doesn’t make sense. What I mean by that statement is that quantum mechanics — which was developed to describe the microworld of molecules, atoms, and subatomic particles — leaves its users without a common-sense picture of what it describes. Full of what seem to be paradoxes and puzzles, quantum physics demands, for most scientists, an interpretation: a way of making sense of its mathematical formalism in terms of a concrete description of what exists in the world and how we interact with it. Unfortunately, after a century, not one but a basketful of “quantum interpretations” has been proposed. Which one is correct? Which one best captures what quantum physics has been trying to tell us these past 100 years?

In light of these questions, I’m beginning a series that explores the most radical of all the quantum interpretations, the one I think gets it right, or at least is pointed in the right direction. It is a relative newcomer to the scene, so you may not have heard of it. But it has been gaining a lot of attention recently because it doesn’t just ask us to reimagine how we view the science of atoms; it asks us to reimagine the process of science itself.

The term “QBism” began as shorthand for “Quantum Bayesianism” when this idea/theory/interpretation was first proposed in the late 1990s and early 2000s. The name hit the nail on the head because Bayesianism is a radical way of interpreting probabilities. The Bayesian approach to what we mean by probability differs sharply from what you learned in school about coin flips and dice rolls and how frequently a particular result can be expected to appear. Since probabilities lie at the heart of quantum mechanics, QBism zeroed in on a key aspect of the quantum formalism, one that other interpretations had missed or swept under the rug: how we interpret probabilities. We’re going to dig deep into all of this as the series goes along, but since today’s column is the introduction, let’s start with a 10,000-foot view of what’s at stake in the great “Quantum Interpretation Wars” so we can see where QBism fits in.
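To make the contrast concrete, here is a toy sketch of Bayesian updating, my own illustration rather than anything from the QBism literature: probability is treated as an agent’s degree of belief, revised by Bayes’ rule with each new observation. The two candidate coins and the run of flips below are hypothetical.

```python
# Toy Bayesian update (illustrative only): an agent's belief about whether a
# coin is fair (p = 0.5) or biased toward heads (p = 0.8), revised after
# each observed flip using Bayes' rule.

HEADS_PROB = {"fair": 0.5, "biased": 0.8}

def update(prior: dict, flip: str) -> dict:
    """Return posterior beliefs after observing one flip ('H' or 'T')."""
    likelihood = {h: p if flip == "H" else 1 - p for h, p in HEADS_PROB.items()}
    unnorm = {h: prior[h] * likelihood[h] for h in prior}
    total = sum(unnorm.values())
    return {h: v / total for h, v in unnorm.items()}

belief = {"fair": 0.5, "biased": 0.5}  # the agent starts undecided
for flip in "HHTHH":                   # a hypothetical run of flips
    belief = update(belief, flip)
    print(flip, {h: round(p, 3) for h, p in belief.items()})
```

A frequentist would instead define the coin’s bias as the long-run frequency of heads over many repeated flips; the Bayesian agent carries a belief that shifts flip by flip, and it is this reading of probability that QBism imports into quantum mechanics.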