The rise of quantum computing is more than a technological advancement; it marks a profound shift in the world of cybersecurity, especially when considering the actions of state-sponsored cyber actors. Quantum technology has the power to upend the very foundations of digital security, promising to dismantle current encryption standards, enhance offensive capabilities, and recalibrate the balance of cyber power globally. As leading nations like China, Russia, and others intensify their investments in quantum research, the potential repercussions for cybersecurity and international relations are becoming alarmingly clear.

Imagine a world where encrypted communications, long thought to be secure, could be broken in mere seconds. Today, encryption standards such as RSA and ECC rely on hard mathematical problems, integer factorization and elliptic-curve discrete logarithms respectively, that would take traditional computers thousands of years to solve. Quantum computing, however, changes this equation. Using quantum algorithms like Shor’s, a sufficiently powerful quantum computer could solve both problems efficiently, effectively rendering these encryption methods obsolete.
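To make the threat concrete, here is a minimal sketch, in Python, of why factoring defeats RSA: whoever knows the factors of the public modulus can immediately reconstruct the private key. The toy numbers are purely illustrative; Shor’s algorithm is what would make the factoring step feasible at real key sizes (2,048-bit moduli and up).

```python
# Toy RSA: factoring the public modulus n recovers the private key.
# Illustrative only; real keys use 2048+ bit moduli that only a large
# quantum computer running Shor's algorithm could factor efficiently.

p, q, e = 61, 53, 17           # toy primes and public exponent
n = p * q                      # public modulus (3233)
phi = (p - 1) * (q - 1)        # Euler's totient, computable once p, q are known
d = pow(e, -1, phi)            # private exponent, recovered from the factors

message = 42
ciphertext = pow(message, e, n)    # anyone can encrypt with the public (n, e)
recovered = pow(ciphertext, d, n)  # with d in hand, decryption is trivial
assert recovered == message
```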

This capability could give state actors the ability to decrypt communications, access sensitive governmental data, and breach secure systems in real time, transforming cyber espionage. Instead of months spent infiltrating networks and monitoring data flow, quantum computing could provide immediate access to critical information, bypassing traditional defenses entirely.

Strong interactions between subatomic particles like electrons occur when they are at a specific energy level known as a van Hove singularity. These interactions give rise to unusual properties in quantum materials, such as superconductivity at high temperatures, potentially ushering in exciting technologies of tomorrow.
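For context, a standard textbook result (not specific to this study): a van Hove singularity occurs where the band structure has a saddle point, and in a two-dimensional material the electronic density of states diverges logarithmically there,

$$g(E) \propto -\ln\left|E - E_{\mathrm{vH}}\right|,$$

which is why even modest electron–electron interactions are strongly amplified at that energy.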

Research suggests that topological materials, which allow electrons to flow only on their surfaces, are promising candidates for such technologies. However, their quantum properties remain relatively unexplored.

A study co-led by Nanyang Asst Prof Chang Guoqing of NTU’s School of Physical and Mathematical Sciences identified two types of van Hove singularities in the topological materials rhodium monosilicide (RhSi) and cobalt monosilicide (CoSi).

Achieving the full potential of quantum computing will require the development of quantum gates—circuits that carry out fundamental operations—with much higher fidelity than is currently available. An average gate fidelity surpassing 99.9%, for example, would enable not only efficient fault-tolerant quantum computing with error correction but also effective mitigation of errors in current noisy intermediate-scale quantum devices. In this work, we report on a two-qubit gate that achieves that milestone and sustains it for 12 h.

Superconducting qubits, with their ease of scalability and controllability, are prime candidates for building quantum processors. One type known as a transmon is renowned for its high coherence and ease of manufacturing and is thus already widely embraced in academia and industry. In general, single-qubit gates need negligible coupling between two transmon qubits, whereas two-qubit gates require a large coupling. This necessitates a coupling mechanism that can be tuned to both nearly zero and a very large value.

Various coupling schemes based on transmons have been shown to address this issue. Our work focuses on an innovative coupler known as the double-transmon coupler (DTC), which until now had been proposed only theoretically. We report the first experimental realization of the DTC, achieving gate fidelities of 99.9% for two-qubit gates and 99.98% for single-qubit gates, demonstrated using two transmons coupled by the DTC.
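For readers unfamiliar with how such numbers are obtained: average gate fidelities at this level are typically extracted from a randomized-benchmarking-style decay rather than measured directly. The sketch below fits synthetic data to the standard decay model and converts the decay constant to a fidelity; it is a generic illustration, not the authors' actual protocol.

```python
# Generic randomized-benchmarking (RB) fit: survival probability decays as
# A * p**m + B with sequence length m; the decay constant p maps to an
# average error per gate r = (d - 1) * (1 - p) / d, with d = 4 for two qubits.
# Data below are synthetic, for illustration only.
import numpy as np
from scipy.optimize import curve_fit

def rb_decay(m, A, p, B):
    return A * p**m + B

lengths = np.array([1, 5, 10, 20, 50, 100, 200])
true_p = 0.99867                       # corresponds to ~99.9% fidelity
rng = np.random.default_rng(0)
probs = 0.5 * true_p**lengths + 0.5 + rng.normal(0, 0.002, lengths.size)

(A, p, B), _ = curve_fit(rb_decay, lengths, probs, p0=[0.5, 0.99, 0.5])

d = 4                                  # Hilbert-space dimension, two qubits
r = (d - 1) / d * (1 - p)              # average error per gate
print(f"average gate fidelity ≈ {1 - r:.4%}")
```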

Unitary collapse of Schrödinger’s cat state https://journals.aps.org/pra/abstract/10.1103/PhysRevA.110.L030202

Schrödinger’s cat is the iconic thought experiment in which a cat in a box is both alive and dead until someone peeks.


The authors study a system composed of a single qubit coupled to a soft-mode quantum oscillator. They show that spontaneous unitary evolution of this system creates a Schrödinger-cat-like state of the oscillator, which is subsequently lost in a sudden process strongly resembling the measurement-induced collapse of the wave function.
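For reference, a Schrödinger-cat-like state of an oscillator is, in standard notation, a superposition of two coherent states of opposite phase,

$$|\psi_\pm\rangle \propto |\alpha\rangle \pm |-\alpha\rangle,$$

classical-looking in either branch alone but genuinely quantum in their coherent superposition.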

The identification of a new type of symmetry in statistical mechanics could help scientists derive and interpret fundamental relationships in this branch of physics.

Symmetry is a foundational concept in physics, describing properties that remain unchanged under transformations such as rotation and translation. Recognizing these invariances, whether intuitively or through complex mathematics, has been pivotal in developing classical mechanics, the theory of relativity, and quantum mechanics. For example, the celebrated standard model of particle physics is built on such symmetry principles. Now Matthias Schmidt and colleagues at the University of Bayreuth, Germany, have identified a new type of invariance in statistical mechanics (the theoretical framework that connects the collective behavior of particles to their microscopic interactions) [1]. With this discovery, the researchers offer a unifying perspective on subtle relationships between observable properties and provide a general approach for deriving new relations.

The concept of conserved, or time-invariant, properties has roots in ancient philosophy and was crucial to the rise of modern science in the 17th century. Energy conservation became a cornerstone of thermodynamics in the 19th century, when engineers uncovered the link between heat and work. Another important type of invariance is Galilean invariance, which states that the laws of physics are identical in all reference frames moving at a constant velocity relative to each other, resulting in specific relations between positions and velocities in different frames. Its extension, Lorentz invariance, posits that the speed of light is independent of the reference frame. Einstein’s special relativity is based on Lorentz invariance, while his general relativity broadens the idea to all coordinate transformations. These final examples illustrate that invariance not only provides relations between physical observables but can shape our understanding of space, time, and other basic concepts.

Researchers at Rice University have found a new way to improve a key element of thermophotovoltaic (TPV) systems, which convert heat into electricity via light. Using an unconventional approach inspired by quantum physics, Rice engineer Gururaj Naik and his team have designed a thermal emitter that can deliver high efficiencies within practical design parameters.

The research could inform the development of thermal-energy electrical storage, which holds promise as an affordable, grid-scale alternative to batteries. More broadly, efficient TPV technologies could facilitate renewable energy growth, an essential component of the transition to a net-zero world. Another major benefit of better TPV systems is recouping waste heat from industrial processes, making them more sustainable. To put this in context, 20–50% of the heat used to transform raw materials into consumer goods ends up being wasted, costing the United States economy over $200 billion annually.

TPV systems involve two main components: photovoltaic (PV) cells that convert light into electricity and thermal emitters that turn heat into light. Both of these components have to work well in order for the system to be efficient, but efforts to optimize them have focused more on the PV cell.

In an era where AI and data are driving the scientific revolution, quantum computing technology is emerging as another game-changer in the development of new drugs and new materials.

Dr. Hyang-Tag Lim’s research team at the Center for Quantum Technology at the Korea Institute of Science and Technology (KIST) has implemented a quantum computing algorithm that can estimate interatomic bond distances and ground state energies with chemical accuracy using fewer resources than conventional methods, and has succeeded in performing accurate calculations without the need for additional quantum error mitigation techniques.

The work is published in the journal Science Advances.
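The excerpt does not spell out the algorithm, but ground-state energy estimation on quantum hardware is commonly framed variationally, as in the variational quantum eigensolver (VQE): a parameterized circuit prepares a trial state, and a classical optimizer minimizes the measured energy. Below is a minimal, classically simulated sketch of that idea for a toy two-level Hamiltonian, an illustration of the principle rather than the KIST team's method.

```python
# Minimal VQE-style sketch (classically simulated): minimize the energy
# <psi(theta)|H|psi(theta)> of a toy 2x2 Hamiltonian over a one-parameter
# ansatz |psi(theta)> = Ry(theta)|0>. Illustrative only.
import numpy as np
from scipy.optimize import minimize_scalar

Z = np.array([[1.0, 0.0], [0.0, -1.0]])
X = np.array([[0.0, 1.0], [1.0, 0.0]])
H = Z + 0.5 * X                      # exact ground energy: -sqrt(1.25)

def energy(theta):
    psi = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    return psi @ H @ psi             # expectation value of H

res = minimize_scalar(energy, bounds=(0, 2 * np.pi), method="bounded")
print(f"variational energy:  {res.fun:.6f}")
print(f"exact ground energy: {-np.sqrt(1.25):.6f}")
```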

A research team led by Dominik Schneble, Ph.D., Professor in the Department of Physics and Astronomy, has uncovered a novel regime, or set of conditions within a system, for cooperative radiative phenomena, casting new light on a 70-year-old problem in quantum optics.

Their findings on previously unseen collective spontaneous emission effects, in an array of synthetic (artificial) atoms, are published in Nature Physics, accompanied by a theoretical paper in Physical Review Research.

Spontaneous emission is a phenomenon in which an excited atom falls to a lower-energy state and spontaneously emits a quantum of electromagnetic radiation in the form of a single photon. When a single excited atom decays and emits a photon, the probability of finding the atom in its excited state falls exponentially to zero as time progresses.
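In the standard single-atom (Wigner–Weisskopf) picture, that decay is exponential: the excited-state population follows

$$P_e(t) = e^{-\Gamma t},$$

where $\Gamma$ is the spontaneous-emission rate. The cooperative effects studied here arise when multiple emitters share the radiation field and this simple single-atom law no longer holds.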

DGIST and UNIST researchers have discovered a new quantum state, the exciton-Floquet synthesis state, enabling real-time quantum information control in two-dimensional semiconductors.

A research team led by Professor Jaedong Lee from the Department of Chemical Physics at DGIST (President Kunwoo Lee) has unveiled a groundbreaking quantum state and an innovative mechanism for extracting and manipulating quantum information through exciton and Floquet states.

Collaborating with Professor Noejung Park from UNIST’s Department of Physics (President Chongrae Park), the team has, for the first time, demonstrated the formation and synthesis process of exciton and Floquet states, which arise from light-matter interactions in two-dimensional semiconductors. This study captures quantum information in real-time as it unfolds through entanglement, offering valuable insights into the exciton formation process in these materials, thereby advancing quantum information technology.

AlphaQubit: an AI-based system that can more accurately identify errors inside quantum computers.


AlphaQubit is a neural-network-based decoder drawing on Transformers, a deep-learning architecture developed at Google that underpins many of today’s large language models. Using the consistency checks measured during error correction as an input, its task is to correctly predict whether the logical qubit, when measured at the end of the experiment, has flipped from how it was prepared.
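The following sketch illustrates the shape of that decoding task: a time series of syndrome (consistency-check) bits goes in, and a probability that the logical qubit flipped comes out. The architecture, dimensions, and names here are illustrative stand-ins written in PyTorch, not DeepMind's actual model.

```python
# Illustrative transformer decoder for syndrome data (not AlphaQubit itself):
# input  (batch, rounds, n_checks) of 0/1 consistency-check outcomes,
# output (batch,) probability that the logical qubit flipped.
import torch
import torch.nn as nn

class SyndromeDecoder(nn.Module):
    def __init__(self, n_checks: int = 24, d_model: int = 64, n_layers: int = 2):
        super().__init__()
        self.embed = nn.Linear(n_checks, d_model)   # one round -> one token
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, 1)           # logit for "logical flip"

    def forward(self, syndromes: torch.Tensor) -> torch.Tensor:
        x = self.encoder(self.embed(syndromes.float()))
        return torch.sigmoid(self.head(x.mean(dim=1))).squeeze(-1)

model = SyndromeDecoder()
shots = torch.randint(0, 2, (8, 25, 24))   # 8 shots, 25 rounds, 24 checks
print(model(shots).shape)                  # torch.Size([8])
```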

We began by training our model to decode the data from a set of 49 qubits inside a Sycamore quantum processor, the central computational unit of the quantum computer. To teach AlphaQubit the general decoding problem, we used a quantum simulator to generate hundreds of millions of examples across a variety of settings and error levels. Then we finetuned AlphaQubit for a specific decoding task by giving it thousands of experimental samples from a particular Sycamore processor.

When tested on new Sycamore data, AlphaQubit set a new standard for accuracy when compared with the previous leading decoders. In the largest Sycamore experiments, AlphaQubit makes 6% fewer errors than tensor network methods, which are highly accurate but impractically slow. AlphaQubit also makes 30% fewer errors than correlated matching, an accurate decoder that is fast enough to scale.