
Atoms can absorb and reemit light—this is an everyday phenomenon. In most cases, however, an atom emits a light particle in all possible directions—recapturing this photon is, therefore, quite hard.

A research team from TU Wien in Vienna (Austria) has now demonstrated theoretically that, using a special lens, a photon emitted by one atom can be guaranteed to be reabsorbed by a second atom. This second atom does not merely absorb the photon, though; it directly returns it to the first atom. That way, the two atoms pass the photon back and forth with pinpoint accuracy again and again, just like in ping-pong.

An experiment outlined by a UCL (University College London)-led team of scientists from the UK and India could test whether relatively large masses have a quantum nature, resolving the question of whether quantum mechanical description works at a much larger scale than that of particles and atoms.

Quantum theory is typically seen as describing nature at the tiniest scales, and quantum effects have not been observed in a laboratory for objects more massive than about a quintillionth of a gram, or more precisely 10^−20 g.

The new experiment, described in a paper published in Physical Review Letters and involving researchers at UCL, the University of Southampton, and the Bose Institute in Kolkata, India, could, in principle, test the quantumness of an object regardless of its mass or energy.

In the vast realm of scientific discovery and technological advancement, there exists a hidden frontier that holds the key to unlocking the mysteries of the universe. This frontier is Pico Technology, a domain of measurement and manipulation at the atomic and subatomic levels. The rise of Pico Technology represents a seismic shift in our understanding of precision measurement and its applications across diverse fields, from biology to quantum computing. Sitting at the intersection of precision measurement and quantum effects, Pico Technology represents the forefront of scientific and technological progress. Working at the picoscale offers unprecedented precision and reactivity, capabilities that are reshaping fields ranging from medicine to green energy.

Unlocking the Picoscale World

At the heart of Pico Technology lies the ability to work at the picoscale, a dimension measured in picometers, where one picometer equals 1 × 10^−12 meters. The prefix ‘pico’ is derived from the Spanish word ‘pico’, meaning ‘beak’ or ‘small quantity’. What sets Pico Technology apart is not just its capacity to probe ever smaller scales, but its unique ability to harness the inherent physical, chemical, mechanical, and optical properties of materials that naturally manifest at the picoscale.
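To put the picoscale in context, a short calculation (a sketch using the CODATA value for the Bohr radius; the variable names are my own) compares one picometer with a familiar atomic length scale:

```python
# Comparing the picometer with atomic length scales.
# Assumes the CODATA value of the Bohr radius; names are illustrative.
PICOMETER = 1e-12  # meters

bohr_radius_m = 5.29177e-11  # CODATA Bohr radius, ~52.9 pm

bohr_radius_pm = bohr_radius_m / PICOMETER
hydrogen_diameter_pm = 2 * bohr_radius_pm

print(f"Bohr radius: {bohr_radius_pm:.1f} pm")
print(f"Hydrogen atom diameter: {hydrogen_diameter_pm:.1f} pm")
```

A single picometer is thus roughly a hundred times smaller than a hydrogen atom, which is what makes picoscale measurement sensitive to structure well below atomic dimensions.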

A study led by the University of Oxford has used the power of machine learning to overcome a key challenge affecting quantum devices. For the first time, the findings reveal a way to close the ‘reality gap’: the difference between predicted and observed behavior from quantum devices. The results have been published in Physical Review X.

Quantum computing could supercharge a wealth of applications, from climate modeling and financial forecasting to drug discovery and artificial intelligence. But this will require effective ways to scale and combine individual quantum devices (also called qubits). A major barrier to this is inherent variability, in which even apparently identical units exhibit different behaviors.


Princeton physicists have uncovered a groundbreaking quantum phase transition in superconductivity, challenging established theories and highlighting the need for new approaches to understanding quantum mechanics in solids.

Princeton physicists have discovered an abrupt change in quantum behavior while experimenting with a three-atom-thick insulator.

An atom is the smallest component of an element. It is made up of protons and neutrons within the nucleus, and electrons circling the nucleus.

Heat is the enemy of quantum coherence. By arranging light-absorbing molecules in an ordered fashion, physicists in Japan have maintained the critical, yet-to-be-determined state of electron spins for 100 nanoseconds near room temperature.

The innovation could have a profound impact on progress in developing quantum technology that doesn’t rely on the bulky and expensive cooling equipment currently needed to keep particles in a so-called ‘coherent’ form.

Unlike the objects of our day-to-day living, which have settled qualities like color, position, speed, and rotation, quantum descriptions of objects involve something less definite. Until a measurement locks their characteristics in place, we have to treat objects as if they were smeared over a wide space and spinning in multiple directions at once, yet to adopt a single definite value.

‘This is the first room-temperature quantum coherence of entangled quintets.’

A team of researchers from Kyushu University’s Faculty of Engineering, led by Associate Professor Nobuhiro Yanai, has shattered barriers by achieving quantum coherence at room temperature.


Researchers show room-temperature quantum coherence by observing the entangled quintet state with four electron spins in molecular systems.

This article introduces new approaches for developing early fault-tolerant quantum computing (early-FTQC), such as improving the efficiency of quantum computation on encoded data, new circuit-efficiency techniques for quantum algorithms, and combining error-mitigation techniques with fault-tolerant quantum computation.

Yuuki Tokunaga, NTT Computer and Data Science Laboratories.

Noisy intermediate-scale quantum (NISQ) computers, which do not execute quantum error correction, require no encoding overhead. However, because errors inevitably accumulate, there is a limit to the size of computation they can carry out. Fault-tolerant quantum computers (FTQCs) carry out computation on encoded qubits, so they incur encoding overhead and require quantum computers of at least a certain size. The gap between NISQ computers and FTQCs due to this overhead is shown in Fig. 1. Is this gap unavoidable? Until recently, many researchers would have said yes. However, our team has recently demonstrated a new, unprecedented method to overcome this gap. The same motivation has also driven a research trend that started at around the same time worldwide. These efforts, collectively called early fault-tolerant quantum computing (early-FTQC), have become a worldwide research movement.
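The NISQ size limit described above can be sketched with a toy calculation (the error rate and gate counts are illustrative assumptions of mine, not figures from the article): if each gate fails independently with some small probability, the chance that an entire circuit runs error-free decays exponentially with gate count.

```python
# Toy model of error accumulation on a NISQ device.
# The 0.1% gate error rate is an illustrative assumption, not from the article.
def nisq_fidelity(gate_count: int, gate_error: float = 1e-3) -> float:
    """Probability that no gate fails, assuming independent gate errors."""
    return (1.0 - gate_error) ** gate_count

for gates in (100, 1_000, 10_000):
    print(f"{gates:>6} gates -> success probability {nisq_fidelity(gates):.5f}")
```

Under these assumptions, a few hundred gates still succeed most of the time, but by roughly ten thousand gates the output is almost certainly corrupted. This exponential decay is what FTQCs trade away: they accept the overhead of many physical qubits per logical qubit in exchange for a suppressed logical error rate, and early-FTQC aims to soften that trade-off.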

Japanese chip maker Rohm is collaborating with venture company Quanmatic to improve electrical die sorting (EDS) in what appears to be the first use of quantum computing to optimize a commercial-scale manufacturing process on semiconductor production lines.

After a year of effort, the two companies have announced that full-scale implementation of the probe-test technology can begin in April at Rohm’s factories in Japan and overseas. Testing and validation of the prototype indicate that EDS performance can be improved by several percentage points, significantly improving productivity and profitability.

Headquartered in Kyoto, Rohm produces integrated circuits (ICs), discrete semiconductors and other electronic components. It is one of the world’s leading suppliers of silicon carbide wafers and power management devices used in electric vehicles (EVs) and various industrial applications.