
Computational trick enables better understanding of exotic state of matter

It can be found inside gas giants such as Jupiter and is briefly created during meteorite impacts or in laser fusion experiments: warm dense matter. This exotic state of matter combines features of solid, liquid and gaseous phases. Until now, simulating warm dense matter accurately has been considered a major challenge.

An international team led by researchers from the Center for Advanced Systems Understanding (CASUS) at the Helmholtz-Zentrum Dresden-Rossendorf (HZDR) in Germany and Lawrence Livermore National Laboratory (LLNL) has succeeded in describing this state of matter much more accurately than before using a new computational method. The approach could help advance the synthesis of new high-tech materials.

The team presents its results in the journal Nature Communications.

A new approach to probing Landauer’s principle in the quantum many-body regime

Landauer’s principle is a thermodynamics concept, also relevant in information theory, which states that erasing one bit of information from an information-processing system results in the dissipation of at least a specific amount of energy (namely k_B T ln 2). This principle has so far been considered primarily in the context of classical computers and information-processing systems.
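The bound itself is easy to evaluate numerically. A minimal illustrative sketch (the Boltzmann constant is the standard CODATA value; the 300 K temperature is simply a room-temperature example, not a figure from the study):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact CODATA value)

def landauer_bound(temperature_kelvin: float) -> float:
    """Minimum energy in joules dissipated when erasing one bit at temperature T."""
    return K_B * temperature_kelvin * math.log(2)

# At room temperature (300 K) the bound is tiny, about 2.87e-21 J per bit.
print(f"Landauer bound at 300 K: {landauer_bound(300.0):.3e} J per bit")
```

The smallness of this number is why the bound is so hard to probe experimentally, and why extending it to interacting quantum many-body systems is a delicate task.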

Yet researchers at TU Vienna, the Freie Universität Berlin, the University of British Columbia, the University of Crete and the Università di Pavia recently extended Landauer’s principle to quantum many-body systems, systems made up of many interacting particles.

Their paper, published in Nature Physics, introduces a viable approach to experimentally probe this crucial principle in the quantum regime and to test predictions rooted in quantum thermodynamics.

New hybrid quantum–classical computing approach used to study chemical systems

Caltech professor of chemistry Sandeep Sharma and colleagues from IBM and the RIKEN Center for Computational Science in Japan are giving us a glimpse of the future of computing. The team has used quantum computing in combination with classical distributed computing to attack a notably challenging problem in quantum chemistry: determining the electronic energy levels of a relatively complex molecule.

The work demonstrates the promise of such a quantum–classical hybrid approach for advancing not only quantum chemistry, but also fields such as materials science, nanotechnology, and drug discovery, where insight into the electronic fingerprint of materials can reveal how they will behave.

“We have shown that you can take classical algorithms that run on high-performance classical computers and combine them with quantum algorithms that run on quantum computers to get useful chemical results,” says Sharma, a new member of the Caltech faculty whose work focuses on developing algorithms to study quantum systems. “We call this quantum-centric supercomputing.”

Boson sampling finds first practical applications in quantum AI

For over a decade, researchers have considered boson sampling—a quantum computing protocol involving light particles—as a key milestone toward demonstrating the advantages of quantum methods over classical computing. But while previous experiments showed that boson sampling is hard to simulate with classical computers, practical uses have remained out of reach.
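The classical hardness behind boson sampling comes from the matrix permanent: the probability of each photon-detection pattern is proportional to the squared magnitude of the permanent of a submatrix of the optical network’s unitary, and no polynomial-time classical algorithm for the permanent is known. A minimal sketch of Ryser’s exponential-time formula (illustrative only, not the OIST implementation) shows why simulation cost explodes with photon number:

```python
def permanent(matrix):
    """Permanent of an n x n matrix via Ryser's formula.

    Runs in O(2^n * n^2) time; the lack of an efficient classical
    algorithm for this quantity underpins boson sampling's hardness.
    """
    n = len(matrix)
    total = 0.0
    # Sum over all non-empty subsets of columns, encoded as bitmasks.
    for mask in range(1, 1 << n):
        sign = (-1) ** bin(mask).count("1")
        prod = 1.0
        for row in matrix:
            prod *= sum(row[j] for j in range(n) if mask >> j & 1)
        total += sign * prod
    return (-1) ** n * total

# perm([[a, b], [c, d]]) = a*d + b*c (like the determinant, but without signs)
print(permanent([[1, 2], [3, 4]]))  # 10.0
```

Already at 30 photons the sum runs over more than a billion subsets, which is why even modest photonic experiments can outpace brute-force classical simulation.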

Now, in Optica Quantum, researchers from the Okinawa Institute of Science and Technology (OIST) present the first practical application of boson sampling for image recognition, a vital task across many fields, from forensic science to medical diagnostics. Their approach uses just three photons and a linear optical network, marking a significant step towards low-energy quantum AI systems.

Tiny collider experiment determines three electrons are enough for strong interactions between particles

Three electrons are enough to trigger strong interactions between particles. That is what was demonstrated by scientists from the CNRS and the Université Grenoble Alpes, in collaboration with teams from Germany and Latvia, in a study published in the journal Nature.

With the help of a tiny collider they built themselves, the researchers successfully “accelerated” up to five electrons at the same time toward a separation barrier, and counted the number of electrons present on each side.

The result: Three electrons are enough to show strong interactions between particles. With five electrons, the interactions become so intense that they imitate the behavior of hundreds of billions of electrons. Placed together, these three particles form an actual “heap.”

Physicists recreate forgotten experiment observing fusion

A Los Alamos collaboration has replicated an important but largely forgotten physics experiment: the first deuterium-tritium (DT) fusion observation. As described in the article published in Physical Review C, the reworking of the previously unheralded experiment confirmed the role of University of Michigan physicist Arthur Ruhlig, whose 1938 experiment and observation of deuterium-tritium fusion likely planted the seed for a physics process that informs national security work and nuclear energy research to this day.

“As we’ve uncovered, Ruhlig’s contribution was to hypothesize that DT fusion happens with very high probability when deuterium and tritium are brought sufficiently close together,” said Mark Chadwick, associate Laboratory director for Science, Computation and Theory at Los Alamos. “Replicating his experiment helped us interpret his work and better understand his role, and what proved to be his essentially correct conclusions. The course of nuclear fusion physics has borne out the profound consequences of Arthur Ruhlig’s clever insight.”

The DT fusion reaction is central to enabling fusion technologies, whether as part of the nation’s nuclear deterrence capabilities or in ongoing efforts to develop fusion for civilian energy. For instance, the deuterium-tritium reaction is at the center of efforts at the National Ignition Facility to harness fusion. Los Alamos physicists developed a theory about where the idea came from—Ruhlig—and then built an experiment that would confirm the import and accuracy of Ruhlig’s suggestion.

It’s elementary: Problem-solving AI approach tackles inverse problems used in nuclear physics and beyond

Solving life’s great mysteries often requires detective work, using observed outcomes to determine their cause. For instance, nuclear physicists at the U.S. Department of Energy’s Thomas Jefferson National Accelerator Facility analyze the aftermath of particle interactions to understand the structure of the atomic nucleus.

This type of subatomic sleuthing is known as the inverse problem. It is the opposite of a forward problem, where causes are used to calculate the effects. Inverse problems arise in many descriptions of physical phenomena, and often their solution is limited by the experimental data available.
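The forward/inverse distinction can be illustrated with the simplest possible model, a straight line: the forward problem maps known parameters to data, while the inverse problem recovers the parameters from observed data. This toy least-squares sketch is purely illustrative and is not the QuantOm Collaboration’s AI method:

```python
def forward(a, b, x):
    """Forward problem: known parameters (a, b) produce the observation y."""
    return a * x + b

def solve_inverse(xs, ys):
    """Inverse problem: recover (a, b) from observations via least squares."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var          # slope estimate
    b = mean_y - a * mean_x  # intercept estimate
    return a, b

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [forward(2.0, -1.0, x) for x in xs]  # synthetic data with true a=2, b=-1
a_hat, b_hat = solve_inverse(xs, ys)
print(a_hat, b_hat)  # recovers 2.0 -1.0
```

Real inverse problems in nuclear physics are far less forgiving: the forward model is expensive, the data are noisy and incomplete, and many parameter sets can fit equally well, which is what motivates large-scale AI approaches.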

That’s why scientists at Jefferson Lab and DOE’s Argonne National Laboratory, as part of the QuantOm Collaboration, have led the development of an artificial intelligence (AI) technique that can reliably solve these types of puzzles on supercomputers at large scales.

Sensitive yet tough photonic devices are now a reality

Engineers at the University of California San Diego have achieved a long-sought milestone in photonics: creating tiny optical devices that are both highly sensitive and durable—two qualities that have long been considered fundamentally incompatible.

This rare coexistence of sensitivity and durability could lead to a new generation of photonic devices that are not only precise and powerful but also much easier and cheaper to produce at scale. This could open the door to advanced sensors and technologies ranging from highly sensitive medical diagnostics and environmental sensors to more secure communication systems, all built into tiny, chip-scale devices.

Achieving both properties has been a challenge because devices that are sensitive enough to detect tiny changes in their environment are often fragile and prone to breaking down if even the smallest imperfections arise during manufacturing. This makes them expensive and difficult to produce at scale. Meanwhile, making such devices more rugged often means compromising their precision.

Smart amplifier cuts power consumption, paving way for more qubits and less decoherence

Quantum computers can solve extraordinarily complex problems, unlocking new possibilities in fields such as drug development, encryption, AI, and logistics. Now, researchers at Chalmers University of Technology in Sweden have developed a highly efficient amplifier that activates only when reading information from qubits. The study was published in the journal IEEE Transactions on Microwave Theory and Techniques.

Thanks to its smart design, the amplifier consumes just one-tenth of the power consumed by the best amplifiers available today. This reduces decoherence and lays the foundation for quantum computers with significantly more qubits and enhanced performance.

Bits, which are the building blocks of a conventional computer, can only ever have the value 1 or 0. By contrast, the common building blocks of a quantum computer, quantum bits or qubits, can exist in superposition states that combine the values 1 and 0 simultaneously, in any weighted combination.
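The “1 and 0 simultaneously” picture can be made concrete with two complex amplitudes. A minimal illustrative sketch (a generic textbook example, unrelated to the Chalmers amplifier hardware):

```python
import math

# A qubit state |psi> = alpha|0> + beta|1> is a pair of complex
# amplitudes with |alpha|^2 + |beta|^2 = 1. Measuring the qubit
# yields 0 with probability |alpha|^2 and 1 with probability |beta|^2.

def probabilities(alpha: complex, beta: complex):
    p0 = abs(alpha) ** 2
    p1 = abs(beta) ** 2
    assert math.isclose(p0 + p1, 1.0, abs_tol=1e-9), "state must be normalized"
    return p0, p1

# Equal superposition (|0> + |1>)/sqrt(2): 50/50 measurement outcomes.
inv_sqrt2 = 1 / math.sqrt(2)
p0, p1 = probabilities(inv_sqrt2, inv_sqrt2)
print(p0, p1)  # ~0.5 ~0.5 (up to floating-point rounding)
```

Reading out those measurement outcomes is exactly the step the new amplifier serves, which is why its power draw feeds directly into decoherence.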

Revolutionary “Material Maze” Could Prevent Bacterial Infections

Scientists used patterned plastic surfaces to trick bacteria into halting their own spread. These designs may prevent infections without the need for antimicrobial drugs. Scientists at the University of Nottingham have identified surface patterns that significantly reduce the ability of bacteria to spread.