Highly charged heavy ions offer a well-suited testing ground for investigating quantum electrodynamics (QED), the most precisely tested theory in physics, which describes all electric and magnetic interactions of light and matter. A crucial property of the electron within QED is the so-called g factor, which precisely characterizes how the particle behaves in a magnetic field.
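For context, the g factor links the electron's magnetic moment to its spin; the standard textbook relations below (not taken from the article) show where QED and the binding to the nucleus enter:

    \vec{\mu} = -\,g\,\mu_B\,\vec{S}/\hbar
    g_{\mathrm{Dirac}} = 2, \qquad
    g_{\mathrm{free}} = 2\Bigl(1 + \tfrac{\alpha}{2\pi} + \cdots\Bigr), \qquad
    g_{\mathrm{bound}}(Z) = \tfrac{2}{3}\Bigl(1 + 2\sqrt{1-(Z\alpha)^2}\Bigr)

Here \mu_B is the Bohr magneton, \alpha the fine-structure constant, and the last expression is the leading (Breit) value for an electron bound in a hydrogen-like ion; measured deviations from these values are what test QED.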

Recently, the ALPHATRAP group led by Sven Sturm in the division of Klaus Blaum at the Max-Planck-Institut für Kernphysik (MPIK) in Heidelberg measured the g factor of hydrogen-like tin ions to a precision of 0.5 parts per billion, which is like measuring the distance from Cologne to Frankfurt to within the thickness of a human hair. This is a stringent test of QED for the simplest atomic system: structured just like ordinary hydrogen, but with the electron experiencing a much stronger electric field due to the 50 protons of the tin nucleus.
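As a quick sanity check on that analogy (both distances are rough assumptions, in Python):

    # Rough check of the precision analogy; the distances are assumed values
    cologne_frankfurt_m = 190e3   # Cologne to Frankfurt, roughly 190 km
    hair_thickness_m = 1e-4       # human hair, roughly 0.1 mm
    print(hair_thickness_m / cologne_frankfurt_m)   # ~5e-10, i.e. ~0.5 parts per billion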

In a new study published in Physical Review Letters, researchers have now tackled highly charged boron-like tin ions, which retain only five electrons. The goal is to study inter-electronic effects in the boron-like configuration. So far, the only boron-like g factor measured with high precision is that of argon ions, with proton number Z = 18. However, unlike the electron, the nucleus is not a point charge, and its extended charge distribution leads to finite nuclear size corrections, another challenge for precision experiments.

Georgia Tech researchers recently proposed a method for generating quantum entanglement between photons, a potential breakthrough for the future of photonics-based quantum computing.

“Our results point to the possibility of building quantum computers using light by taking advantage of this entanglement,” said Chandra Raman, a professor in the School of Physics. The research is published in the journal Physical Review Letters.

Quantum computers have the potential to outperform their conventional counterparts at certain tasks, becoming the fastest programmable machines in existence for those problems. Entanglement is the key resource for building these quantum computers.
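As a minimal illustration of what makes entanglement so useful (a generic numpy sketch, not the Georgia Tech scheme): for a maximally entangled photon pair, neither photon carries any information on its own.

    import numpy as np

    # Bell state |Phi+> = (|00> + |11>) / sqrt(2), a maximally entangled pair
    phi_plus = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
    rho = np.outer(phi_plus, phi_plus)   # two-qubit density matrix

    # Partial trace over the second photon gives the state of the first alone
    rho_A = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)

    # rho_A is maximally mixed ([[0.5, 0], [0, 0.5]]): all information lives
    # in the correlations between the photons, none in either one alone.
    print(rho_A)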

On March 24, at the annual Rencontres de Moriond conference taking place in La Thuile, Italy, the LHCb collaboration at CERN reported a new milestone in our understanding of the subtle yet profound differences between matter and antimatter.

In its analysis of large quantities of data produced by the Large Hadron Collider (LHC), the international team found overwhelming evidence that particles known as baryons, such as the protons and neutrons that make up atomic nuclei, are subject to a mirror-like asymmetry in nature’s fundamental laws that causes matter and antimatter to behave differently.
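Such differences are conventionally quantified by a CP asymmetry; in generic notation (a standard definition, not specific to this analysis):

    A_{CP} = \frac{\Gamma(B \to f) - \Gamma(\bar{B} \to \bar{f})}{\Gamma(B \to f) + \Gamma(\bar{B} \to \bar{f})}

where \Gamma(B \to f) is the rate of a baryon decay to a final state f and \Gamma(\bar{B} \to \bar{f}) the rate of the matching antibaryon decay; any statistically significant A_{CP} \neq 0 signals CP violation.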

The discovery provides new ways to address why the fundamental particles that make up matter fall into the neat patterns described by the Standard Model of particle physics, and to explore why matter apparently prevailed over antimatter after the Big Bang. The paper is available on the arXiv preprint server.

Similar to humans going on journeys of self-discovery, quantum computers are also capable of deepening their understanding of their own foundations.

Researchers from Tohoku University and St. Paul’s School, London, have developed a method that allows quantum computers to analyze and protect quantum entanglement, a fundamental underpinning of quantum computing. These findings will advance our understanding of quantum entanglement and quantum technologies.

The study was published in Physical Review Letters on March 4, 2025.
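Whatever the specifics of the published protocol, one generic way a quantum computer can certify entanglement is an entanglement witness, an observable whose negative expectation value proves a state is entangled; a minimal numpy sketch (an illustrative stand-in, not the authors’ method):

    import numpy as np

    # Witness for |Phi+>-type entanglement: Tr(W @ rho) < 0 certifies entanglement
    phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)   # Bell state |Phi+>
    W = 0.5 * np.eye(4) - np.outer(phi, phi)

    rho_entangled = np.outer(phi, phi)   # pure Bell state
    rho_separable = np.eye(4) / 4        # maximally mixed, hence separable

    print(np.trace(W @ rho_entangled))   # -0.5  -> entanglement detected
    print(np.trace(W @ rho_separable))   #  0.25 -> no detection, as expected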

At hypersonic speeds, complex flow features such as boundary layers and shock waves arise as gases interact with the surface of a vehicle. Researchers in the Department of Aerospace Engineering at The Grainger College of Engineering, University of Illinois Urbana-Champaign, observed new disturbances in simulations conducted, for the first time, fully in 3D.

The study, “Loss of axial symmetry in hypersonic flows over conical shapes,” is published in Physical Review Fluids.

Fully 3D simulations require a great deal of processing power, making the work expensive to compute. Two things made it possible for Deborah Levin and her Ph.D. student Irmak Taylan Karpuzcu to conduct the research: time on Frontera, the leadership-class computing system at the Texas Advanced Computing Center, and software developed in previous years by several of Levin’s former graduate students.

A new solid-state laser produces 193-nm light for precision chipmaking and even creates vortex beams with orbital angular momentum – a first that could transform quantum tech and manufacturing.

Deep ultraviolet (DUV) lasers, which emit high-energy light at very short wavelengths, play a vital role in areas like semiconductor manufacturing, high-resolution spectroscopy, precision material processing, and quantum technology. Compared to traditional excimer or gas-discharge lasers, solid-state DUV lasers offer better coherence and lower power consumption, making it possible to build smaller, more efficient systems.
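For the vortex beams mentioned above, the defining ingredient is a helical phase front; a small numpy sketch of this textbook structure (not the paper’s generation scheme):

    import numpy as np

    # A vortex beam carries the phase factor exp(i * ell * phi); each photon
    # then carries ell * hbar of orbital angular momentum. ell = 1 is an
    # assumed example value (the topological charge).
    ell = 1
    x = np.linspace(-1.0, 1.0, 256)
    X, Y = np.meshgrid(x, x)
    r, phi = np.hypot(X, Y), np.arctan2(Y, X)

    field = r * np.exp(-r**2) * np.exp(1j * ell * phi)   # simple ring-shaped vortex mode
    intensity = np.abs(field) ** 2
    print(intensity[128, 128])   # ~0: the characteristic dark core on the beam axis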

Breakthrough in Solid-State Laser Development

Changes in the heart might mean more than just cardiovascular risk – they could also signal early shifts in brain health.

A large meta-analysis found that even subtle heart problems, like issues with how the heart pumps or relaxes, are linked to smaller brain volumes, particularly in areas related to memory.

Heart Issues May Signal Early Dementia Risk

Additionally, the quantum computing cloud service offered by The University of Osaka has begun integrating OQTOPUS into its operations, and Fujitsu Limited will make it available to research partners using its quantum computers in the second half of 2025.

Moving forward, the research team will drive the advancement of quantum computing through the continuous expansion of OQTOPUS’s capabilities and the development of a thriving global community. Dr. Keisuke Fujii at the Center for Quantum Information and Quantum Biology (QIQB) of The University of Osaka notes, “This will facilitate the standardization of various quantum software and systems while driving the creation of innovative quantum applications.”

The research was funded by the Japan Science and Technology Agency and the National Institutes for Quantum Science and Technology.

The Mpemba effect, whereby hotter systems can cool faster than cooler ones under identical conditions, was first noted by Aristotle over 2,000 years ago. It was rediscovered in 1963 by Tanzanian student Erasto Mpemba, who observed the phenomenon while making ice cream during a school cooking class. Mpemba later co-authored a scientific paper with British physicist Denis Osborne, documenting the effect in water.

Since their work, researchers have found that the Mpemba effect is not limited to water or other simple liquids. It has been observed in a wide range of physical systems, including microscopic ones. However, a major challenge remains: detecting the Mpemba effect depends critically on the choice of distance measure used to track how far a system is from equilibrium.

Because there are infinitely many possible distance measures, an effect seen using one measure may not appear within any finite time using another. Traditional approaches often evaluate relaxation speed, the rate at which a system returns to equilibrium after a temperature change, using a single, monotonic measure. But this can yield inconsistent or misleading results.
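A small numerical sketch of that ambiguity (a toy Markov relaxation with assumed values, not taken from the study): two initial distributions relax toward the same equilibrium, and their distance to it is tracked with two different measures, which need not agree on which trajectory is ahead.

    import numpy as np

    def total_variation(p, q):
        return 0.5 * np.abs(p - q).sum()

    def kl_divergence(p, q):
        mask = p > 0
        return np.sum(p[mask] * np.log(p[mask] / q[mask]))

    # Toy 3-state dynamics: a column-stochastic transition matrix (values assumed)
    T = np.array([[0.80, 0.15, 0.10],
                  [0.15, 0.70, 0.20],
                  [0.05, 0.15, 0.70]])

    # Equilibrium distribution = eigenvector of T with eigenvalue 1
    vals, vecs = np.linalg.eig(T)
    pi = np.real(vecs[:, np.argmax(np.real(vals))])
    pi = pi / pi.sum()

    hot = np.array([0.05, 0.15, 0.80])    # "hotter" initial condition (assumed)
    cold = np.array([0.30, 0.40, 0.30])   # "cooler" initial condition (assumed)

    for step in range(15):
        hot, cold = T @ hot, T @ cold
        # The two measures may rank the trajectories differently at a given
        # step, which is exactly the measure-dependence described above.
        print(step,
              total_variation(hot, pi), total_variation(cold, pi),
              kl_divergence(hot, pi), kl_divergence(cold, pi))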

NASA’s Curiosity rover has unearthed the largest organic molecules ever detected on Mars—possible fragments of fatty acids—hinting at the tantalizing possibility that prebiotic chemistry on the Red Planet may have been more advanced than previously thought. Found in a sample from Gale Crater’s Ye