
Alternative models for studying aging have employed unicellular organisms such as the budding yeast Saccharomyces cerevisiae. Studying replicative aging in yeast has revealed evolutionarily conserved enzymes and pathways that regulate aging[12-14] as well as potential interventions for mitigating its effects.[15] However, traditional yeast lifespan analysis, which relies on agar plates and manual separation of daughter cells, cannot track molecular markers, and yeast biology differs from that of humans in important respects.[16]

Animal models, including nematodes, flies, and rodents, play a vital role in aging research due to their shorter lifespans and genetic manipulability, which make them useful for mimicking human aging phenotypes.[17] These models have provided many insights into the fundamental mechanisms of aging. However, they come with several limitations when applied to human aging and age-related diseases. Key issues include limited generalizability due to species-specific differences in disease manifestation and physiological traits. For example, animal models often exhibit physiological differences, age at different rates, and may not fully replicate human conditions such as cardiovascular disease,[18] immune responses,[19] neurodegenerative diseases,[20] and drug metabolism.[21] Furthermore, in vivo models such as rodents and non-human primates suffer from high costs, low throughput, ethical concerns, and physiological differences compared to humans. The use of short-lived or accelerated-aging models, along with the absence of long-term longitudinal data, can further distort the natural aging process and hinder our understanding of aging in humans. Additionally, many animal models rely on inbred strains, which lack genetic diversity and may not fully represent evolutionary complexity.[22]

In recent years, microfluidics has emerged as a promising tool for studying aging, offering physiologically relevant 3D environments with high-throughput capabilities that surpass the limitations of traditional 2D cultures and bridge the gap between animal models and humans. As a multidisciplinary technology, microfluidics processes or manipulates small volumes of fluids (from picoliters to microliters) within channels measuring 10–1000 µm.[23] Traditional fabrication methods, such as photolithography and soft lithography, particularly using polydimethylsiloxane (PDMS), remain widely used due to their cost-effectiveness and biocompatibility. However, newer approaches, including 3D printing, injection molding, and laser micromachining, offer greater flexibility for rapid prototyping and the creation of complex architectures. Design considerations are equally critical and are tailored to the specific application, focusing on parameters such as channel geometry, fluid dynamics, material properties, and the integration of on-chip components such as valves, sensors, and actuators. A comprehensive overview of the design and fabrication of microphysiological systems is beyond the scope of this review; readers are referred to existing reviews for further detail.[24-26] Microfluidic devices offer numerous advantages, including reduced resource consumption and costs, shorter culture times, and improved simulation of pathophysiological conditions in 3D cellular systems compared to other model systems (Figure 1).[27] Microfluidic platforms have therefore been employed extensively across the life sciences, including developmental biology, disease modeling, drug discovery, and clinical applications,[28] positioning the technology as a significant avenue for aging research.
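The channel dimensions quoted above have a direct physical consequence: flow at the 10–1000 µm scale has a very low Reynolds number and is therefore laminar, which is what makes microfluidic transport so predictable. As a minimal illustrative sketch (not taken from the review; the flow speed and channel size below are assumed, typical values), the Reynolds number for water in a microchannel can be estimated as follows:

```python
# Illustrative back-of-envelope estimate (values assumed, not from the review):
# Reynolds number Re = rho * v * L / mu for water in a microfluidic channel.

def reynolds_number(velocity_m_s, length_m, density=1000.0, viscosity=1e-3):
    """Re for water at room temperature (density in kg/m^3, viscosity in Pa*s)."""
    return density * velocity_m_s * length_m / viscosity

# A typical microfluidic condition: 1 mm/s flow in a 100 µm channel.
re = reynolds_number(1e-3, 100e-6)
print(f"Re = {re:.3f}")  # far below the ~2000 threshold for turbulence
```

Because Re here is orders of magnitude below the transition to turbulence, mixing in such devices is diffusion-dominated, a property many on-chip designs exploit.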

Insights from a new study could help unlock the full potential of a developing form of smaller-scale wind power generation, researchers say.

Engineers from the University of Glasgow have used sophisticated computer simulations of bladeless wind turbines (BWTs) to identify for the first time how future generations of the technology could be built.

The team’s paper, titled “Performance analysis and geometric optimisation of bladeless wind turbines using wake oscillator model,” is published in Renewable Energy.

Recent technological advances have opened exciting new possibilities for the development of cutting-edge quantum devices, including quantum random access memory (QRAM) systems. These are memory architectures designed to be integrated into quantum computers; by leveraging a quantum effect known as coherent superposition, they can retrieve data from multiple 'locations' simultaneously.

A new study led by researchers at the Universities of Oxford, Cambridge and Manchester has achieved a major advance in quantum materials, developing a method to precisely engineer single quantum defects in diamond—an essential step toward scalable quantum technologies. The results have been published in the journal Nature Communications.

Using a new two-step fabrication method, the researchers demonstrated for the first time that it is possible to create and monitor, “as they switch on,” individual Group-IV quantum defects in diamond—tiny imperfections in the diamond that can store and transmit information using the exotic rules of quantum physics.

By carefully placing single tin atoms into synthetic diamond crystals and then using an ultrafast laser to activate them, the team achieved pinpoint control over where and how these quantum features appear. This level of precision is vital for making practical, large-scale quantum networks capable of ultra-secure communication and distributed quantum computing to tackle currently unsolvable problems.

As quantum computing develops, scientists are working to identify tasks for which quantum computers have a clear advantage over classical computers. So far, researchers have only pinpointed a handful of these problems, but in a new paper published in Physical Review Letters, scientists at Los Alamos National Laboratory have added one more problem to this very short list.

“One of the central questions that faces quantum computing is what classes of problems quantum computers can most efficiently solve but classical computers cannot,” says Marco Cerezo, the Los Alamos team’s lead scientist. “At the moment, this is the Holy Grail of quantum computing, because you can count on two hands such problems. In this paper, we’ve just added another.”

Quantum computing harnesses the unique laws of quantum physics, such as superposition, entanglement and interference, which allow for information processing capabilities beyond those of classical devices. When fully realized, quantum computing promises to make advancements in cryptography, simulations of quantum systems and data analysis, among many other fields. But before this can happen, researchers still need to develop the foundational science of quantum computing.

The Hong Kong University of Science and Technology (HKUST)-led research team has adopted gyromagnetic double-zero-index metamaterials (GDZIMs)—a new optical extreme-parameter material—and developed a new method to control light using GDZIMs. This discovery could revolutionize fields like optical communications, biomedical imaging, and nanotechnology, enabling advances in integrated photonic chips, high-fidelity optical communication, and quantum light sources.

The study published in Nature was co-led by Prof. Chan Che-Ting, Interim Director of the HKUST Jockey Club Institute for Advanced Study and Chair Professor in the Department of Physics, and Dr. Zhang Ruoyang, Visiting Scholar in the Department of Physics at HKUST.

For the past six years, Los Alamos National Laboratory has led the world in trying to understand one of the most frustrating barriers that faces variational quantum computing: the barren plateau.

“Imagine a landscape of peaks and valleys,” said Marco Cerezo, the Los Alamos team’s lead scientist. “When optimizing a variational, or parameterized, quantum algorithm, one needs to tune a series of knobs that control the solution quality and move you in the landscape. Here, a peak represents a bad solution and a valley represents a good solution. But when researchers develop algorithms, they sometimes find their model has stalled and can neither climb nor descend. It’s stuck in this space we call a barren plateau.”

For these quantum computing methods, barren plateaus can be mathematical dead ends, preventing their implementation in large-scale realistic problems. Scientists have spent a lot of time and resources developing quantum algorithms only to find that they sometimes inexplicably stall. Understanding when and why barren plateaus arise has been a problem that has taken the community years to solve.
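The landscape picture above can be made concrete with a toy cost function (my own illustration, not the Los Alamos team's model): take C(θ) = cos(θ₁)···cos(θₙ). The gradient with respect to any one parameter is a product of n sines and cosines, so its variance over random parameter settings shrinks as (1/2)ⁿ — with more parameters, almost the entire landscape becomes flat, which is the signature of a barren plateau:

```python
# Toy demonstration (illustrative only, not the paper's model) of gradients
# that vanish exponentially with system size, the hallmark of a barren plateau.
# Cost: C(theta) = prod_i cos(theta_i). The partial derivative w.r.t. theta_0
# is -sin(theta_0) * prod_{i>0} cos(theta_i), with variance (1/2)^n over
# uniformly random angles.

import math
import random

def grad_variance(n_params, n_samples=20000, seed=0):
    """Monte Carlo estimate of Var[dC/d theta_0] over random parameter settings."""
    rng = random.Random(seed)
    grads = []
    for _ in range(n_samples):
        thetas = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(n_params)]
        g = -math.sin(thetas[0])
        for t in thetas[1:]:
            g *= math.cos(t)
        grads.append(g)
    mean = sum(grads) / n_samples
    return sum((g - mean) ** 2 for g in grads) / n_samples

for n in (2, 4, 8, 16):
    # Variance roughly halves with each added parameter: the landscape flattens.
    print(n, grad_variance(n))
```

In a real variational quantum circuit the mechanism is analogous but arises from averaging over random circuit parameters; the practical consequence is the same: gradient signals become exponentially small, so the optimizer cannot tell uphill from downhill.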

Researchers have determined how to use magnons—collective vibrations of the magnetic spins of atoms—for next-generation information technologies, including quantum technologies with magnetic systems.

From the computer hard drives that store our data to the motors and engines that drive power plants, magnetism is central to many transformative technologies. Magnetic materials are expected to play an even larger role in new technologies on the horizon: the transmission and processing of quantum information and the development of quantum computers.

In new research, scientists at the U.S. Department of Energy’s (DOE) Argonne National Laboratory developed an approach to control the collective magnetic properties of atoms in real time and potentially deploy them for next-generation information technologies. This discovery could aid in developing future quantum computers, which can perform tasks that would be impossible using today’s computers, as well as “on-chip” technologies with magnetic systems embedded on semiconductor chips.