Scientists have built a “microwave brain” chip that processes information at radar-like speeds while sipping power. It could revolutionize how AI and communication devices operate, from smartwatches to satellites.
One of the key steps in developing new materials is property identification, which has long relied on massive amounts of experimental data and expensive equipment, limiting research efficiency. A KAIST research team has introduced a new technique that combines artificial intelligence with the physical laws that govern how materials deform and interact with energy. This approach allows for rapid exploration of new materials even under data-scarce conditions and provides a foundation for accelerating design and verification across multiple engineering fields, including materials, mechanics, energy, and electronics.
Professor Seunghwa Ryu’s research group in the Department of Mechanical Engineering, in collaboration with Professor Jae Hyuk Lim’s group at Kyung Hee University and Dr. Byungki Ryu at the Korea Electrotechnology Research Institute, proposed a new method that can accurately determine material properties with only limited data. The method uses physics-informed machine learning (PIML), which directly incorporates physical laws into the AI learning process.
In the first study, the researchers focused on hyperelastic materials, such as rubber. They presented a physics-informed neural network (PINN) method that can identify both the deformation behavior and the properties of a material using only a small amount of data obtained from a single experiment. Whereas previous approaches required large, complex datasets, this research demonstrated that material characteristics can be reliably reproduced even when data is scarce or noisy.
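The core idea behind physics-informed learning, augmenting a data-fitting loss with the governing physical law itself, can be sketched in miniature. The toy example below (an illustration, not the KAIST team's actual PINN) identifies the shear modulus of an incompressible neo-Hookean hyperelastic model from a handful of uniaxial stress-stretch points; the data values, initial guess, and learning rate are all assumptions made for the sketch.

```python
# Toy sketch of physics-informed parameter identification (not the
# authors' method): the "model" is the physical law itself, here the
# uniaxial Cauchy stress of an incompressible neo-Hookean solid,
#   sigma = mu * (lam**2 - 1/lam),
# and learning reduces to fitting the material parameter mu to a few
# noisy-or-sparse stress-stretch measurements.

def neo_hookean(mu, lam):
    """Uniaxial Cauchy stress for an incompressible neo-Hookean solid."""
    return mu * (lam**2 - 1.0 / lam)

# Hypothetical sparse "experiment": three stretch/stress pairs generated
# from an assumed true shear modulus of 1.0 (illustrative units).
stretches = [1.1, 1.5, 2.0]
mu_true = 1.0
stresses = [neo_hookean(mu_true, lam) for lam in stretches]

# Gradient descent on the physics-based least-squares loss
#   L(mu) = sum_i (sigma_i - mu * f(lam_i))^2,  f(lam) = lam**2 - 1/lam.
mu = 0.2   # deliberately poor initial guess
lr = 0.01  # learning rate (illustrative)
for _ in range(500):
    grad = sum(
        -2.0 * (s - neo_hookean(mu, lam)) * (lam**2 - 1.0 / lam)
        for lam, s in zip(stretches, stresses)
    )
    mu -= lr * grad

print(f"identified shear modulus: {mu:.4f} (true value {mu_true})")
```

In a full PINN, a neural network would instead predict the displacement field over the whole specimen, and the loss would penalize both the mismatch with measured data and the residual of the equilibrium equations evaluated by automatic differentiation; the toy above keeps only the essential ingredient, a loss built directly from the constitutive law.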
A research team led by Kyushu University has developed a new fabrication method for energy-efficient magnetic random-access memory (MRAM) using thulium iron garnet (TmIG), a material that has attracted global attention for its ability to enable high-speed, low-power information rewriting at room temperature. The team hopes their findings will lead to significant improvements in the speed and power efficiency of high-computing hardware, such as that used to power generative AI.
The work is published in npj Spintronics.
The rapid spread of generative AI has made the power demand from data centers a global issue, creating an urgent need to improve the energy efficiency of the hardware that runs the technology.
Diagnosing cancer at an early stage increases the chance of effective treatment in many tumour groups. Key approaches include screening patients who are at risk but have no symptoms, and rapidly and appropriately investigating those who do. Machine learning, whereby computers learn complex data patterns to make predictions, has the potential to revolutionise early cancer diagnosis. Here, we provide an overview of how such algorithms can assist doctors through analyses of routine health records, medical images, biopsy samples and blood tests to improve risk stratification and early diagnosis. Such tools will be increasingly utilised in the coming years.
Researchers have developed a way to predict how lung cancer cells will respond to different therapies, allowing people with the most common form of lung cancer to receive more effective individualized treatment.
The research, published Oct. 10 in Nature Genetics, was led by Thazin Aung, Ph.D., in the laboratory of Yale School of Medicine’s David Rimm, MD, Ph.D., in collaboration with scientists at the Frazer Institute at the University of Queensland. Researchers studied the tumors of 234 patients with non-small cell lung cancer (NSCLC) across three cohorts in Australia, the United States, and Europe.
“Using AI and spatial biology, we mapped NSCLC, cell-by-cell, to understand and predict its response to drug treatment,” Aung says. “This ‘Google Maps’ approach can pinpoint areas within tumors that are both responsive and resistant to therapies, which will be a game-changer for lung cancer treatment. Rather than having to use a trial-and-error approach, oncologists will now know which treatments are most likely to work with new precision medicine tools.”
In a monumental leap for healthcare innovation, China has opened the world’s first fully AI-powered hospital, staffed by 14 artificial intelligence “doctors” capable of diagnosing, treating, and managing up to 10,000 virtual patients per day.
This revolutionary facility, developed by Tsinghua University, is called the Smart Hospital of the Future — and it may represent the most advanced experiment in AI-driven medicine the world has ever seen.
Designed as a testbed for AI medical systems, the hospital blends robotics, machine learning, natural language processing, and big data analytics to simulate full-spectrum care at lightning speed — with zero fatigue, no paperwork errors, and real-time updates from global medical databases.
USC engineers have developed an optical system that routes light autonomously using thermodynamic principles. Rather than relying on switches, light organizes itself much like particles in a gas reaching equilibrium. The discovery could simplify and speed up optical communications and computing. It reimagines chaotic optical behavior as a tool for design rather than a limitation.