
Ultra-compact semiconductor could power next-gen AI and 6G chips

A research team, led by Professor Heein Yoon in the Department of Electrical Engineering at UNIST, has unveiled an ultra-small hybrid low-dropout regulator (LDO) that promises to advance power management in advanced semiconductor devices. This innovative chip not only stabilizes voltage more effectively, but also filters out noise—all while taking up less space—opening new doors for high-performance system-on-chips (SoCs) used in AI, 6G communications, and beyond.

The new LDO combines analog and digital circuit strengths in a hybrid design, ensuring stable power delivery even during sudden changes in current demand—like when launching a game on your smartphone—and effectively blocking unwanted noise from the power supply.

What sets this development apart is its use of a cutting-edge digital-to-analog transfer (D2A-TF) method and a local ground generator (LGG), which work together to deliver exceptional voltage stability and noise suppression. In tests, the regulator kept voltage ripple to just 54 millivolts during rapid 99 mA current swings and restored the output to its proper level in just 667 nanoseconds. It also achieved a power supply rejection ratio (PSRR) of −53.7 dB at 10 kHz under a 100 mA load, meaning it can filter out nearly all supply noise at that frequency.
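
As a rough, back-of-the-envelope illustration of what those figures mean (standard decibel arithmetic, not an analysis from the paper), a PSRR of −53.7 dB corresponds to attenuating supply noise by a factor of roughly 480 at 10 kHz:

```python
# Illustrative interpretation of the reported figures (not taken from the paper itself)
psrr_db = -53.7        # power supply rejection ratio at 10 kHz, 100 mA load
ripple_mv = 54         # reported output ripple during the fast load step
load_step_ma = 99      # reported size of the load-current swing
settle_ns = 667        # reported time to recover the regulated output voltage

# -53.7 dB of rejection means supply noise is attenuated by roughly 480x:
attenuation = 10 ** (abs(psrr_db) / 20)
print(f"Supply-noise attenuation at 10 kHz: ~{attenuation:.0f}x")

# So a 100 mV disturbance on the input rail leaves only about 0.2 mV at the output:
print(f"100 mV of input noise -> ~{100 / attenuation:.2f} mV at the output")
```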

Next-gen coil interface for non-contact peripheral nerve stimulation could improve treatment for chronic pain

A research team has successfully developed a next-generation coil interface capable of efficiently and safely stimulating peripheral nerves. This breakthrough is significant in that it greatly enhances the efficiency and feasibility of non-contact nerve stimulation technology, enabling stimulation through magnetic fields without the need for direct contact between electrodes and nerves.

The findings are published in the journal IEEE Transactions on Neural Systems and Rehabilitation Engineering. The team was led by Professor Sanghoon Lee from the Department of Robotics and Mechatronics Engineering at DGIST.

In recent years, there has been a growing demand for non-invasive (non-surgical, non-contact) approaches to treat peripheral nerve dysfunctions such as chronic pain and facial nerve paralysis.

AI nutrition study finds ‘five a day, every day’ may keep the doctor away

In a new study using AI and machine learning, EPFL researchers have found that it’s not only what we eat, but how consistently we eat it that plays a crucial role in gut health.

The gut microbiota is the community of microorganisms, including bacteria, viruses, fungi and other microbes, that lives in our digestive systems—some of these microbes are helpful and others can be harmful.

Many previous studies have shown that what we eat has an impact on our gut microbiota. Healthy diets rich in fruit, vegetables, fiber and nuts are strongly associated with increased microbial diversity and better gut health.
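
In studies like this, “microbial diversity” is commonly quantified with an index such as Shannon diversity. The short sketch below is a generic illustration of that metric using made-up abundance counts; it is not the EPFL team’s analysis pipeline:

```python
import math

def shannon_diversity(counts):
    """Shannon diversity H = -sum(p_i * ln p_i) over taxon abundance counts."""
    total = sum(counts)
    props = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in props)

# Made-up taxon abundance profiles for two hypothetical gut samples
even_community = [25, 25, 25, 25]     # several taxa, evenly represented
skewed_community = [94, 3, 2, 1]      # dominated by a single taxon

print(round(shannon_diversity(even_community), 2))    # ~1.39 (more diverse)
print(round(shannon_diversity(skewed_community), 2))  # ~0.29 (less diverse)
```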

Advancing human leukocyte antigen-based cancer immunotherapy: from personalized to broad-spectrum strategies for genetically heterogeneous populations

Human leukocyte antigen (HLA)-based immunotherapeutics, such as tebentafusp-tebn and afamitresgene autoleucel, have expanded the treatment options for HLA-A*02-positive patients with rare solid tumors such as uveal melanoma, synovial sarcoma, and myxoid liposarcoma. Unfortunately, many patients of European, Latino/Hispanic, African, Asian, and Native American ancestry who carry non-HLA-A*02 alleles remain largely ineligible for most current HLA-based immunotherapies. This comprehensive review introduces HLA allotype-driven cancer health disparities (HACHD) as an emerging research focus, and examines how past and current HLA-targeted immunotherapeutic strategies may have inadvertently contributed to cancer health disparities. We discuss several preclinical and clinical strategies, including the incorporation of artificial intelligence (AI), to address HACHD.

New AI model for drug design brings more physics to bear in predictions

When machine learning is used to suggest new potential scientific insights or directions, algorithms sometimes offer solutions that are not physically sound.

Take, for example, AlphaFold, the AI system that predicts the complex ways in which amino acid chains will fold into 3D protein structures. The system sometimes suggests “unphysical” folds—configurations that are implausible under the laws of physics—especially when asked to predict the folds for chains that are significantly different from those in its training data.

To limit this type of unphysical result in the realm of drug design, Anima Anandkumar, Bren Professor of Computing and Mathematical Sciences at Caltech, and her colleagues have introduced a new machine learning model called NucleusDiff, which incorporates a simple physical idea into its training, greatly improving the algorithm’s performance.
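
The article does not spell out the exact physical constraint NucleusDiff builds in, but the general idea of folding a simple physical rule into training can be sketched as a penalty on implausible geometry, for example atoms generated closer together than some minimum separation. Everything below (the function, threshold, and coordinates) is an illustrative assumption, not the model’s actual loss:

```python
import numpy as np

def overlap_penalty(coords, min_dist=1.2):
    """Toy physics-style penalty: charge for any pair of generated atoms placed
    closer together than a minimum separation (angstroms).
    Illustrative only -- not NucleusDiff's actual constraint."""
    diffs = coords[:, None, :] - coords[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(axis=-1))
    i, j = np.triu_indices(len(coords), k=1)          # unique atom pairs
    violation = np.clip(min_dist - dists[i, j], 0.0, None)
    return float((violation ** 2).sum())

# Hypothetical coordinates for three atoms of a generated ligand
atoms = np.array([[0.0, 0.0, 0.0],
                  [0.5, 0.0, 0.0],    # unphysically close to the first atom
                  [3.0, 1.0, 0.0]])
print(overlap_penalty(atoms))   # > 0, so training would be steered away from this pose
```

A term like this can be added to a generative model’s training objective so that geometries violating the physical rule are penalized alongside the usual reconstruction or diffusion loss.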

Chemical networks can mimic nervous systems to power movement in soft materials

What if a soft material could move on its own, guided not by electronics or motors, but by the kind of rudimentary chemical signaling that powers the simplest organisms? Researchers at the University of Pittsburgh Swanson School of Engineering have modeled just that—a synthetic system that, on its own, directly transforms chemical reactions into mechanical motion, without the need for the complex biochemical machinery present in our bodies.

Some of the simplest organisms, such as jellyfish, do not have a centralized brain or nervous system. Instead, they have a “nerve net,” which consists of dispersed nerve cells interconnected by active junctions that emit and receive signals. Even without a central “processor,” the chemical signals spontaneously travel through the net and trigger the autonomous motion needed for the organism’s survival.

In a study published in PNAS Nexus, Oleg E. Shklyaev, research assistant, and Anna C. Balazs, Distinguished Professor of Chemical and Petroleum Engineering and the John A. Swanson Chair of Engineering, have developed computer simulations to design a material with a “nerve net” that links chemical and mechanical networks in a way that mimics how the earliest and simplest living systems coordinate motion.
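
The authors’ actual simulations couple catalytic reactions, fluid flow, and deformable structures; the toy script below only illustrates the bare idea of a chemical signal spreading through space and setting off a local mechanical response. All parameters and the coupling rule are assumptions made for illustration, not their model:

```python
import numpy as np

# Minimal 1D toy: a chemical pulse released at one spot diffuses along a strip,
# and each segment "contracts" in proportion to the local concentration.
n, dx, dt, D = 100, 1.0, 0.1, 1.0
c = np.zeros(n)
c[5] = 1.0                                        # chemical signal released near one end

for _ in range(500):
    lap = np.roll(c, 1) - 2 * c + np.roll(c, -1)  # discrete Laplacian (periodic strip)
    c += dt * D * lap / dx**2                     # diffusion step

strain = 0.2 * c / c.max()                        # mechanical response tied to concentration
activated = np.flatnonzero(strain > 0.02)         # segments with appreciable contraction
print(f"mechanical response has spread across {activated.size} segments")
```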

Optical system achieves terabit-per-second capacity and integrates quantum cryptography for long-term security

The artificial intelligence (AI) boom has created unprecedented demand for data traffic. But the infrastructure needed to support it faces mounting challenges. AI data centers must deliver faster, more reliable communication than ever before, while also confronting their soaring electricity use and a looming quantum security threat, which could one day break today’s encryption methods.

To address these challenges, a recent study published in Advanced Photonics proposes a quantum-secured architecture that requires minimal digital signal processing (DSP) and meets the stringent requirements of AI-driven data center optical interconnect (AI–DCI) scenarios. The system enables data to move at terabit-per-second speeds while defending against future quantum threats.

“Our work paves the way for the next generation of secure, scalable, and cost-efficient optical interconnects, protecting AI-driven data centers against quantum security threats while meeting the high demands of modern data-driven applications,” the researchers state in their paper.

How a human ‘jumping gene’ targets structured DNA to reshape the genome

Long interspersed nuclear element-1 (LINE-1 or L1) is the only active, self-copying genetic element in the human genome, comprising about 17% of it. It is commonly called a “jumping gene” or “retrotransposon” because it can “retrotranspose” (move) from one genomic location to another.

Researchers from the Institute of Biophysics of the Chinese Academy of Sciences have now unveiled the molecular mechanisms that underlie L1’s retrotransposition and integration into genomic DNA. Their study was published in Science on October 9.

L1 is the only autonomously active retrotransposon in the human genome and serves as the primary vehicle for the mobilization of most other retrotransposons. Its retrotransposition process is mediated by the reverse transcriptase ORF2p through a mechanism known as target-primed reverse transcription (TPRT). Until now, the manner in which ORF2p recognizes DNA targets and mediates integration had remained unclear.
