Cambridge researchers successfully use AI to discover effective breast cancer drug combinations among existing non-cancer medications.

IN A NUTSHELL 🔬 Scientists have created a swarm of tiny robots that function as a dynamic, living material. 🧬 Inspired by embryonic development, these robots can change shape, self-heal, and adapt like smart materials. 💡 Equipped with light sensors and magnets, the robots coordinate their movements to transition between solid and liquid states.
Since their discovery, the Dead Sea Scrolls, hugely important both historically and biblically, have transformed our understanding of Jewish and Christian origins. However, while the scrolls as a whole are generally dated from the third century BCE to the second century CE, individual manuscripts could not previously be securely dated.
Now, by combining radiocarbon dating, paleography, and artificial intelligence, an international team of researchers led by the University of Groningen has developed a date-prediction model, called Enoch, that provides much more accurate date estimates for individual manuscripts on empirical grounds.
Using this model, the researchers demonstrate that many Dead Sea Scrolls are older than previously thought. And for the first time, they establish that two biblical scroll fragments date from the time of their presumed authors. The results are presented in the journal PLOS ONE.
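To make the approach concrete, here is a minimal, hypothetical sketch of the kind of date-prediction model described above: a regressor trained on manuscripts that already have radiocarbon-calibrated dates, then applied to handwriting-derived features of undated fragments. The feature set, the choice of Bayesian ridge regression, and all of the data below are assumptions for illustration, not details of the Enoch model itself.

```python
# Hypothetical sketch of a radiocarbon-calibrated date-prediction model.
# Mock data stands in for real paleographic measurements.
import numpy as np
from sklearn.linear_model import BayesianRidge

rng = np.random.default_rng(0)

# Assumed training set: one feature vector of handwriting-shape measurements
# per manuscript (e.g. stroke curvature, letter proportions), paired with a
# radiocarbon-calibrated date (negative numbers = years BCE).
n_train, n_features = 24, 16
X_train = rng.normal(size=(n_train, n_features))
y_train = rng.uniform(-300, 100, size=n_train)

model = BayesianRidge()
model.fit(X_train, y_train)

# Estimate a date, with uncertainty, for an undated fragment.
x_new = rng.normal(size=(1, n_features))
mean, std = model.predict(x_new, return_std=True)
print(f"estimated date: {mean[0]:.0f} +/- {std[0]:.0f} years")
```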
A trio of AI researchers at KAIST AI, in Korea, has developed what they call a Chain-of-Zoom framework that allows the generation of extreme super-resolution imagery using existing super-resolution models without the need for retraining.
In their study published on the arXiv preprint server, Bryan Sangwoo Kim, Jeongsol Kim, and Jong Chul Ye broke down the process of zooming in on an image and then used an existing super-resolution model at each step to refine the image, resulting in incremental improvements in resolution.
The team began by noting that existing frameworks for improving image resolution tend to rely on interpolation or regression when zooming, which results in blurry imagery. To overcome this, they took a new approach: a stepwise zooming process in which each step refines the output of the one before.
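As a rough illustration of that stepwise idea, the sketch below crops progressively tighter views of an image and runs each crop through a stand-in super-resolution model, so every pass refines the previous one. The 2x zoom per step, the placeholder upscaler, and the PIL-based loop are assumptions made for illustration; the actual framework reuses existing pretrained super-resolution models at each step, as described above.

```python
# Illustrative stepwise zoom: crop the centre, super-resolve, repeat.
from PIL import Image

def upscale_2x_placeholder(img: Image.Image) -> Image.Image:
    """Stand-in for a pretrained 2x super-resolution model (assumption)."""
    return img.resize((img.width * 2, img.height * 2), Image.BICUBIC)

def chain_zoom(image: Image.Image, steps: int = 3) -> Image.Image:
    """Zoom into the image centre over several passes, refining each time."""
    current = image
    for _ in range(steps):
        w, h = current.size
        # Crop the central half of each dimension (a 2x zoom), then recover
        # the lost resolution with the SR model before the next pass.
        box = (w // 4, h // 4, 3 * w // 4, 3 * h // 4)
        current = upscale_2x_placeholder(current.crop(box))
    return current

# Example: result = chain_zoom(Image.open("photo.png"), steps=4)
```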
A team at the University of California, Los Angeles has developed a low-cost diagnostic pen that converts handwriting into electrical signals for early detection of Parkinson’s disease, achieving 96.22% accuracy in a pilot study.
Parkinson’s disease affects the motor system, leading to tremors, stiffness, and slowed movements that degrade fine motor functions such as handwriting. Clinical diagnosis today largely relies on subjective observation, which is prone to inconsistency and often inaccessible in low-resource settings. Biomarker-based diagnostics, while objective, remain constrained by cost and technical complexity.
In the study, “Neural network-assisted personalized handwriting analysis for Parkinson’s disease diagnostics,” published in Nature Chemical Engineering, researchers engineered a diagnostic pen to capture real-time motor signals during handwriting and convert them into quantifiable electrical outputs for disease classification.
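For readers curious about what the classification step might look like, here is a minimal, hypothetical sketch: a small neural network trained on features extracted from the pen's electrical signal to separate Parkinson's samples from controls. The feature definitions, the scikit-learn MLP, and the mock data are all assumptions for illustration, not the pipeline used in the study.

```python
# Illustrative sketch only: classify Parkinson's vs. control from features
# derived from the pen's electrical signal, using a small neural network.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Assumed features per writing sample: e.g. signal amplitude statistics,
# dominant tremor frequency, stroke duration variability (mock values here).
X = rng.normal(size=(200, 12))
y = rng.integers(0, 2, size=200)          # 1 = Parkinson's, 0 = control (mock)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print(f"held-out accuracy on mock data: {clf.score(X_te, y_te):.2f}")
```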