
From AI to Organoids: How Growing Brain-like Structures are Advancing Machine Learning

Artificial intelligence (AI) is usually built with silicon chips and code, but scientists are now exploring something very different. In 2025, they are growing brain organoids: small, living structures made from human stem cells. These organoids behave like simplified versions of the human brain, forming real neural connections, sending electrical signals, and even showing signs of learning and memory.

By linking organoids with AI systems, researchers are beginning to explore new computational approaches. Recent studies have shown that organoids can recognize speech, detect patterns, and respond to input. Living brain tissue may help create AI models that learn and adapt faster than traditional machines, and early results suggest that organoid-based systems could offer a more flexible and energy-efficient form of intelligence.

Brain Organoids and the Emergence of Organoid Intelligence

EpInflammAge: Epigenetic-Inflammatory Clock for Disease-Associated Biological Aging Based on Deep Learning

We present EpInflammAge, an explainable deep learning tool that integrates epigenetic and inflammatory markers to create a highly accurate, disease-sensitive biological age predictor. This novel approach bridges two key hallmarks of aging: epigenetic alterations and immunosenescence. First, epigenetic and inflammatory data from the same participants were used to train AI models that predict the levels of 24 cytokines from blood DNA methylation. Second, open-source epigenetic data (25,000 samples) were used to generate synthetic inflammatory biomarkers and train an age estimation model. Using state-of-the-art deep neural networks optimized for tabular data, EpInflammAge achieves competitive performance against 34 epigenetic clock models, including an overall mean absolute error of 7 years and a Pearson correlation coefficient of 0.85 in healthy controls, while demonstrating robust sensitivity across multiple disease categories. Explainable AI revealed the contribution of each feature to the age prediction. The sensitivity to multiple diseases, achieved by combining inflammatory and epigenetic profiles, is promising for both research and clinical applications. EpInflammAge is released as an easy-to-use web tool that generates age estimates and inflammatory parameter levels from methylation data, along with a detailed per-sample report on the contribution of each input variable to the model output.
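The two-stage design described in the abstract can be illustrated with a toy sketch: first regress cytokine levels on methylation features, then regress age on the resulting synthetic inflammatory markers. The sketch below uses plain least squares on invented data; all array sizes, variable names, and the linear models are assumptions for illustration, not the paper's actual deep-learning architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data standing in for the real cohorts (all sizes/names are invented).
n_samples, n_cpg, n_cytokines = 200, 50, 24
X_meth = rng.random((n_samples, n_cpg))              # methylation beta values
W_true = rng.normal(size=(n_cpg, n_cytokines))
Y_cyto = X_meth @ W_true + rng.normal(scale=0.1, size=(n_samples, n_cytokines))
age = Y_cyto @ rng.normal(size=n_cytokines) + 50.0   # synthetic ages

# Stage 1: one model per cytokine, mapping methylation -> cytokine level.
W_hat, *_ = np.linalg.lstsq(X_meth, Y_cyto, rcond=None)

# Stage 2: generate "synthetic" cytokine profiles from methylation alone,
# then regress age on those predicted inflammatory markers.
Y_synth = X_meth @ W_hat
A = np.column_stack([Y_synth, np.ones(n_samples)])   # add intercept
coef, *_ = np.linalg.lstsq(A, age, rcond=None)

pred_age = A @ coef
mae = np.mean(np.abs(pred_age - age))
print(f"toy MAE: {mae:.2f} years")
```

On this clean linear toy data the fit is nearly exact; the point is only the pipeline shape, not the reported 7-year MAE.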

A qualitative systematic review on AI-empowered self-regulated learning in higher education

npj Science of Learning — A qualitative systematic review on AI-empowered self-regulated learning (SRL) in higher education. Aiming to synthesize empirical studies, we employed a qualitative approach to scrutinize AI’s role in supporting SRL processes. Through a meticulous selection process adhering to PRISMA guidelines, we identified 14 distinct studies that leveraged AI applications, including chatbots, adaptive feedback systems, serious games, and e-textbooks, to support student autonomy. Our findings reveal a nuanced landscape in which AI shows potential to facilitate SRL’s forethought, performance, and reflection phases, yet also show that whether agency is human-centered or AI-centered leads to variations in the SRL model. This review underscores the imperative for balanced AI integration, ensuring technological advantages are harnessed without undermining student self-efficacy. The implications suggest a future where AI is a thoughtfully woven thread in the SRL fabric of higher education, calling for further research to optimize this synergy.

Robot, know thyself: New vision-based system teaches machines to understand their bodies

In an office at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), a soft robotic hand carefully curls its fingers to grasp a small object. The intriguing part isn’t the mechanical design or embedded sensors—in fact, the hand contains none. Instead, the entire system relies on a single camera that watches the robot’s movements and uses that visual data to control it.

This capability comes from a new system CSAIL scientists developed, offering a different perspective on robotic control. Rather than using hand-designed models or complex sensor arrays, it allows robots to learn how their bodies respond to control commands, solely through vision. The approach, called Neural Jacobian Fields (NJF), gives robots a kind of bodily self-awareness.

A paper about the work was published in Nature.
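The core idea behind NJF, as described above, is learning how commands map to observed motion and then inverting that map for control. A minimal toy sketch of that loop, assuming a linear command-to-motion relationship and simulated camera measurements (the real system learns a neural Jacobian field from video, not a single matrix):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical toy robot: 3 actuator commands move 2 tracked visual features.
# The true command -> feature-motion Jacobian is unknown to the controller.
J_true = rng.normal(size=(2, 3))

def observe_motion(u):
    """Camera-side measurement: feature displacement caused by command u."""
    return J_true @ u + rng.normal(scale=0.01, size=2)

# "Self-modeling" phase: probe with random commands and fit a Jacobian
# from visual observations alone, loosely analogous to NJF's training.
U = rng.normal(size=(100, 3))
D = np.array([observe_motion(u) for u in U])
sol, *_ = np.linalg.lstsq(U, D, rcond=None)   # solves U @ sol ~ D
J_hat = sol.T                                  # estimated 2x3 Jacobian

# Control phase: choose the command that best produces a desired motion.
target = np.array([0.5, -0.2])
u_cmd = np.linalg.pinv(J_hat) @ target
achieved = J_true @ u_cmd
print(achieved)  # should land near the target motion
```

The pseudoinverse picks the minimum-norm command achieving the target motion, which is the standard way to invert a learned Jacobian for control.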

Microsoft Edge now an ‘AI-powered browser’ with Copilot Mode

Microsoft has introduced Copilot Mode, an experimental feature designed to transform Microsoft Edge into a web browser powered by artificial intelligence (AI).

As the company explained on Monday, the new mode reworks Edge’s interface: new tabs show a single input box that combines chat, search, and web navigation.

Once Copilot Mode is enabled, the AI assistant will be able to analyze all open browser tabs with the user’s permission, comparing information and assisting with various tasks, such as researching vacation rentals.
