
An unplugged electric instrument may function, but it sounds much better when it is connected to an amplifier. Similarly, toxins and other small molecules at low concentrations in the environment or human body may emit quiet signals that are undetectable without specialized lab technology.

Now, thanks to a “cool trick” in biochemistry used to adapt a sensing platform already deployed by Northwestern scientists to measure toxins in drinking water, researchers can detect and even measure chemicals at concentrations low enough to be useful outside the lab. By attaching circuitry akin to a volume knob to “turn up” weak signals, the team has opened the door for the system to be applied to disease detection and monitoring in the human body, targeting molecules like DNA and RNA as well as bacteria such as E. coli.
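The “volume knob” idea can be made concrete with a toy calculation. The sketch below is purely illustrative Python, not the authors’ genetic circuitry: it assumes a hypothetical Hill-type dose-response, an arbitrary gain of 10, and a made-up noise floor, and simply shows how a downstream amplification stage lowers the smallest concentration that clears that noise floor.

```python
# Toy illustration (not the authors' circuit): how an amplifier stage can
# lower a sensor's limit of detection. All numbers are hypothetical.
import numpy as np

def sensor_response(conc, kd=10.0, max_signal=1.0):
    """Hill-type dose-response of the raw sensor (arbitrary units)."""
    return max_signal * conc / (conc + kd)

def amplified_response(conc, gain=10.0, **kwargs):
    """Same sensor followed by a downstream amplification stage ('volume knob')."""
    return gain * sensor_response(conc, **kwargs)

def limit_of_detection(response_fn, noise_sd=0.05):
    """Smallest concentration whose signal exceeds 3 standard deviations of noise."""
    concentrations = np.logspace(-3, 3, 10_000)
    detectable = response_fn(concentrations) > 3 * noise_sd
    return concentrations[detectable][0] if detectable.any() else np.inf

print("LOD, raw sensor:     ", limit_of_detection(sensor_response))
print("LOD, with amplifier: ", limit_of_detection(amplified_response))
```

With these made-up parameters, the amplified readout crosses the detection threshold at roughly a tenfold lower concentration, mirroring the kind of sensitivity gain described above.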

The results, which describe a system that is 10 times more sensitive than previous cell-free sensors built by the team, are published in the journal Nature Chemical Biology.

Introduction: Hyperthermia is an established adjunct in multimodal cancer treatments, with mechanisms including cell death, immune modulation, and vascular changes. Traditional hyperthermia applications are resource-intensive and often associated with patient morbidity, limiting their clinical accessibility. Gold nanorods (GNRs) offer a precise, minimally invasive alternative by leveraging near-infrared (NIR) light to deliver targeted hyperthermia therapy (THT). THT induces controlled tumor heating, promoting immunogenic cell death (ICD) and modulating the tumor microenvironment (TME) to enhance immune engagement. This study explores the synergistic potential of GNR-mediated THT with immunotherapies in immunogenically ‘cold’ tumors to achieve durable anti-tumor immunity.

Methods: GNRs from Sona Nanotech Inc.™ were intratumorally injected and activated using NIR light to induce mild hyperthermia (42–48°C) for 5 minutes. Tumor responses were analyzed for cell death pathways and immune modulation. The immunogenic effects of THT were assessed alone and in combination with intratumoral interleukin-2 (i.t. IL-2) or systemic PD-1 immune checkpoint blockade. Immune cell infiltration, gene expression changes, and tumor growth kinetics were evaluated.

Results: THT reduced tumor burden through cell death mechanisms, including upregulated ICD marked by calreticulin exposure within 48 hours. By 48 hours, CD45+ immune cell levels were increased, including levels of immunosuppressive M2 macrophages. While THT stimulated innate immune cells, highlighted by upregulated gene expression in the cGAS-STING pathway and enhanced M1 macrophage and dendritic cell levels, tumor regrowth was observed within six days post-treatment. To enhance THT’s immunogenic effects, the therapy was combined with intratumoral interleukin-2 (i.t. IL-2) or systemic PD-1 immune checkpoint blockade. Sequential administration of i.t. IL-2 after THT induced robust CD8+ T-cell infiltration and led to sustained tumor regression in both treated and distant tumors, accompanied by the emergence of memory T cells. However, IL-2-induced immunosuppressive T-reg populations were also sustained to the tumor endpoint, suggesting that the therapy could be further enhanced.

They found that when people with aphantasia try to conjure an image in their mind’s eye, the primary visual cortex – the part of the brain that processes picture-like visual information – is activated, but any images that are produced remain unconscious to the individual.

Published today in Current Biology, the study, carried out by scientists at UNSW and South China Normal University, used a range of techniques to measure brain activity. Their findings challenge the existing theory that activity in the primary visual cortex directly produces conscious visual imagery.

“People with aphantasia actually do seem to have images of a sort, but they remain too weak or distorted to become conscious or be measured by our standard measurement techniques,” says Prof. Joel Pearson, a co-author of the study based at UNSW’s School of Psychology. “This may be because the visual cortex is wired differently, as evidenced by the data in this new study. This research not only deepens our understanding of the brain but also pushes the boundaries of how we think about imagination and consciousness.”

A new biodegradable electrode stimulates brain repair by activating neural precursor cells, dissolving naturally after a week. This breakthrough could transform treatments for neurological disorders like stroke.


Summary: Researchers have developed a flexible, biodegradable electrode capable of stimulating neural precursor cells (NPCs) in the brain, offering a safer and more precise alternative for neural repair. The electrode dissolves naturally after seven days, eliminating the need for surgical removal while promoting tissue regeneration.

Made from FDA-approved materials, the device successfully increased NPC activity in preclinical models without causing significant inflammation or damage. This innovation could significantly expand treatment options for neurological disorders, which are a leading cause of disability worldwide.

Future developments aim to integrate drug and gene therapy delivery into the electrodes for enhanced therapeutic potential.

Western researchers have developed a novel technique using math to understand exactly how neural networks make decisions—a widely recognized but poorly understood process in the field of machine learning.

Many of today’s technologies, from digital assistants like Siri and ChatGPT to self-driving cars, are powered by machine learning. However, the neural networks—computer models inspired by the brain—behind these machine learning systems have been difficult to understand, sometimes earning them the nickname “black boxes” among researchers.

“We create neural networks that can perform computations, while also allowing us to solve the equations that govern the networks’ activity,” said Lyle Muller, mathematics professor and director of Western’s Fields Lab for Network Science, part of the newly created Fields-Western Collaboration Centre. “This mathematical solution lets us ‘open the black box’ to understand precisely how the network does what it does.”
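To illustrate the general idea of a network simple enough that its governing equations can be solved exactly, the minimal sketch below (not the Fields Lab model; weights and sizes are arbitrary) compares a step-by-step simulation of a small linear recurrent network with the closed-form solution of its dynamics.

```python
# Minimal sketch: a linear recurrent network whose governing equation
# dx/dt = W x can be solved in closed form, so every state reached by the
# simulation can also be written down analytically ("opening the black box").
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
n = 8
W = rng.normal(scale=0.2, size=(n, n))   # recurrent weights (hypothetical)
W = W - 1.0 * np.eye(n)                  # damping term so trajectories stay bounded
x0 = rng.normal(size=n)                  # initial network state

# Numerical simulation with small Euler steps
dt, steps = 0.001, 2000
x = x0.copy()
for _ in range(steps):
    x = x + dt * W @ x

# Closed-form solution x(t) = exp(W t) x0
t = dt * steps
x_exact = expm(W * t) @ x0

print("max |simulated - analytical| =", np.max(np.abs(x - x_exact)))
```

The two trajectories agree to numerical precision, which is the sense in which an exactly solvable network lets researchers state precisely how activity unfolds rather than only observing it empirically.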

Zerodha co-founder Nithin Kamath triggered a conversation online after he shared the wisdom of 92-year-old US-based mathematician and professor Edward Thorp on longevity on social media.

In a post on X, Kamath praised Thorp’s advice, calling it “brilliant” and stating, “This is the only longevity expert you need to listen to.”

Thorp’s message delves into a balanced approach to living a long and healthy life. His philosophy combines “defence,” which involves mitigating risks like cardiovascular diseases through diet, exercise, and regular check-ups, and “offence,” with an emphasis on exercise as a “magic bullet” to extend both lifespan and health span.

DLB is a common cause of dementia. It begins with the abnormal accumulation of the protein alpha-synuclein in the brain, which produces degeneration of the brain and causes problems with thinking, movement, and behavior. Eventually, the disease leads to dementia and death. Doctors use an imaging technique called FDG-PET to assess how the brain is affected in DLB. However, until now, there was no information on how these brain changes develop over time.

The study, led by Dr. Daniel Ferreira at the Department of Neurobiology, Care Sciences and Society, followed 35 patients with DLB, 37 patients with early-stage DLB (called prodromal DLB), and 100 healthy people from Mayo Clinic (USA), for an average of 3.8 years. The researchers found that brain degeneration starts early in prodromal DLB and worsens as the disease progresses.

“We discovered that people with prodromal DLB had faster degeneration in certain brain areas compared to healthy individuals,” said Dr. Ferreira. “This information is crucial for monitoring disease progression from early stages and planning clinical trials for new treatments.”

Longitudinal FDG-PET metabolic change along the Lewy body continuum.

NEW YORK—Synchron, a category-defining brain-computer interface (BCI) company, announced today a step forward in implantable BCI technology to drive the future of neurotechnology. Synchron’s BCI technology, in combination with the NVIDIA Holoscan platform, is poised to redefine the possibilities of real-time neural interaction and intelligent edge processing.

Synchron will leverage NVIDIA Holoscan to advance a next-generation implantable BCI in two key domains. First, Synchron will enhance real-time edge AI capabilities for on-device neural processing, improving signal processing and multi-AI inference technology. This will reduce system latency, bolster privacy, and provide users with a more responsive and intuitive BCI experience. NVIDIA Holoscan provides Synchron with: (i) a unified framework supporting diverse AI models and data modalities; and (ii) an optimized application framework spanning seamless sensor I/O integration, GPU-direct data ingestion, accelerated computing, and real-time AI.

Second, Synchron will explore the development of a groundbreaking foundation model for brain inference. By processing Synchron’s neural data on an unprecedented scale, this initiative will create scalable, interpretable brain-language models with the potential to transform neuroprosthetics, cognitive expression, and seamless interaction with digital devices.

“Synchron’s vision is to scale neurotechnology to empower humans to connect to the world, and the NVIDIA Holoscan platform provides the ideal foundation,” said Tom Oxley, M.D., Ph.D., CEO & Founder, Synchron. “Through this work, we’re setting a new benchmark for what BCIs can achieve.”
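For a rough sense of what low-latency, on-device neural decoding involves, the sketch below is generic illustrative Python only: it does not use the NVIDIA Holoscan API or Synchron’s software, and the channel count, window size, latency budget, and decode_intent readout are all hypothetical stand-ins for the ingest, preprocess, and inference stages described above.

```python
# Illustrative streaming-decode loop (hypothetical; not the Holoscan API or
# Synchron's pipeline). Shows the structure: ingest a window of neural samples,
# preprocess on-device, run inference, and track per-window latency.
import time
import numpy as np

N_CHANNELS = 16          # assumed electrode count
WINDOW = 256             # samples per decoding window
LATENCY_BUDGET_MS = 50.0 # assumed real-time target

def read_window() -> np.ndarray:
    """Stand-in for sensor I/O: returns one window of raw neural data."""
    return np.random.randn(N_CHANNELS, WINDOW)

def preprocess(x: np.ndarray) -> np.ndarray:
    """Simple on-device feature extraction: per-channel power proxy."""
    return np.log1p(np.mean(x ** 2, axis=1))

def decode_intent(features: np.ndarray) -> int:
    """Placeholder for a trained decoder; here, an arbitrary linear readout."""
    weights = np.linspace(-1, 1, features.size)
    return int(features @ weights > 0)   # e.g. 1 = "select", 0 = "no action"

for _ in range(10):                      # ten decoding windows
    t0 = time.perf_counter()
    command = decode_intent(preprocess(read_window()))
    elapsed_ms = (time.perf_counter() - t0) * 1000
    assert elapsed_ms < LATENCY_BUDGET_MS, "window exceeded latency budget"
    print(f"decoded command={command} in {elapsed_ms:.2f} ms")
```

The point of keeping this loop on the implant-adjacent hardware rather than in the cloud is the one made above: lower latency and fewer raw neural data leaving the device.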

Source: NUS

Researchers have uncovered novel insights into how brain function disruptions related to cerebrovascular disease (CeVD) interact with Alzheimer’s disease (AD) pathology to impact neurodegeneration and cognition in older adults.

Led by Associate Professor Juan Helen Zhou, Director of the Centre for Translational Magnetic Resonance Research, Yong Loo Lin School of Medicine, National University of Singapore (NUS Medicine), the research team revealed a brain functional connectome phenotype that is related to multiple CeVD markers and contributes additively to cognitive decline and neurodegeneration alongside AD.

Red light exposure may reduce blood clot risks, according to groundbreaking research. By lowering inflammation and platelet activity, it could prevent strokes, heart attacks, and more. Clinical trials are next.


The ability of products released during platelet activation to induce thrombosis-promoting neutrophil extracellular trap (NET) formation was quantified. Subsequent thrombosis was measured using murine models of venous thrombosis (VT) and stroke.

To translate our findings to human patients, cataract patients who received light-filtering lenses were evaluated over an 8-year period for the rate of venous thromboembolism, using multivariable logistic regression clustered by hospital.
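For readers unfamiliar with that statistical approach, the sketch below shows one common way to fit a multivariable logistic regression with standard errors clustered by hospital, using statsmodels on simulated data with hypothetical column names (vte, light_filtering_lens, age, sex, hospital); it illustrates the general technique, not the study’s actual analysis code.

```python
# Hedged sketch of a multivariable logistic regression with cluster-robust
# standard errors (clustered by hospital). Data and column names are made up.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "vte": rng.integers(0, 2, n),                   # venous thromboembolism (0/1)
    "light_filtering_lens": rng.integers(0, 2, n),  # exposure of interest (0/1)
    "age": rng.normal(70, 8, n),
    "sex": rng.integers(0, 2, n),
    "hospital": rng.integers(0, 25, n),             # clustering variable
})

model = smf.logit("vte ~ light_filtering_lens + age + sex", data=df)
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["hospital"]})
print(result.summary())
print("Odds ratios:\n", np.exp(result.params))
```

Clustering the standard errors by hospital accounts for the fact that patients treated at the same center are not fully independent observations.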

Exposure to long-wavelength red light resulted in reduced platelet aggregation and activation. RNA-seq analysis demonstrated no significant transcriptomic changes between red light-exposed and white light-exposed mice.