Archive for the ‘neuroscience’ category: Page 122

Feb 27, 2024

Nasal drops might prevent PTSD

Posted by in categories: existential risks, neuroscience

New research shows that nasal drops of neuropeptide Y trigger extinction of fear memories in an animal model of PTSD.

Feb 27, 2024

Biomarker Changes during 20 Years Preceding Alzheimer’s Disease

Posted by in categories: biotech/medical, chemistry, life extension, neuroscience

We conducted a multicenter, nested case–control study of Alzheimer’s disease biomarkers in cognitively normal participants who were enrolled in the China Cognition and Aging Study from January 2000 through December 2020. A subgroup of these participants underwent testing of cerebrospinal fluid (CSF), cognitive assessments, and brain imaging at 2-to-3-year intervals. A total of 648 participants in whom Alzheimer’s disease developed were matched with 648 participants who had normal cognition, and the temporal trajectories of CSF biochemical marker concentrations, cognitive testing, and imaging were analyzed in the two groups.

The median follow-up was 19.9 years (interquartile range, 19.5 to 20.2). CSF and imaging biomarkers in the Alzheimer’s disease group diverged from those in the cognitively normal group at the following estimated number of years before diagnosis: amyloid-beta 42 (Aβ42), 18 years; the ratio of Aβ42 to Aβ40, 14 years; phosphorylated tau 181, 11 years; total tau, 10 years; neurofilament light chain, 9 years; hippocampal volume, 8 years; and cognitive decline, 6 years. As cognitive impairment progressed, the changes in CSF biomarker levels in the Alzheimer’s disease group initially accelerated and then slowed.
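Purely as a reading aid, the divergence times reported above can be restated as a small lookup table. The sketch below (Python; the variable name is chosen here for illustration, not taken from the study) simply re-lists the figures in the order the biomarkers become abnormal.

```python
# Years before clinical Alzheimer's diagnosis at which each biomarker diverged
# from the cognitively normal group (figures restated from the summary above).
divergence_years = {
    "CSF amyloid-beta 42 (Abeta42)": 18,
    "CSF Abeta42/Abeta40 ratio": 14,
    "CSF phosphorylated tau 181": 11,
    "CSF total tau": 10,
    "CSF neurofilament light chain": 9,
    "hippocampal volume (MRI)": 8,
    "cognitive decline (testing)": 6,
}

# List markers in the order they become abnormal (earliest divergence first).
for marker, years in sorted(divergence_years.items(), key=lambda kv: -kv[1]):
    print(f"{marker}: ~{years} years before diagnosis")
```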

In this study involving Chinese participants during the 20 years preceding clinical diagnosis of sporadic Alzheimer’s disease, we observed the time courses of CSF biomarkers, the times before diagnosis at which they diverged from the biomarkers from a matched group of participants who remained cognitively normal, and the temporal order in which the biomarkers became abnormal. (Funded by the Key Project of the National Natural Science Foundation of China and others; ClinicalTrials.gov number, NCT03653156.)

Feb 27, 2024

Facial Recognition Meets Mental Health: MoodCapture App Identifies Depression Early

Posted by in categories: biotech/medical, health, mobile phones, neuroscience, robotics/AI

Can smartphone apps be used to monitor a user’s mental health? This is what a recently submitted study, scheduled to be presented at the 2024 ACM CHI Conference on Human Factors in Computing Systems, hopes to address. A collaborative team of researchers from Dartmouth College has developed a smartphone app known as MoodCapture that is capable of evaluating signs of depression from images taken with a user’s front-facing camera. This study holds the potential to help scientists, medical professionals, and patients better understand how to identify signs of depression so that proper evaluation and treatment can be provided.

For the study, the researchers enlisted 177 participants for a 90-day trial designed to use their phones’ front-facing cameras to capture facial images throughout their daily lives, including while the participants answered the survey question “I have felt down, depressed, or hopeless.” All participants consented to the images being taken at random times, not only when they used the camera to unlock their phone. During the study period, the researchers obtained more than 125,000 images and even accounted for the surrounding environment in their final analysis. In the end, the researchers found that MoodCapture exhibited 75 percent accuracy when attempting to identify early signs of depression.
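To make the headline figure concrete: 75 percent accuracy for a binary screen simply means that three out of four predictions matched the participant’s self-reported answer. Here is a minimal, hypothetical sketch of that calculation (the function and data are illustrative only and are not from the MoodCapture study):

```python
# Hypothetical illustration of accuracy for a binary depression screen:
# 'predicted' would come from an image-based model such as MoodCapture,
# 'reported' from the participant's answer to the survey item (1 = yes, 0 = no).
def accuracy(predicted: list[int], reported: list[int]) -> float:
    """Fraction of predictions that match the self-reported label."""
    assert len(predicted) == len(reported), "label lists must be the same length"
    correct = sum(p == r for p, r in zip(predicted, reported))
    return correct / len(predicted)

# Toy example: 8 of 10 predictions agree with the self-reports -> 0.8 accuracy.
print(accuracy([1, 0, 1, 1, 0, 0, 1, 0, 1, 0],
               [1, 0, 1, 0, 0, 0, 1, 1, 1, 0]))
```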

“This is the first time that natural ‘in-the-wild’ images have been used to predict depression,” said Dr. Andrew Campbell, who is a professor in the Computer Science Department at Dartmouth and a co-author on the study. “There’s been a movement for digital mental-health technology to ultimately come up with a tool that can predict mood in people diagnosed with major depression in a reliable and non-intrusive way.”

Feb 27, 2024

Frontiers: Neuromorphic engineering (NE) encompasses a diverse range of approaches to information processing that are inspired by neurobiological systems

Posted by in categories: biotech/medical, information science, neuroscience, robotics/AI, supercomputing

This feature distinguishes neuromorphic systems from conventional computing systems. The brain has evolved over billions of years to solve difficult engineering problems by using efficient, parallel, low-power computation. The goal of NE is to design systems capable of brain-like computation. Numerous large-scale neuromorphic projects have emerged recently. This interdisciplinary field was listed among the top 10 technology breakthroughs of 2014 by the MIT Technology Review and among the top 10 emerging technologies of 2015 by the World Economic Forum. NE has two goals: first, a scientific goal to understand the computational properties of biological neural systems by using models implemented in integrated circuits (ICs); second, an engineering goal to exploit the known properties of biological systems to design and implement efficient devices for engineering applications. Building hardware neural emulators can be extremely useful for simulating large-scale neural models to explain how intelligent behavior arises in the brain. The principal advantages of neuromorphic emulators are that they are highly energy efficient, parallel and distributed, and require a small silicon area. Thus, compared to conventional CPUs, these neuromorphic emulators are beneficial in many engineering applications such as the porting of deep learning algorithms for various recognition tasks. In this review article, we describe some of the most significant neuromorphic spiking emulators, compare the different architectures and approaches used by them, illustrate their advantages and drawbacks, and highlight the capabilities that each can deliver to neural modelers. This article focuses on the discussion of large-scale emulators and is a continuation of a previous review of various neural and synapse circuits (Indiveri et al., 2011). We also explore applications where these emulators have been used and discuss some of their promising future applications.

“Building a vast digital simulation of the brain could transform neuroscience and medicine and reveal new ways of making more powerful computers” (Markram et al., 2011). The human brain is by far the most computationally complex, efficient, and robust computing system operating under low-power and small-size constraints. It utilizes over 100 billion neurons and 100 trillion synapses for achieving these specifications. Even the existing supercomputing platforms are unable to demonstrate full cortex simulation in real-time with the complex detailed neuron models. For example, for mouse-scale (2.5 × 10⁶ neurons) cortical simulations, a personal computer uses 40,000 times more power but runs 9,000 times slower than a mouse brain (Eliasmith et al., 2012). The simulation of a human-scale cortical model (2 × 10¹⁰ neurons), which is the goal of the Human Brain Project, is projected to require an exascale supercomputer (10¹⁸ flops) and as much power as a quarter-million households (0.5 GW).
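As a back-of-the-envelope reading of the figures above, the 40,000-fold power penalty and the 9,000-fold slowdown compound when comparing energy per simulated second, putting the personal computer at roughly 10⁸ times the mouse brain’s energy cost. A short sketch of that arithmetic (the numbers come from the paragraph above; combining them this way is an illustrative reading, not a claim from the paper):

```python
# Rough energy-efficiency comparison implied by the mouse-scale figures above
# (Eliasmith et al., 2012): a personal computer draws ~40,000x the power of a
# mouse brain and runs the cortical simulation ~9,000x slower than real time.
power_ratio = 40_000   # PC power / mouse-brain power
slowdown = 9_000       # wall-clock seconds per simulated (biological) second

# Energy = power x time, so the ratios multiply per simulated second.
energy_ratio = power_ratio * slowdown
print(f"Energy per simulated second: ~{energy_ratio:.1e}x a mouse brain")  # ~3.6e+08
```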

The electronics industry is seeking solutions that will enable computers to handle the enormous increase in data processing requirements. Neuromorphic computing is an alternative solution that is inspired by the computational capabilities of the brain. The observation that the brain operates on analog principles of the physics of neural computation that are fundamentally different from digital principles in traditional computing has initiated investigations in the field of neuromorphic engineering (NE) (Mead, 1989a). Silicon neurons are hybrid analog/digital very-large-scale integrated (VLSI) circuits that emulate the electrophysiological behavior of real neurons and synapses. Neural networks using silicon neurons can be emulated directly in hardware rather than being limited to simulations on a general-purpose computer. Such hardware emulations are much more energy efficient than computer simulations, and thus suitable for real-time, large-scale neural emulations.
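To make “emulating the electrophysiological behavior of real neurons” more concrete, below is a minimal software sketch of a leaky integrate-and-fire neuron, the kind of membrane dynamics that silicon neuron circuits implement directly in analog/digital VLSI. The parameter values are generic textbook choices for illustration, not taken from any particular chip or from the review.

```python
# Minimal leaky integrate-and-fire (LIF) neuron: the membrane potential leaks
# toward rest, integrates injected current, and emits a spike on crossing
# threshold. Silicon neurons realize comparable dynamics in VLSI hardware.
def simulate_lif(input_current, dt=1e-4, tau=20e-3, v_rest=-65e-3,
                 v_thresh=-50e-3, v_reset=-70e-3, resistance=1e8):
    v = v_rest
    spike_times = []
    for step, i_in in enumerate(input_current):
        # tau * dV/dt = -(V - V_rest) + R * I   (forward-Euler integration)
        v += dt * (-(v - v_rest) + resistance * i_in) / tau
        if v >= v_thresh:
            spike_times.append(step * dt)  # record the spike time in seconds
            v = v_reset                    # reset the membrane after the spike
    return spike_times

# Drive the neuron with a constant 200 pA current for 100 ms (1,000 steps of 0.1 ms).
spikes = simulate_lif([200e-12] * 1000)
print(f"{len(spikes)} spikes; first at {spikes[0] * 1000:.1f} ms" if spikes else "no spikes")
```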

Feb 27, 2024

Can We Upload Our Minds to a Computer?

Posted by in categories: computing, neuroscience

Unless we solve the problem of consciousness, the endeavour remains a dead end.

Feb 27, 2024

Chinese philosopher’s brain frozen for science, causing stir among scholars

Posted by in categories: cryonics, life extension, neuroscience, science

Science and Technology: I don’t want to die.


A friend of the academic, who died in the US in 2021, says his brain has been cryonically preserved in accordance with his final wishes.

Feb 27, 2024

Alzheimer’s: Abdominal fat linked to poor brain health, cognition

Posted by in categories: biotech/medical, health, neuroscience

A new study suggests abdominal fat could impact brain health and cognition among people with a high risk of developing Alzheimer’s disease. Researchers found that middle-aged males at risk for Alzheimer’s who had higher amounts of pancreatic fat had lower cognition and brain volumes.

Feb 27, 2024

The Strange Spooky Possibilities of Neural Dust

Posted by in category: neuroscience

An exploration of brain augmentation, especially that which involves neural dust. The new JMG Clips channel for sleep! https://youtu.be/Lf2iz6LqCxQ?si=S-qdipBH

Feb 27, 2024

First-in-human study tracks brain response to economic exchange

Posted by in categories: biotech/medical, neuroscience

New study links dopamine and serotonin changes in a brain region to social perception and decision-making regarding offer acceptance.


Scientists unravel dopamine-serotonin dynamics in a first-in-humans study, unveiling social decision-making during awake surgery in Parkinson’s patients while tracking economic exchange.

Feb 27, 2024

Why do we sleep?

Posted by in category: neuroscience

Yang Dan aims to elucidate the mechanisms in the mammalian brain that control sleep. Find out more about Dan’s research in this feature.