
A small nucleus in the brainstem called the locus coeruleus (literally the “blue spot”) is the primary source of a major neuromodulator, norepinephrine (NE), an important mediator of the ‘fight or flight’ response in animals. However, very little is known about the local connections of this small albeit critically important group of neurons. A recent pioneering study published in eLife from the laboratory of Dr. Xiaolong Jiang, an investigator at the Jan and Dan Duncan Neurological Research Institute (Duncan NRI) at Texas Children’s Hospital and assistant professor at Baylor College of Medicine, now reveals the cellular composition and circuit organization of the locus coeruleus in adult mice.

“In this study, we undertook the arduous task of mapping local connections of NE-producing neurons in the locus coeruleus,” Dr. Jiang said. “This is the first study of such an unprecedented magnitude and detail to be performed on the locus coeruleus, and in fact, on any monoamine neurotransmitter system. Our study has revealed that the neurons in the locus coeruleus have an unexpectedly rich cellular heterogeneity and local wiring logic.”

The locus coeruleus (LC) houses the vast majority of norepinephrine-releasing neurons in the brain and regulates many fundamental brain functions, including the fight or flight response, sleep/wake cycles, and attention control. Located in the pontine region of the brainstem, LC neurons sense existential dangers or threats in our external environment and send signals to alert other brain regions of the impending danger.

In the first public event presenting the Artemis III Lunar Space Suit, NASA revealed the prototype that will be worn by the first woman and first person of color to go to the Moon. Made by Axiom Space, the next-gen spacesuit will eventually be white, but it is currently on display with a black cover while Axiom finalizes the design of the top layer.

The Axiom Extravehicular Mobility Unit, or AxEMU (fingers crossed this is the brief for the mission’s zero-gravity indicator plushie), got a grand reveal at Space Center Houston’s Moon 2 Mars Festival. As a prototype, it’ll join a fleet of training suits sent to NASA later this year so that astronauts can begin preparing for the next crewed lunar landing, Artemis III, set to take place in 2025.

“When that first woman steps down on the surface of the Moon on Artemis III, she’s going to be wearing an Axiom Spacesuit,” said NASA Associate Administrator Bob Cabana at the reveal. “We’re going back to the Moon, but we’re going to the South Pole this time. Why are we going there? It’s challenging.”

Breakthroughs don’t often happen in neuroscience, but we just had one. In a tour-de-force, an international team released the full brain connectivity map of the young fruit fly, described in a paper published last week in Science. Containing 3,016 neurons and 548,000 synapses, the map—called a connectome—is the most complex whole-brain wiring diagram to date.

“It’s a ‘wow,’” said Dr. Shinya Yamamoto at Baylor College of Medicine, who was not involved in the work.

Why care about a fruit fly? Far from an uninvited guest at the dinner table, Drosophila melanogaster is a neuroscience darling. Although its brain is smaller than a poppy seed—a far cry from the 100 billion neurons that power human brains—the fly’s neural system is built on principles similar to those that underlie our own brains.

A study of the electron excitation response of DNA to proton radiation has elucidated mechanisms of damage incurred during proton radiotherapy.

Radiobiology studies on the effects of ionizing radiation on human health focus on the deoxyribonucleic acid (DNA) molecule as the primary target for deleterious outcomes. The interaction of ionizing radiation with tissue and organs can lead to localized energy deposition large enough to instigate double strand breaks in DNA, which can lead to mutations, chromosomal aberrations, and changes in gene expression. Understanding the mechanisms behind these interactions is critical for developing radiation therapies and improving radiation protection strategies. Christopher Shepard of the University of North Carolina at Chapel Hill and his colleagues now use powerful computer simulations to show exactly what part of the DNA molecule receives damaging levels of energy when exposed to charged-particle radiation (Fig. 1) [1]. Their findings could eventually help to minimize the long-term radiation effects from cancer treatments and human spaceflight.

The interaction of radiation with DNA’s electronic structure is a complex process [2, 3]. The numerical models currently used in radiobiology and clinical radiotherapy do not capture the detailed dynamics of these interactions at the atomic level. Rather, these models use geometric cross-sections to predict whether a particle of radiation, such as a photon or an ion, crossing the cell volume will transfer sufficient energy to cause a break in one or both of the DNA strands [4–6]. The models do not describe the atomic-level interactions but simply provide the probability that some dose of radiation will cause a population of cells to lose their ability to reproduce.
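To make the idea of a cross-section-based model concrete, here is a minimal sketch in Python of the kind of coarse-grained calculation the passage describes: a Poisson “hit” probability set by an effective target cross-section, plus the linear-quadratic survival curve widely used in clinical radiotherapy. The function names and all numerical values are illustrative assumptions, not figures from the study.

```python
# Minimal sketch of a cross-section-style radiobiology model (illustrative only,
# not the authors' simulation). Classic target theory treats strand breaks as
# Poisson events, with an effective cross-section sigma setting how likely a
# traversing particle is to deposit enough energy in the DNA "target".
import numpy as np

def double_strand_break_prob(fluence, sigma):
    """Probability that at least one double-strand break occurs.

    fluence : particles per cm^2 crossing the cell volume
    sigma   : effective geometric cross-section of the target, in cm^2
    """
    expected_hits = fluence * sigma          # mean of the Poisson hit count
    return 1.0 - np.exp(-expected_hits)      # P(at least one hit)

def surviving_fraction(dose_gy, alpha=0.2, beta=0.02):
    """Linear-quadratic survival model used in radiotherapy planning.

    dose_gy     : absorbed dose in gray
    alpha, beta : tissue-dependent fit parameters (illustrative values here)
    """
    return np.exp(-(alpha * dose_gy + beta * dose_gy ** 2))

for dose in (1.0, 2.0, 5.0):
    print(f"{dose:.0f} Gy -> surviving fraction ~ {surviving_fraction(dose):.2f}")
print("P(break) at fluence 1e7 /cm^2, sigma 1e-8 cm^2:",
      round(double_strand_break_prob(1e7, 1e-8), 2))
```

Both functions return probabilities for whole populations of cells; they say nothing about which atoms in the DNA absorb the energy, which is exactly the gap the atomistic simulations described above aim to fill.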

Strongly correlated systems are systems made of particles that strongly interact with one another, to such an extent that their individual behavior depends on the behavior of all other particles in the system. In states that are far from equilibrium, these systems can sometimes give rise to fascinating and unexpected physical phenomena, such as many-body localization.

Many-body localization occurs when a system made of interacting particles fails to reach thermal equilibrium, even at high temperatures. In many-body localized systems, particles thus remain in a state of non-equilibrium for long periods of time, even when a lot of energy is flowing through them.

Theoretical predictions suggest that the instability of the many-body localized phase is caused by small thermal inclusions in the strongly interacting system that act as a bath. These inclusions prompt the delocalization of the entire system, through a mechanism that is known as avalanche propagation.
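For readers who want to see how localization is probed numerically, the sketch below uses a standard toy model, a disordered spin-1/2 Heisenberg chain, rather than anything specific to the study described here. It diagonalizes small chains and computes the mean level-spacing ratio, a common diagnostic that sits near 0.53 in a thermalizing system and drops toward 0.39 in a localized one as disorder grows.

```python
# Minimal sketch (assumption: the textbook disordered Heisenberg chain, not the
# system studied above). Exact diagonalization in the zero-magnetization sector,
# followed by the mean level-spacing ratio <r>: ~0.53 signals a thermal phase,
# ~0.39 a many-body localized one.
import numpy as np

def build_hamiltonian(L, h_fields, J=1.0):
    """Disordered spin-1/2 Heisenberg chain restricted to the Sz = 0 sector."""
    states = [s for s in range(2 ** L) if bin(s).count("1") == L // 2]
    index = {s: k for k, s in enumerate(states)}
    H = np.zeros((len(states), len(states)))
    for k, s in enumerate(states):
        sz = [0.5 if (s >> i) & 1 else -0.5 for i in range(L)]
        # diagonal part: Sz-Sz coupling plus random on-site fields
        H[k, k] = sum(J * sz[i] * sz[i + 1] for i in range(L - 1))
        H[k, k] += sum(h_fields[i] * sz[i] for i in range(L))
        # off-diagonal part: spin flips on anti-aligned neighbours
        for i in range(L - 1):
            if ((s >> i) & 1) != ((s >> (i + 1)) & 1):
                flipped = s ^ (1 << i) ^ (1 << (i + 1))
                H[index[flipped], k] += 0.5 * J
    return H

def mean_gap_ratio(eigvals):
    """Mean ratio of consecutive level spacings."""
    gaps = np.diff(np.sort(eigvals))
    return np.mean(np.minimum(gaps[:-1], gaps[1:]) /
                   np.maximum(gaps[:-1], gaps[1:]))

rng = np.random.default_rng(0)
L = 10
for W in (1.0, 8.0):                # weak vs strong disorder strength
    ratios = []
    for _ in range(20):             # average over disorder realizations
        h = rng.uniform(-W, W, size=L)
        ratios.append(mean_gap_ratio(np.linalg.eigvalsh(build_hamiltonian(L, h))))
    print(f"W = {W}: <r> ~ {np.mean(ratios):.3f}")
```

Restricting to a single magnetization sector matters: without it, uncoupled symmetry sectors would mimic Poisson statistics even in the thermal phase.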

Graph neural networks (GNNs) are promising machine learning architectures designed to analyze data that can be represented as graphs. These architectures have achieved very promising results in a variety of real-world applications, including drug discovery, social network design, and recommender systems.

As graph-structured data can be highly complex, graph-based machine learning architectures should be designed carefully and effectively. In addition, these architectures should ideally run on efficient hardware that supports their computational demands without consuming too much power.

Researchers at the University of Hong Kong, the Chinese Academy of Sciences, InnoHK Centers and other institutes worldwide recently developed a software-hardware system that combines a GNN architecture with resistive memory, a type of memory that stores data in the form of a resistive state. Their paper, published in Nature Machine Intelligence, demonstrates the potential of new hardware solutions based on resistive memories for efficiently running graph machine learning techniques.
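As a rough illustration of what a GNN layer computes, the sketch below implements a generic graph-convolution layer in Python with NumPy. It is an assumed stand-in, not the architecture or the resistive-memory mapping from the paper: each node aggregates its neighbours' features through a normalized adjacency matrix and then applies a learned linear transform, and it is dense matrix products of exactly this kind that in-memory resistive crossbars are well suited to accelerate.

```python
# Minimal sketch of a single GCN-style graph-convolution layer (an assumption
# for illustration, not the paper's architecture).
import numpy as np

def gcn_layer(adjacency, features, weights):
    """One graph-convolution layer: ReLU(D^-1/2 (A + I) D^-1/2  X  W)."""
    a_hat = adjacency + np.eye(adjacency.shape[0])     # add self-loops
    deg = a_hat.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
    a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt           # symmetric normalization
    return np.maximum(a_norm @ features @ weights, 0)  # aggregate, transform, ReLU

# Toy example: a 4-node graph (say, a tiny molecule), 3 input features per node,
# 2 output features per node; the weights are random here rather than trained.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 1],
              [0, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)
X = rng.normal(size=(4, 3))        # node feature matrix
W = rng.normal(size=(3, 2))        # layer weights
print(gcn_layer(A, X, W))          # new 4 x 2 node representations
```

Stacking a few such layers lets information propagate several hops across the graph, which is what makes the approach useful for molecules, social networks and recommendation graphs alike.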

Water covers 71% of Earth’s surface, but no one knows how or when such massive quantities of water arrived on Earth.

A new study published in the journal Nature brings scientists one step closer to answering that question. Led by University of Maryland Assistant Professor of Geology Megan Newcombe, researchers analyzed melted meteorites that had been floating around in space since the solar system’s formation 4 1/2 billion years ago. They found that these meteorites had extremely low water content—in fact, they were among the driest extraterrestrial materials ever measured.

These results, which let researchers rule these meteorites out as the primary source of Earth’s water, could have important implications for the search for water—and life—on other planets. They also help researchers understand the unlikely conditions that aligned to make Earth a habitable planet.