Nobel Prize in Physics 2024

Thanks to their work from the 1980s and onward, John Hopfield and Geoffrey Hinton have helped lay the foundation for the machine learning revolution that started around 2010.

The development we are now witnessing has been made possible through access to the vast amounts of data that can be used to train networks, and through the enormous increase in computing power. Today’s artificial neural networks are often enormous and constructed from many layers. These are called deep neural networks and the way they are trained is called deep learning.

A quick glance at Hopfield’s article on associative memory, from 1982, provides some perspective on this development. In it, he used a network with 30 nodes. If all the nodes are connected to each other, there are 435 connections. The nodes have their values, the connections have different strengths and, in total, there are fewer than 500 parameters to keep track of. He also tried a network with 100 nodes, but this was too complicated, given the computer he was using at the time. We can compare this to the large language models of today, which are built as networks that can contain more than one trillion parameters (one million millions).
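To put those numbers in context, here is a minimal pure-Python sketch of a 30-node Hopfield-style associative memory. It is not Hopfield's original code; the random pattern, the seed, and the synchronous update schedule are illustrative choices, but the parameter count and the Hebbian one-pattern storage rule follow his 1982 setup.

```python
import random

N = 30  # nodes, as in Hopfield's 1982 network
# Every pair of distinct nodes shares one symmetric connection:
connections = N * (N - 1) // 2
print(connections)  # 435

random.seed(0)
pattern = [random.choice([-1, 1]) for _ in range(N)]

# Hebbian weights, zero on the diagonal (no self-connections)
W = [[0 if i == j else pattern[i] * pattern[j] for j in range(N)]
     for i in range(N)]

# Recall: start from a corrupted copy (5 of 30 bits flipped),
# then repeatedly update every node toward its weighted input.
state = pattern[:]
for i in range(5):
    state[i] = -state[i]
for _ in range(10):
    state = [1 if sum(W[i][j] * state[j] for j in range(N)) >= 0 else -1
             for i in range(N)]
print(state == pattern)  # True
```

Together with the N node values, this network has well under 500 numbers to track, which is the comparison the article draws against today's trillion-parameter language models.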

How neuron groups team up to embed memories in context

Humans have the remarkable ability to remember the same person or object in completely different situations. We can easily distinguish between dinner with a friend and a business meeting with the same friend. “We already know that deep in the memory centers of the brain, specific cells, called concept neurons, respond to this friend, regardless of the environment in which he appears,” says Prof. Florian Mormann from the Clinic for Epileptology at the UKB, who is also a member of the Transdisciplinary Research Area (TRA) Life & Health at the University of Bonn.

However, the brain must be able to combine this content with the context in order to form a useful memory. In rodents, individual neurons often mix these two pieces of information. “We asked ourselves: Does the human brain function fundamentally differently here? Does it map content and context separately to enable a more flexible memory? And how do these separate pieces of information connect when we need to remember specific content according to context?” says Dr. Marcel Bausch, working group leader at the Department of Epileptology and member of TRA Life & Health at the University of Bonn.

Bright light suppresses eating and weight gain in mice

Past research has found that exposure to bright lights and high levels of noise can alter both physiological processes and human behavior. For instance, increased or reduced exposure to bright light and noise has been found to influence people's sleep patterns, circadian rhythm, mood, metabolism, stress levels and mental performance.

Researchers at Jinan University and other institutes in China recently carried out a new study involving mice, exploring the possibility that the exposure to bright lights also influences eating behavior and body weight. Their findings, published in Nature Neuroscience, suggest that bright light exposure suppresses food consumption in mice and can lead to weight loss, while also identifying neural processes that could support these light-induced changes in feeding behavior.

“Environmental light regulates nonimage-forming functions like feeding, and bright light therapy shows anti-obesity potential, yet its neural basis remains unclear,” wrote Wen Li, Xiaodan Huang and their colleagues in their paper. “We show that bright light treatment effectively reduces food intake and mitigates weight gain in mice through a visual circuit involving the lateral hypothalamic area (LHA).”

Making the invisible visible: Space particles become observable through handheld invention

You can’t see, feel, hear, taste or smell them, but tiny particles from space are constantly raining down on us.

They come from cosmic rays—high-energy particles that can originate from exploding stars and other extreme astrophysical events far beyond our solar system. When the rays collide with atoms high in Earth’s protective atmosphere, they trigger a cascade of secondary particles. Among the most important of these new particles are muons, which can pass through the atmosphere and even penetrate into the ground.

An invention by University of Delaware physics professor Spencer Axani called CosmicWatch is putting the science of muons in the palms of experienced scientists and high school students alike.

New evidence for a particle system that ‘remembers’ its previous quantum states

In the future, quantum computers are anticipated to solve problems once thought unsolvable, from predicting the course of chemical reactions to producing highly reliable weather forecasts. For now, however, they remain extremely sensitive to environmental disturbances and prone to information loss.

A new study from the lab of Dr. Yuval Ronen at the Weizmann Institute of Science, published in Nature, presents fresh evidence for the existence of non-Abelian anyons—exotic particles considered prime candidates for building a fault-tolerant quantum computer. This evidence was produced within bilayer graphene, an ultrathin carbon crystal with unusual electronic behavior.

In quantum mechanics, particles also behave like waves, and their properties are described by a wave function, which can represent the state of a single particle or a system of particles. Physicists classify particles according to how the wave function of two identical particles changes when they exchange places. Until the 1980s, only two types of particles were known: bosons (such as photons), whose wave function remains unchanged when they exchange places, and fermions (such as electrons), whose wave function changes sign.
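In symbols, the exchange rule reads as follows; the anyon case, included here for context, is the third possibility that emerged in the 1980s and underlies the result above:

```latex
\psi(x_2, x_1) =
\begin{cases}
  +\,\psi(x_1, x_2) & \text{bosons} \\
  -\,\psi(x_1, x_2) & \text{fermions} \\
  e^{i\theta}\,\psi(x_1, x_2) & \text{anyons (in two dimensions)}
\end{cases}
```

For non-Abelian anyons, the phase factor is replaced by a matrix acting on a degenerate set of states, so the outcome depends on the order in which exchanges are performed — the property that makes them candidates for fault-tolerant quantum computing.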

Scientists use string theory to crack the code of natural networks

For more than a century, scientists have wondered why physical structures like blood vessels, neurons, tree branches, and other biological networks look the way they do. The prevailing theory held that nature simply builds these systems as efficiently as possible, minimizing the amount of material needed. But when researchers tested these networks against traditional mathematical optimization theories, the predictions consistently fell short.

The problem, it turns out, was that scientists were thinking in one dimension when they should have been thinking in three. “We were treating these structures like wire diagrams,” Rensselaer Polytechnic Institute (RPI) physicist Xiangyi Meng, Ph.D., explains. “But they’re not thin wires, they’re three-dimensional physical objects with surfaces that must connect smoothly.”

This month, Meng and colleagues published a paper in the journal Nature showing that physical networks in living systems follow rules borrowed from an unlikely source: string theory, the exotic branch of physics that attempts to explain the fundamental structure of the universe.

Antiferromagnetic metal exhibits diode-like behavior without external magnetic field

Antiferromagnetic (AF) materials are made up of atoms or molecules whose spins align antiparallel to those of their neighbors. The magnetic moment of each individual atom or molecule is canceled out by the one next to it, producing zero net magnetization.
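A one-dimensional toy model makes the cancellation explicit. This is purely illustrative — real antiferromagnetic order is three-dimensional — but it shows how antiparallel neighbors sum to zero net magnetization:

```python
# Neighboring spins point in opposite directions: +1, -1, +1, -1, ...
spins = [(-1) ** i for i in range(10)]
print(spins)      # [1, -1, 1, -1, 1, -1, 1, -1, 1, -1]
print(sum(spins)) # 0  -> zero net magnetization
```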

Researchers in Japan have now discovered that an AF material, NdRu2Al10, exhibits a diode-like effect, meaning electrical current can flow in one direction but not the other (nonreciprocal), similar to a semiconductor p-n junction. Their research is published in Physical Review Letters.
