Axions that decay into photons could account for visible light that exceeds what’s expected to come from all known galaxies.

If you could switch off the Milky Way’s stars and gaze at the sky with a powerful telescope, you’d see the cosmic optical background (COB)—visible-wavelength light emitted by everything outside our Galaxy. Recent studies by the New Horizons spacecraft—which, after its Pluto flyby, has been looking further afield—have returned the most precise measurements of the COB yet, showing it to be brighter than expected by a factor of 2. José Bernal and his colleagues at Johns Hopkins University in Maryland propose that this excess could be caused by decaying dark matter particles called axions [1]. They say that their model could be falsified or supported by future observations.

Comparing COB measurements to predictions provides a tool for testing hypotheses about the structure of the Universe. But measuring the COB is very difficult because of contamination by diffuse light from much nearer sources, especially sunlight scattered by interplanetary dust. Observing from the edge of our Solar System, New Horizons should be unaffected by most of this contamination, so the excess brightness it measures is likely genuine, making it a tool for improving our understanding of galaxy evolution.

Materials that learn to change their shape in response to an external stimulus are a step closer to reality, thanks to a prototype system produced by engineers at UCLA.

Living entities constantly learn, adapting their behaviors to the environment so that they can thrive regardless of their surroundings. Inanimate materials typically don’t learn, except in science fiction movies. Now a team led by Jonathan Hopkins of the University of California, Los Angeles (UCLA), has demonstrated a so-called architected material that is capable of learning [1]. The material, which is made up of a network of beam-like components, learns to adapt its structure in response to a stimulus so that it can take on a specific shape. The team says that the material could act as a model system for future “intelligent” manufacturing.

The material developed by Hopkins and colleagues is a so-called mechanical neural network (MNN). Scientists think that, if produced on a commercial scale, these intelligent materials could revolutionize manufacturing in fields from building construction to fashion design. For example, an aircraft wing made from an MNN could learn to morph its shape in response to a change in wind conditions to maintain the aircraft's flying efficiency; a house made from an MNN could adjust its structure to maintain the building's integrity during an earthquake; and a shirt woven from an MNN could alter its pattern so that it fits a person of any size.
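
To make the analogy with artificial neural networks concrete, here is a purely illustrative numerical sketch, not the UCLA team's actual design or training procedure: a handful of beams with tunable stiffnesses couples input forces to output displacements, and "learning" means nudging those stiffnesses until the structure deflects into a desired shape under a given load. The geometry, coupling matrices, and simple random-search update below are all invented for this toy example.

```python
# Toy sketch of the "mechanical neural network" idea (illustrative only).
# Beams with adjustable stiffness connect input forces to output displacements;
# learning = tuning stiffnesses until the output matches a target shape.
import numpy as np

rng = np.random.default_rng(0)

n_beams = 6
# Made-up fixed geometry: how each beam couples 2 input forces to 2 output nodes.
A_in = rng.normal(size=(n_beams, 2))
A_out = rng.normal(size=(2, n_beams))

def response(stiffness, force):
    """Output displacement of the toy structure: softer beams deflect more."""
    compliance = 1.0 / stiffness      # deflection per unit load in each beam
    beam_loads = A_in @ force         # load carried by each beam
    return A_out @ (compliance * beam_loads)

# One load case and the shape we want the structure to take under it.
force = np.array([1.0, -0.5])
target = np.array([0.3, 0.8])

def error(stiffness):
    return float(np.sum((response(stiffness, force) - target) ** 2))

# Derivative-free "learning": try small random tweaks to the stiffnesses and
# keep any tweak that brings the deflected shape closer to the target.
stiffness = np.ones(n_beams)
best = error(stiffness)
for _ in range(5000):
    trial = np.clip(stiffness + 0.05 * rng.normal(size=n_beams), 0.1, 10.0)
    e = error(trial)
    if e < best:
        stiffness, best = trial, e

print("learned stiffnesses:", np.round(stiffness, 2))
print("achieved shape:", np.round(response(stiffness, force), 3), "target:", target)
```

The trial-and-error update is used here simply because the toy structure is treated as a black box whose output can be measured but not differentiated, which is the spirit in which a physical material, rather than a simulated one, would have to be tuned.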

Summary: Empathy is induced by synchronized neural oscillations in the right hemisphere of the brain, a new mouse study reveals.

Source: Institute for Basic Science.

A research team led by Dr. SHIN Hee-Sup at the Center for Cognition and Sociality (CCS) within the Institute for Basic Science (IBS) in Daejeon, South Korea, has discovered the underlying neural mechanism that allows us to feel empathy.

Physicist Julian Barbour discusses his newest book, “The Janus Point: A New Theory of Time.” In it, Barbour makes the radical argument that the growth of order drives the passage of time — and shapes the destiny of the universe.

Read “The Janus Point”: https://www.basicbooks.com/titles/julian-barbour/the-janus-point/9780465095469/
Julian Barbour’s Website: http://www.platonia.com/

Julian Barbour is a physicist with research interests in quantum gravity and the history of science. Since receiving his PhD degree on the foundations of Albert Einstein’s general theory of relativity at the University of Cologne in 1968, Barbour has supported himself and his family without an academic position, as an author and translator.

Watch more Closer To Truth interviews with Julian Barbour: https://bit.ly/3eIW96E

When we press our temples to soothe an aching head or rub an elbow after an unexpected blow, it often brings some relief. Pain-responsive cells in the brain appear to quiet down when they also receive touch inputs, say scientists at MIT's McGovern Institute for Brain Research, who for the first time have watched this phenomenon play out in the brains of mice.

The team’s discovery, reported Nov. 6 in the journal Science Advances, offers researchers a deeper understanding of the complicated relationship between pain and touch and could offer some insights into chronic pain in humans.

“We’re interested in this because it’s a common human experience,” says McGovern investigator Fan Wang. “When some part of your body hurts, you rub it, right? We know touch can alleviate pain in this way.” But, she says, the phenomenon has been very difficult for neuroscientists to study.