
A fundamental goal in sensory neuroscience is to understand the mechanisms underlying the neural code for natural visual scenes. A related, still unresolved question is how the interactions of many cell types shape neural circuits in natural settings. The eye has evolved a wide range of interneurons to encode natural visual scenes, and these cells are crucial for transmitting visual information to the brain.

Our understanding of how the retina functions is largely based on research into how it reacts to artificial stimuli such as flashing lights and noise, which may not accurately represent how the retina processes real visual scenes. Although such methods have revealed a variety of computations, how the retina’s more than 50 different types of interneurons contribute to this processing is still not fully understood. In a recent research paper, a group of researchers made a significant advance by showing that a three-layer network model can predict retinal responses to natural scenes with remarkable precision, nearly reaching the limits set by the experimental data themselves. Because they wanted to understand how the brain processes natural visual scenes, the researchers focused on the retina, the part of the eye that sends visual signals to the brain.

One of this model’s key characteristics is its interpretability, i.e., the ability to examine and make sense of its internal organization. The responses of interneuron units inside the model correlate strongly with interneuron responses recorded separately, which suggests that the model captures significant aspects of retinal interneuron activity. When trained only on natural scenes, the model successfully reproduces a wide range of motion-analysis, adaptation, and predictive-coding phenomena. Models trained on white noise, by contrast, cannot reproduce the same phenomena, supporting the idea that studying natural scenes is necessary to understand natural visual processing.
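To make the idea of a “three-layer network model” concrete, here is a minimal sketch in PyTorch of the kind of architecture described above: a short movie of natural scenes goes in, and predicted firing rates for a population of ganglion cells come out. The layer count follows the article, but the channel numbers, kernel sizes, Softplus nonlinearities, Poisson training objective, and the RetinaCNN name are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn

class RetinaCNN(nn.Module):
    """Sketch of a three-layer model: stimulus movie in, firing rates out."""

    def __init__(self, n_cells: int, n_frames: int = 40):
        super().__init__()
        # Layer 1: spatiotemporal filters over a short stimulus history,
        # loosely analogous to bipolar-cell receptive fields.
        self.layer1 = nn.Sequential(nn.Conv2d(n_frames, 8, kernel_size=15),
                                    nn.Softplus())
        # Layer 2: a second convolutional stage, loosely analogous to
        # amacrine-cell interactions.
        self.layer2 = nn.Sequential(nn.Conv2d(8, 8, kernel_size=11),
                                    nn.Softplus())
        # Layer 3: fully connected readout, one non-negative rate per ganglion cell.
        self.readout = nn.Sequential(nn.Flatten(),
                                     nn.LazyLinear(n_cells),
                                     nn.Softplus())

    def forward(self, stimulus: torch.Tensor) -> torch.Tensor:
        # stimulus: (batch, n_frames, height, width) grayscale movie clips
        return self.readout(self.layer2(self.layer1(stimulus)))

model = RetinaCNN(n_cells=10)
clips = torch.randn(4, 40, 50, 50)   # stand-in for natural-scene movie clips
rates = model(clips)                 # predicted firing rates, shape (4, 10)
# Training would minimize a Poisson negative log-likelihood against recorded
# spike counts, e.g. nn.PoissonNLLLoss(log_input=False)(rates, spike_counts).
```

In models of this kind, the hidden units of the intermediate layers are what can be compared against separately recorded interneuron responses, which is what makes the architecture interpretable.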

In the ’80s, the CIA investigated the “Gateway Experience” technique to alter consciousness and ultimately escape spacetime. Here is everything you need to know.


She turned to me the other morning and said, “You heard of The Gateway?” It didn’t register in the moment. She continued, “It’s blowing up on TikTok.” Later on, she elaborated: It was not in fact the ill-fated ’90s computer hardware company folks were freaking out about. No, they’ve gone further back in time, to find a true treasure of functional media.

The intrigue revolves around a classified 1983 CIA report on a technique called the Gateway Experience, which is a training system designed to focus brainwave output to alter consciousness and ultimately escape the restrictions of time and space. The CIA was interested in all sorts of psychic research at the time, including the theory and applications of remote viewing, which is when someone views real events with only the power of their mind. The documents have since been declassified and are available to view.

An ancient relative of modern seals—known as Potamotherium valletoni—that had an otter-like appearance and lived over 23 million years ago likely used its whiskers to forage for food and explore underwater environments, according to a new study in Communications Biology. The findings provide further insight into how ancient seals transitioned from life on land to life underwater.

Although modern seals live in aquatic environments and use their whiskers to locate food by sensing vibrations in the water, ancient seal relatives mostly lived on land or in freshwater environments. Some species used their forelimbs to explore their surroundings. Prior to this study, it was unclear when seals and their relatives began using their whiskers to forage.

Alexandra van der Geer and colleagues investigated the evolution of whisker-foraging behaviors in seals by comparing the brain structures of Potamotherium with those of six extinct and 31 living meat-eating mammals, including mustelids, bears, and seal relatives. Brain structures were inferred from casts taken from the inside of skulls.

A recent study used special eye-tracking technology to investigate how people look at each other’s eyes and faces during conversations. The researchers, who published their results in Scientific Reports, found that people who exhibited more direct eye-to-eye contact during their conversation tended to also be better at following the direction of another’s gaze (they were better at understanding where the other person was looking). The research provides unique insights into non-verbal communication.

Much of human social communication occurs nonverbally, and eye contact plays a crucial role in allowing individuals to convey and interpret information such as attention, mental states, intentions, and emotions. Eye contact is not only passively received but also reciprocated through mutual looks.

The researchers wanted to examine the frequency and types of mutual looking behaviors, such as direct eye-to-eye contact and other gaze interactions involving different parts of the face. They were also interested in understanding how the mutual looking behaviors observed during interactions might influence subsequent gaze-following behavior.

Neuroscientists today report the first results from experimental tests designed to explore the idea that “forgetting” might not be a bad thing and may instead represent a form of learning, and those results support their core idea.

Last year the neuroscientists behind the new theory suggested that changes in our ability to access specific memories are based on environmental feedback and predictability, and that forgetting, rather than being a bug, may be a functional feature of the brain, allowing it to interact dynamically with a changing environment.

In a changing world like the one we and many other organisms live in, forgetting some memories would be beneficial, they reasoned, as this can lead to more flexible behavior and better decision-making. If memories were gained in circumstances that are not wholly relevant to the current environment, forgetting them could be a positive change that improves our well-being.

The research, published in Proceedings of the National Academy of Sciences, used a genetic approach to fix deafness in mice with a defective Spns2 gene, restoring their hearing abilities in low and middle frequency ranges. Researchers say this proof-of-concept study suggests that hearing impairment resulting from reduced gene activity may be reversible.

Over half of adults in their 70s experience significant hearing loss. Impaired hearing is associated with an increased likelihood of experiencing depression and cognitive decline, as well as being a major predictor of dementia. While hearing aids and cochlear implants may be useful, they do not restore normal hearing function, and neither do they halt disease progression in the ear. There is a significant unmet need for medical approaches that slow down or reverse hearing loss.


New research from the Institute of Psychiatry, Psychology & Neuroscience (IoPPN) at King’s College London has successfully reversed hearing loss in mice.

This proof-of-concept study suggests that gene therapy for this type of hearing loss in humans may be successful in the future.

Brain-computer interfaces are devices that allow for direct communication between the brain and external devices, such as computers or prosthetics. As significant investments flow into R&D, cutting-edge companies are gearing up for human trials. These trials aim to showcase and fine-tune the potential of these interfaces to treat conditions such as Parkinson’s disease, epilepsy and depression.

While these technologies’ immediate use is for treating medical conditions, they also have the potential to give users access to vast amounts of information at unprecedented speeds. As it stands today, the field aims not only to aid recovery but also to enhance existing cognitive functions. These goals introduce various ethical and…


Can cutting-edge technology transform the way humans learn, remember and evolve?

Older adults with more severe behavioral symptoms, including agitation, aggression, and disinhibition, are more likely to become divorced than those with less severe symptoms. However, more advanced stages of dementia are associated with a lower likelihood of divorce. These are some of the conclusions of a new study published August 16 in the open-access journal PLOS ONE by Joan Monin of the Yale School of Public Health and colleagues.

In recent years, divorce has been on the rise among older adults. Moreover, dementia can be difficult for married couples for many reasons, including the introduction of caregiving burden, loss of intimacy, and financial strain.

In a new study, researchers analyzed data from 37 NIA/NIH Alzheimer’s Disease Research Centers (ADRCs) across the US. The final study included 263 married or living-as-married participants who were divorced or separated during their follow-up period at an ADRC, as well as 1,238 age-matched controls.