
Year 2023


Konishi, S., Ishibashi, S., Shimizu, S. et al. Sci Rep 13, 11519 (2023). https://doi.org/10.1038/s41598-023-38522-x.


The precise nature of the engram, the physical substrate of memory, remains uncertain. Here, it is reported that RNA extracted from the central nervous system of Aplysia given long-term sensitization (LTS) training induced sensitization when injected into untrained animals; furthermore, the RNA-induced sensitization, like training-induced sensitization, required DNA methylation. In cellular experiments, treatment with RNA extracted from trained animals was found to increase excitability in sensory neurons, but not in motor neurons, dissociated from naïve animals. Thus, the behavioral, and a subset of the cellular, modifications characteristic of a form of nonassociative long-term memory (LTM) in Aplysia can be transferred by RNA. These results indicate that RNA is sufficient to generate an engram for LTS in Aplysia and are consistent with the hypothesis that RNA-induced epigenetic changes underlie memory storage in Aplysia.

Now this is the sort of application of AI that really intrigues me. Researchers have developed DolphinGemma, the first large language model (LLM) for understanding dolphin language. It could help us translate what these incredible creatures are saying, potentially far faster than the manual approaches used over the past several decades.

“The goal would be to one day speak Dolphin,” says Dr. Denise Herzing. Her research organization, The Wild Dolphin Project (WDP), exclusively studies a specific pod of free-ranging Atlantic spotted dolphins who reside off the coast of the Bahamas.

She’s been collecting and organizing dolphin sounds for the last 40 years, and has been working with Dr. Thad Starner, a research scientist from Google DeepMind, an AI subsidiary of the tech giant.
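To make the idea concrete, here is a toy sketch of the general principle behind modeling vocalizations with a language model. This is not DolphinGemma itself, whose data and architecture are not described here; the token names and tiny corpus are invented for illustration. The idea is to treat each recording as a sequence of discrete acoustic tokens and learn which token tends to follow which, the same next-token principle an LLM applies at scale.

```python
# Hypothetical sketch, NOT DolphinGemma: a toy bigram model over made-up
# "acoustic tokens", illustrating the next-token-prediction principle only.
import random
from collections import Counter, defaultdict

# Assume each recorded vocalization has already been converted into a
# sequence of discrete tokens (e.g. by clustering spectrogram frames).
corpus = [
    ["whistle_A", "click_burst", "whistle_B", "whistle_A"],
    ["whistle_A", "whistle_B", "whistle_B", "click_burst"],
    ["click_burst", "whistle_A", "whistle_B", "whistle_A"],
]

# Count how often each token follows each other token (a bigram model).
transitions = defaultdict(Counter)
for seq in corpus:
    for prev, nxt in zip(seq, seq[1:]):
        transitions[prev][nxt] += 1

def predict_next(token):
    """Sample a plausible next token given the current one."""
    counts = transitions[token]
    tokens, weights = zip(*counts.items())
    return random.choices(tokens, weights=weights)[0]

print(predict_next("whistle_A"))  # e.g. 'whistle_B' or 'click_burst'
```

A real system would replace the hand-built token list with learned audio representations and a far larger model, but the prediction task is the same shape.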

Year 2021


Communication between the brain and computers, known as a brain-computer interface (BCI), has been used in clinical trials to monitor epilepsy and other brain disorders. BCI has also shown promise as a technology that lets a user move a prosthesis by neural commands alone. Tapping into the basic BCI concept would make smartphones smarter than ever.

Research has zeroed in on retrofitting wireless earbuds to detect neural signals. The data would then be transmitted to a smartphone via Bluetooth. Software at the smartphone end would translate different brain wave patterns into commands. The emerging technology is called Ear EEG.
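As a rough illustration of what "translating brain wave patterns into commands" can mean in software, here is a minimal sketch. It is not the actual Ear EEG code; the sampling rate, frequency bands, and command names are assumptions. One common approach is to compare power in standard EEG bands (such as alpha, 8–12 Hz, versus beta, 13–30 Hz) and map the dominant band to an action.

```python
# Minimal sketch (assumed parameters, not a real Ear EEG implementation):
# classify a short EEG window by comparing alpha vs. beta band power.
import numpy as np

FS = 256  # assumed samples per second from the earbud sensor

def band_power(signal, low, high):
    """Average spectral power of `signal` between `low` and `high` Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / FS)
    power = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= low) & (freqs <= high)
    return power[mask].mean()

def classify(signal):
    """Map the dominant frequency band to a hypothetical phone command."""
    alpha = band_power(signal, 8, 12)
    beta = band_power(signal, 13, 30)
    return "pause_music" if alpha > beta else "skip_track"

# Fake one second of "EEG": a strong 10 Hz rhythm plus noise.
t = np.arange(FS) / FS
sample = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(FS)
print(classify(sample))  # expected: 'pause_music'
```

In a deployed system the classification would run in the smartphone app on data streamed over Bluetooth, with far more robust signal processing than this toy example.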

Rikky Muller, Assistant Professor of Electrical Engineering and Computer Science, has refined the physical comfort of EEG earbuds and has demonstrated their ability to detect and record brain activity. With support from the Bakar Fellowship Program, she is building out several applications to establish Ear EEG as a new platform technology to support consumer and health monitoring apps.