
Recent research suggests that a number of neuronal characteristics, traditionally believed to stem from the cell body or soma, may actually originate from processes in the dendrites. This discovery has significant implications for the study of degenerative diseases and for understanding the different states of brain activity during sleep and wakefulness.

The brain is an intricate network comprising billions of neurons. Each neuron’s cell body, or soma, engages in simultaneous communication with thousands of other neurons through its synapses. These synapses act as links, facilitating the exchange of information. Additionally, each neuron receives incoming signals through its dendritic trees, which are highly branched and extend over great distances, resembling a vast and complex tree-like network.

For the last 75 years, a core hypothesis of neuroscience has been that the basic computational element of the brain is the neuronal soma, and that the long, ramified dendritic trees are merely cables that allow the soma to collect incoming signals from its thousands of connected neurons. This long-standing hypothesis has now been called into question.
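
To make the classical picture concrete, below is a minimal sketch, not taken from the article, of the textbook soma-centric model: a leaky integrate-and-fire neuron in which all dendritic input is lumped into a single current summed at the soma. All parameter values are illustrative assumptions.

```python
import numpy as np

# Minimal leaky integrate-and-fire (LIF) neuron: the textbook
# soma-centric model in which dendrites merely deliver current.
# All parameters are illustrative, not taken from the article.
dt = 0.1          # time step (ms)
tau_m = 20.0      # membrane time constant (ms)
v_rest = -70.0    # resting potential (mV)
v_thresh = -54.0  # spike threshold (mV)
v_reset = -70.0   # reset potential after a spike (mV)
r_m = 10.0        # membrane resistance (MOhm)

rng = np.random.default_rng(0)
t = np.arange(0, 500, dt)                       # 500 ms of simulated time
i_in = 1.8 + 0.5 * rng.standard_normal(t.size)  # noisy input current (nA)

v = v_rest
spikes = []
for k, i_k in enumerate(i_in):
    # Passive summation: all "dendritic" input is lumped into i_k.
    v += dt / tau_m * (-(v - v_rest) + r_m * i_k)
    if v >= v_thresh:          # in this model, the soma alone decides when to fire
        spikes.append(t[k])
        v = v_reset

print(f"{len(spikes)} spikes in 500 ms")
```

In this model the dendrites do no computation of their own; the research described above suggests that assumption may be too simple.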

Time is one of those things that most of us take for granted. We spend our lives portioning it into work-time, family-time, and me-time. Rarely do we sit and think about how and why we choreograph our lives through this strange medium. A lot of people only appreciate time when they have an experience that makes them realize how limited it is.

My own interest in time grew from one of those “time is running out” experiences. Eighteen years ago, while at university, I was driving down a country lane when another vehicle strayed onto my side of the road and collided with my car. I can still vividly remember the way time slowed down, grinding to a near halt, in the moments before my car struck the oncoming vehicle. Time seemed to stand still. The elasticity of time, its ability to wax and wane in different situations, stood out like never before. From that moment I was hooked.

I have spent the last 15 years trying to answer questions such as: Why does time slow down in near-death situations? Does time really pass more quickly as you get older? How do our brains process time?

Finding ways to reduce the invasiveness of brain implants could greatly expand their potential applications. A new device tested in mice sits on the brain’s surface yet can still read activity deep within, which could lead to safer and more effective ways to record neural activity.

There are already a variety of technologies that allow us to peer into the inner workings of the brain, but they all come with limitations. Minimally invasive approaches include functional MRI, where an MRI scanner is used to image changes of blood flow in the brain, and EEG, where electrodes placed on the scalp are used to pick up the brain’s electrical signals.

The former, however, requires the patient to sit inside an MRI machine, and the latter is too imprecise for most applications. The gold-standard approach involves inserting electrodes deep into brain tissue to obtain the highest-quality readouts. But this requires a risky surgical procedure, and scarring and the inevitable shifting of the electrodes can degrade the signal over time.

Summary: The year 2023 witnessed groundbreaking discoveries in neuroscience, offering unprecedented insights into the human brain.

From animal-free brain organoids to the effects of optimism on cognitive skills, these top 10 articles have unveiled the mysteries of the mind.

Research revealed the risks of dietary trends like “dry scooping” and the impact of caffeine on brain plasticity. Additionally, the year showcased the potential of mushroom compounds for memory enhancement and the unexpected influence of virtual communication on brain activity.

In Neuromorphic Computing Part 2, we dive deeper into mapping neuromorphic concepts into chips built from silicon. Given the state of modern neuroscience and chip design, the tools we’re working with are simply too different from biology. Mike Davies, Senior Principal Engineer and Director of Intel’s Neuromorphic Computing Lab, explains the process and challenge of creating a chip that can replicate some of the form and functions of biological neural networks.

Mike’s leadership in this specialized field allows him to share the latest insights from the promising future of neuromorphic computing here at Intel. Let’s explore nature’s circuit design, refined over a billion years of evolution, and today’s CMOS semiconductor manufacturing technology, which supports incredible computing efficiency, speed, and intelligence.

Architecture All Access Season 2 is a master class technology series, featuring Senior Intel Technical Leaders taking an educational approach in explaining the historical impact and future innovations in their technical domains. Here at Intel, our mission is to create world-changing technology that improves the life of every person on earth. If you would like to learn more about AI, Wi-Fi, Ethernet and Neuromorphic Computing, subscribe and hit the bell to get instant notifications of new episodes.


Computer simulations of complex systems provide an opportunity to study their time evolution under user control. Simulations of neural circuits are an established tool in computational neuroscience. Through systematic simplification on spatial and temporal scales, they provide important insights into the time evolution of networks, which in turn leads to an improved understanding of brain functions like learning, memory, or behavior. Simulations of large networks exploit the concept of weak scaling, where the massively parallel biological network structure is naturally mapped onto computers with very large numbers of compute nodes. However, this approach suffers from fundamental limitations: power consumption is approaching prohibitive levels and, more seriously, the bridging of time-scales from milliseconds to years, present in the neurobiology of plasticity, learning, and development, is inaccessible to classical computers. In the keynote I will argue that these limitations can be overcome by extreme approaches to weak and strong scaling based on brain-inspired computing architectures.
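
The weak/strong scaling contrast in the abstract can be illustrated with the standard Amdahl and Gustafson formulas. The sketch below is not from the keynote; the 5% serial fraction is an assumed value chosen only to show the shapes of the two curves.

```python
# Illustrative strong- vs weak-scaling curves (standard Amdahl and
# Gustafson formulas; the serial fraction is a hypothetical value).

def amdahl_speedup(n_nodes: int, serial_fraction: float) -> float:
    """Strong scaling: fixed problem size, speedup capped by the serial part."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_nodes)

def gustafson_speedup(n_nodes: int, serial_fraction: float) -> float:
    """Weak scaling: the problem grows with the machine, e.g. more neurons
    simulated per extra compute node."""
    return n_nodes - serial_fraction * (n_nodes - 1)

s = 0.05  # assume 5% of the work is inherently serial
for n in (1, 16, 256, 4096):
    print(f"{n:5d} nodes: strong {amdahl_speedup(n, s):8.1f}x, "
          f"weak {gustafson_speedup(n, s):10.1f}x")
```

The strong-scaling speedup saturates near 1/s no matter how many nodes are added, which is one way to see why large-network simulations lean on weak scaling, and why bridging millisecond-to-year time-scales calls for a different, accelerated substrate.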

Bio: Karlheinz Meier received his PhD in physics in 1984 from Hamburg University in Germany. He has more than 25 years of experience in experimental particle physics, with contributions to 4 major experiments at particle colliders at DESY in Hamburg and CERN in Geneva. For the ATLAS experiment at the Large Hadron Collider (LHC) he led a 15-year effort to design, build, and operate an electronics data processing system providing on-the-fly data reduction by 3 orders of magnitude, enabling, among other achievements, the discovery of the Higgs Boson. Following scientific staff positions at DESY and CERN, he was appointed full professor of physics at Heidelberg University in 1992. In Heidelberg he co-founded the Kirchhoff-Institute for Physics and a laboratory for the development of microelectronic circuits for science experiments. In particle physics he took a leading international role in shaping the future of the field as president of the European Committee for Future Accelerators (ECFA). Around 2005 he gradually shifted his scientific interests towards large-scale electronic implementations of brain-inspired computer architectures. His group pioneered several innovations in the field, like the conception of a description language for neural circuits (PyNN), time-compressed mixed-signal neuromorphic computing systems, and wafer-scale integration for their implementation. He led 2 major European initiatives, FACETS and BrainScaleS, that both demonstrated the rewarding interdisciplinary collaboration of neuroscience and information science. In 2009 he was one of the initiators of the European Human Brain Project (HBP), which was approved in 2013. In the HBP he leads the subproject on neuromorphic computing with the goal of establishing brain-inspired computing paradigms as tools for neuroscience and generic methods for inference from large data volumes.

When young, these neurons signal fatty tissues to release energy that fuels the brain. With age, this line of communication breaks down: fat cells can no longer orchestrate their many roles, and neurons struggle to pass information along their networks.

Using genetic and chemical methods, the team found a marker for these neurons—a protein called Ppp1r17 (catchy, I know). Changing the protein’s behavior in aged mice with genetic engineering extended their life span by roughly seven percent. For an average 76-year life span in humans, the increase translates to over five years.
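
The conversion is simple proportional arithmetic; this snippet just reproduces the article’s numbers.

```python
# Reproducing the article's arithmetic: a ~7% life-span extension
# applied to an average 76-year human life span.
extension = 0.07
human_lifespan_years = 76
gain = extension * human_lifespan_years
print(f"{gain:.1f} extra years")  # ~5.3, i.e. "over five years"
```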

The treatment also improved the mice’s health. Mice love to run, but their vigor plummets with age. Reactivating the neurons in elderly mice revived their motivation, transforming them from couch potatoes into impressive joggers.

In the fifth decade of life, our brains start to undergo a radical “rewiring” that results in diverse networks becoming more integrated over the ensuing decades. (Big Think: https://bigthink.com/neuropsych/great-brain-rewiring-after-age-40/)

In a systematic review published last year in the journal Psychophysiology, researchers from Monash University in Australia swept through the scientific literature, seeking to summarize how the connectivity of the human brain changes over our lifetimes. The gathered evidence suggests that in the fifth decade of life (that is, after a person turns 40), the brain starts to undergo a radical “rewiring” that results in diverse networks becoming more integrated and connected over the ensuing decades, with accompanying effects on cognition.

Since the turn of the century, neuroscientists have increasingly viewed the brain as a complex network, consisting of units broken down into regions, sub-regions, and individual neurons. These units are connected structurally, functionally, or both. With increasingly advanced scanning techniques, neuroscientists can observe the parts of subjects’ brains that “light up” in response to stimuli or when simply at rest, providing a superficial look at how our brains are synced up.

The Monash University team pored over 144 studies that used these imaging techniques to probe the brains of tens of thousands of subjects. From this analysis, the researchers gleaned a general trend in how the networked brain changes over our lifetimes.
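
As a rough illustration of what “integration” means in this network view, here is a toy sketch, not taken from the review: it scores a small synthetic brain graph by the fraction of connections that cross network boundaries. The network labels, densities, and metric are all assumptions for demonstration; real studies use far richer measures on imaging-derived connectomes.

```python
import numpy as np

# Toy illustration (not from the Monash review): quantify "integration"
# as the share of between-network connections in a small brain graph.
rng = np.random.default_rng(1)

n_regions = 30
labels = np.repeat([0, 1, 2], 10)  # three hypothetical networks, 10 regions each

def random_connectome(p_within: float, p_between: float) -> np.ndarray:
    """Random symmetric adjacency matrix with separate within/between densities."""
    same = labels[:, None] == labels[None, :]
    p = np.where(same, p_within, p_between)
    upper = np.triu(rng.random((n_regions, n_regions)) < p, k=1)
    return (upper | upper.T).astype(float)

def integration(adj: np.ndarray) -> float:
    """Fraction of existing edges that cross network boundaries."""
    between = labels[:, None] != labels[None, :]
    return adj[between].sum() / adj.sum()

young = random_connectome(p_within=0.6, p_between=0.1)  # segregated
old   = random_connectome(p_within=0.4, p_between=0.3)  # more integrated

print(f"integration, 'young' graph: {integration(young):.2f}")
print(f"integration, 'old' graph:   {integration(old):.2f}")
```

The reported trend corresponds to the second kind of graph becoming more typical with age: fewer edges stay confined within specialized networks, and more span between them.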

Research indicates enhanced mental function in individuals who maintain an active lifestyle, engage in social interaction, and effectively manage blood pressure and diabetes.

As federal approval for more Alzheimer’s disease medications progresses, a recent study conducted by UC San Francisco and Kaiser Permanente Washington reveals that tailored health and lifestyle modifications can postpone or prevent memory deterioration in older adults at increased risk.

The two-year study compared cognitive scores, risk factors, and quality of life among 172 participants, half of whom had received personalized coaching to improve their health and lifestyle in areas believed to raise the risk of Alzheimer’s, such as uncontrolled diabetes and physical inactivity. The coached participants showed a modest boost in cognitive testing, a 74% greater improvement than the non-intervention group.
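
Note that “74% greater improvement” is a relative comparison of score changes, not a 74-point gain. The snippet below illustrates the arithmetic with purely hypothetical score changes, since the study’s raw numbers are not given here.

```python
# Hypothetical illustration of a "74% greater improvement": the study's
# actual raw scores are not reported here, only the relative comparison.
coached_change = 0.174   # hypothetical change in a composite cognitive score
control_change = 0.100   # hypothetical non-intervention change
relative = (coached_change - control_change) / control_change
print(f"{relative:.0%} greater improvement than control")  # 74%
```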

Cognitive neuroscientist Clayton Curtis describes an elegant experiment that leads us to ask: Does the brain honor the distinction implied in most textbooks between spatial attention, motor control, and spatial working memory?

For more info/content, please visit: https://postlab.psych.wisc.edu/cog-ne

Relevant paper:
Jerde, T. A., Merriam, E. P., Riggall, A. C., Hedges, J. H., & Curtis, C. E. (2012). Prioritized maps of space in human frontoparietal cortex. Journal of Neuroscience, 32(48), 17382–17390.