

It’s said that the clock is always ticking, but there’s a chance that it isn’t. The theory of “presentism” states that the current moment is the only thing that’s real, while “eternalism” is the belief that all existence in time is equally real. Find out if the future is really out there and predictable—just don’t tell us who wins the big game next year.

This video is episode two of the series “Mysteries of Modern Physics: Time,” presented by Sean Carroll.

An interview with J. Storrs Hall, author of the epic book “Where is My Flying Car — A Memoir of Future Past”: “The book starts as an examination of the technical limitations of building flying cars and evolves into an investigation of the scientific, technological, and social roots of the economic…


J. Storrs Hall, or Josh, is an independent researcher and author.

He was the founding Chief Scientist of Nanorex, which is developing a CAD system for nanomechanical engineering.

Physicist Federico Faggin is none other than the inventor of both the microprocessor and silicon-gate technology, which spawned the explosive progress in computer technology we have witnessed over the past five decades. He is also probably the world’s most well-rounded living idealist. Mr. Faggin approaches idealism from both a deeply technical and a deeply personal, experiential perspective. In this interview, Essentia Foundation’s Natalia Vorontsova engages in an open, free-ranging, but very accessible conversation with Mr. Faggin.

Copyright © 2022 by Essentia Foundation. All rights reserved.
https://www.essentiafoundation.org.

Thumbnail inspiration image: Vecteezy.com

Last year, I hosted a thrilling conversation between @SabineHossenfelder, Carlo Rovelli, and Eric Weinstein as they debated quantum physics, consciousness, and the mystery of reality.

IAI Live is a monthly event featuring debates, talks, interviews, documentaries and music. LIVE.
Watch the original video

See the world’s leading thinkers debate the big questions for real, LIVE in London. Tickets: https://howthelightgetsin.org/

To discover more talks, debates, interviews and academies with the world’s leading speakers visit https://iai.tv/

We imagine physics is objective. But quantum physics found that the act of human observation changes the outcome of an experiment. Many scientists assume this central role of the observer is limited to just quantum physics. But is this an error? As Heisenberg puts it…

Advancements in deep learning have influenced a wide variety of scientific and industrial applications of artificial intelligence. These applications involve complex sequential data processing tasks, with common examples including natural language processing, conversational AI, time series analysis, and indirectly sequential formats such as pictures and graphs. Recurrent Neural Networks (RNNs) and Transformers are the most common methods; each has advantages and disadvantages. RNNs have a lower memory requirement, especially when dealing with lengthy sequences. However, they are hard to scale because of issues like the vanishing gradient problem and the non-parallelizability of training along the time dimension.
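As a sketch of the trade-off described above, here is a minimal vanilla-RNN forward pass in NumPy (sizes and names are invented for illustration, not any specific published architecture): the hidden state stays a fixed size no matter how long the sequence grows, but each time step depends on the previous one, so the loop cannot be parallelized.

```python
import numpy as np

# Minimal vanilla RNN (illustrative sketch). The state is O(hidden_size)
# regardless of sequence length, which is the low memory requirement noted
# above; but step t needs the hidden state from step t-1, so the time
# dimension is inherently sequential during training.
def rnn_forward(inputs, W_x, W_h, b):
    h = np.zeros(W_h.shape[0])
    for x in inputs:                       # sequential: no parallelism over t
        h = np.tanh(W_x @ x + W_h @ h + b)
    return h

rng = np.random.default_rng(0)
d_in, d_h, T = 4, 8, 16
inputs = rng.normal(size=(T, d_in))
h = rnn_forward(inputs,
                rng.normal(size=(d_h, d_in)),
                0.1 * rng.normal(size=(d_h, d_h)),
                np.zeros(d_h))
print(h.shape)  # (8,): state size is independent of T
```

Doubling T doubles the wall-clock loop length but leaves the memory footprint of `h` unchanged, which is exactly the RNN trade-off the paragraph describes.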

As an effective substitute, Transformers can handle both short- and long-term dependencies and enable parallelized training. In natural language processing, models like GPT-3, ChatGPT, LLaMA, and Chinchilla demonstrate the power of Transformers. However, the self-attention mechanism’s quadratic complexity makes it expensive in both compute and memory, and thus unsuitable for tasks with limited resources and lengthy sequences.
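The quadratic cost mentioned above comes from the T × T score matrix that self-attention materializes. A small NumPy sketch (simplified single-head attention, no masking, batching, or learned projections) makes this visible:

```python
import numpy as np

# Simplified single-head self-attention. The score matrix has shape (T, T),
# so memory and compute grow with the square of the sequence length T.
def self_attention(Q, K, V):
    scores = Q @ K.T / np.sqrt(Q.shape[-1])            # (T, T): quadratic term
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # row-wise softmax
    return weights @ V

T, d = 6, 4
rng = np.random.default_rng(1)
Q, K, V = (rng.normal(size=(T, d)) for _ in range(3))
out = self_attention(Q, K, V)
print(out.shape)  # (6, 4); the intermediate score matrix was (6, 6)
```

At T = 6 the score matrix is harmless, but at T = 100,000 tokens it would hold ten billion entries, which is the memory bottleneck the paragraph refers to.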

A group of researchers addressed these issues by introducing the Receptance Weighted Key Value (RWKV) model, which combines the best features of RNNs and Transformers while avoiding their major shortcomings. RWKV preserves the expressive qualities of the Transformer, such as parallelized training and robust scalability, while eliminating the memory bottleneck and quadratic scaling common to Transformers. It does this through efficient linear scaling.
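To illustrate how a linear-scaling recurrence can stand in for attention, the toy below maintains a running weighted sum of values and a running weight mass, so each step costs O(d) and no T × T matrix is ever formed. This is a generic linear-attention-style recurrence for illustration only, not the RWKV paper’s exact formulation (the decay constant and scalar keys here are simplifications).

```python
import numpy as np

# Toy linear-attention-style recurrence (illustrative, not exact RWKV).
# Instead of attending over all past tokens, keep a running numerator
# (weighted sum of values) and denominator (weight mass): O(d) per step.
def linear_recurrence(keys, values, decay=0.9):
    d = values.shape[-1]
    num = np.zeros(d)      # running weighted sum of past values
    den = 0.0              # running total weight
    outputs = []
    for k, v in zip(keys, values):
        w = np.exp(k)                  # positive weight from the (scalar) key
        num = decay * num + w * v      # older tokens fade via the decay
        den = decay * den + w
        outputs.append(num / den)
    return np.array(outputs)

T, d = 5, 3
rng = np.random.default_rng(2)
keys = rng.normal(size=T)
values = rng.normal(size=(T, d))
out = linear_recurrence(keys, values)
print(out.shape)  # (5, 3), with constant state instead of a (5, 5) matrix
```

Because the state carried between steps has fixed size, cost grows linearly in T, which is the scaling property the paragraph credits to RWKV.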

A central challenge for systems neuroscience and artificial intelligence is to understand how cognitive behaviors arise from large, highly interconnected networks of neurons. Digital simulation can link cognitive behavior to neural activity and bridge this gap in our understanding, but at great expense in time and electricity. A hybrid analog-digital approach, whereby slow analog circuits, operating in parallel, emulate the graded integration of synaptic currents by dendrites while a fast digital bus, operating serially, emulates the all-or-none transmission of action potentials by axons, may improve simulation efficiency. Due to the latter’s serial operation, this approach has not scaled beyond millions of synaptic connections (per bus). This limit was broken by following design principles the neocortex uses to minimize its wiring. The resulting hybrid analog-digital platform, Neurogrid, scales to billions of synaptic connections between up to a million neurons, and simulates cortical models in real time using a few watts of electricity. Here, we demonstrate that Neurogrid simulates cortical models spanning five levels of experimental investigation: biophysical, dendritic, neuronal, columnar, and area. Bridging these five levels with Neurogrid revealed a novel way active dendrites could mediate top-down attention.
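The division of labor described above, graded integration of synaptic currents versus all-or-none transmission of spikes, can be caricatured in software. The toy below is purely illustrative, with invented parameters, and is in no way Neurogrid’s implementation: each neuron leakily integrates its input current (the “analog” side), while spikes are delivered one step later through a shared event list standing in for the digital bus.

```python
import numpy as np

# Toy leaky integrate-and-fire network with a shared spike-event list.
# "Analog": per-neuron graded integration of currents. "Digital": spikes
# travel as discrete (time, neuron) address-events on a serial bus.
def simulate(weights, steps=100, dt=1.0, tau=10.0, threshold=1.0, i_ext=1.2):
    n = weights.shape[0]
    v = np.zeros(n)                 # membrane voltages (graded state)
    spike_bus = []                  # bus: list of (t, neuron) events
    for t in range(steps):
        i_syn = np.zeros(n)
        for t_ev, j in spike_bus:
            if t_ev == t - 1:       # deliver last step's spikes over the bus
                i_syn += weights[:, j]
        v += dt / tau * (-v + i_syn + i_ext)   # leaky integration of currents
        fired = v >= threshold
        for j in np.flatnonzero(fired):
            spike_bus.append((t, int(j)))      # all-or-none address-event
        v[fired] = 0.0                         # reset after spiking
    return spike_bus

rng = np.random.default_rng(3)
events = simulate(rng.uniform(0.0, 0.5, size=(5, 5)))
print(len(events))
```

The contrast with a fully digital simulator is that the graded voltage update is a cheap local operation done for every neuron in parallel in hardware, while only the sparse binary events need to cross the shared serial bus, which is the efficiency argument the abstract makes.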

K.B. and N.N.O. are co-founders and equity owners of Femtosense Inc.