
Can space and time emerge from simple rules? Wolfram thinks so

Stephen Wolfram joins Brian Greene to explore the computational basis of space, time, general relativity, quantum mechanics, and reality itself.

This program is part of the Big Ideas series, supported by the John Templeton Foundation.

Participant: Stephen Wolfram.
Moderator: Brian Greene.

0:00:00 — Introduction.
01:23 — Unifying Fundamental Science with Advanced Mathematical Software.
13:21 — Is It Possible to Prove a System’s Computational Reducibility?
24:30 — Uncovering Einstein’s Equations Through Software Models.
37:00 — Is Connecting Space and Time a Mistake?
49:15 — Generating Quantum Mechanics Through a Mathematical Network.
01:06:40 — Can Graph Theory Create a Black Hole?
01:14:47 — The Computational Limits of Being an Observer.
01:25:54 — The Elusive Nature of Particles in Quantum Field Theory.
01:37:45 — Is Mass a Discoverable Concept Within Graph Space?
01:48:50 — The Mystery of the Number Three: Why Do We Have Three Spatial Dimensions?
01:59:15 — Unraveling the Mystery of Hawking Radiation.
02:10:15 — Could You Ever Imagine a Different Career Path?
02:16:45 — Credits.


How AI & Supercomputing Are Reshaping Aerospace & Finance w/ Allan Grosvenor (MSBAI)

An excellent podcast interview with Allan Grosvenor, covering:

How Allan built MSBAI to make supercomputing more accessible.

How AI-driven simulation is speeding up aircraft & spacecraft design.

Why AI is now making an impact in finance & algorithmic trading.

The next evolution of AI-powered decision-making & autonomous systems.


What if AI could power everything from rocket simulations to Wall Street trading? Allan Grosvenor, aerospace engineer and founder of MSBAI, has spent years developing AI-driven supercomputing solutions for space, aviation, defense, and even finance. In this episode, Brent Muller dives deep with Allan on how AI is revolutionizing engineering, the role of supercomputers in aerospace, and why automation is the key to unlocking faster innovation.

From spin glasses to quantum codes: Researchers develop optimal error correction algorithm

Scientists have developed an exact approach to a key quantum error correction problem once believed to be unsolvable, and have shown that what appeared to be hardware-related errors may in fact be due to suboptimal decoding.

The algorithm, called PLANAR, achieved a 25% reduction in logical error rates when applied to Google Quantum AI’s experimental data. This discovery revealed that a quarter of what the tech giant attributed to an “error floor” was actually caused by their decoding method, rather than genuine hardware limitations.

Quantum computers are extraordinarily sensitive to errors, making error correction essential for practical applications.
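The key point is that decoding is a classical inference step layered on top of the hardware, so a suboptimal decoder can inflate the apparent logical error rate. The Python toy below is only a hedged illustration of that effect using a 3-qubit repetition code; it is not the PLANAR algorithm, and every name and parameter in it is invented for the example.

```python
# Toy illustration (not PLANAR): for a 3-qubit repetition code, compare an
# optimal majority-vote decoder with a deliberately poor "trust the first
# qubit" decoder. The gap in logical error rates mimics how a bad decoding
# choice can masquerade as a hardware "error floor".
import random

def simulate(decoder, p, trials=100_000):
    """Return the logical error rate of `decoder` under bit-flip noise of strength p."""
    errors = 0
    for _ in range(trials):
        bits = [1 if random.random() < p else 0 for _ in range(3)]  # 1 = flipped
        if decoder(bits):            # decoder returns True if it mis-corrects
            errors += 1
    return errors / trials

def majority_decoder(bits):
    # Optimal for this code: fails only if 2 or more physical flips occur.
    return sum(bits) >= 2

def first_qubit_decoder(bits):
    # Suboptimal: ignores the redundancy and fails whenever qubit 0 flips.
    return bits[0] == 1

p = 0.05  # physical error rate (assumed for illustration)
print("majority decoder :", simulate(majority_decoder, p))    # roughly 0.007
print("first-qubit only :", simulate(first_qubit_decoder, p)) # roughly 0.05
```

At a 5% physical error rate the majority-vote decoder reaches a logical error rate near 0.7%, while the deliberately poor decoder stays pinned near 5%, a floor created entirely by the decoding choice rather than by the hardware.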

Light-based computing with optical fibers shows potential for ultra-fast AI systems

Imagine a computer that does not rely only on electronics but uses light to perform tasks faster and more efficiently. A collaboration between two research teams from Tampere University in Finland and Université Marie et Louis Pasteur in France has now demonstrated a novel way of processing information using light and optical fibers, opening up the possibility of building ultra-fast computers. The studies are published in Optics Letters and on the arXiv preprint server.

The research, performed by postdoctoral researchers Dr. Mathilde Hary from Tampere University and Dr. Andrei Ermolaev from the Université Marie et Louis Pasteur, Besançon, demonstrated how light propagating inside thin glass fibers can mimic the way artificial intelligence (AI) processes information. Their work investigated a particular class of computing architecture known as an Extreme Learning Machine, an approach inspired by neural networks.

“Instead of using conventional electronics and algorithms, computation is achieved by taking advantage of the nonlinear interaction between intense light pulses and the glass,” Hary and Ermolaev explain.
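For readers unfamiliar with the architecture named above: an Extreme Learning Machine keeps a large, fixed, random nonlinear layer and trains only a linear readout on its outputs. The sketch below is an assumed toy version in Python, in which a random matrix and a tanh nonlinearity stand in for the fiber; the regression task and all parameters are invented for illustration.

```python
# Minimal Extreme Learning Machine sketch. In the optical experiments the
# fixed random nonlinear layer is played by pulse propagation in the fiber;
# here it is simulated with a random projection and a tanh nonlinearity.
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: learn y = sin(3x) from noisy samples (illustrative only).
x = rng.uniform(-1, 1, size=(500, 1))
y = np.sin(3 * x) + 0.05 * rng.normal(size=x.shape)

# 1) Fixed random projection + nonlinearity (the "hidden layer", never trained).
n_hidden = 200
W_in = rng.normal(size=(1, n_hidden))
b = rng.normal(size=n_hidden)
H = np.tanh(x @ W_in + b)

# 2) Train only the linear readout, via regularized least squares.
ridge = 1e-3
W_out = np.linalg.solve(H.T @ H + ridge * np.eye(n_hidden), H.T @ y)

# Evaluate on fresh points.
x_test = np.linspace(-1, 1, 200).reshape(-1, 1)
y_pred = np.tanh(x_test @ W_in + b) @ W_out
print("test RMSE:", np.sqrt(np.mean((y_pred - np.sin(3 * x_test)) ** 2)))
```

The design point is that only the final least-squares step is trained, which is why a fixed physical system such as a nonlinear fiber can take over the expensive nonlinear part of the computation.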

Light Squeezed Out of Darkness in Surprising Quantum Simulation

A careful alignment of three powerful lasers could generate a mysterious fourth beam of light that is squeezed out of the very darkness itself.

What sounds like occult forces at work has been confirmed by a simulation of the kinds of quantum effects we might expect to emerge from a vacuum when ultra-high electromagnetic fields meet.

A team of researchers from the University of Oxford in the UK and the University of Lisbon in Portugal used a semi-classical equation solver to simulate quantum phenomena in real time and in three dimensions, testing predictions on what ought to occur when incredibly intense laser pulses combine in empty space.

Simulation reveals emergence of jet from binary neutron star merger followed by black hole formation

Binary neutron star mergers, cosmic collisions between two very dense stellar remnants made up predominantly of neutrons, have been the topic of numerous astrophysics studies due to their fascinating underlying physics and their possible cosmological outcomes. Most previous studies aimed at simulating and better understanding these events relied on computational methods designed to solve Einstein’s equations of general relativity under extreme conditions, such as those that would be present during neutron star mergers.

Researchers at the Max Planck Institute for Gravitational Physics (Albert Einstein Institute), Yukawa Institute for Theoretical Physics, Chiba University, and Toho University recently performed the longest simulation of binary neutron star mergers to date, utilizing a framework for modeling the interactions between magnetic fields, high-density matter and neutrinos, known as the neutrino-radiation magnetohydrodynamics (MHD) framework.

Their simulation, outlined in Physical Review Letters, reveals the emergence of a magnetically dominated jet from the merger remnant, followed by the collapse of the binary neutron star system into a black hole.

Information Processing via Human Soft Tissue: Soft Tissue Reservoir Computing

Physical reservoir computing refers to the concept of using nonlinear physical systems as computational resources to achieve complex information processing. This approach exploits the intrinsic properties of physical systems, such as their nonlinearity and memory, to perform computational tasks. Soft biological tissues possess characteristics such as stress-strain nonlinearity and viscoelasticity that satisfy the requirements of physical reservoir computing. This study evaluates the potential of human soft biological tissues as physical reservoirs for information processing. In particular, it determines the feasibility of using the inherent dynamics of human soft tissues as a physical reservoir to emulate nonlinear dynamic systems. In this concept, the deformation field within the muscle, obtained from ultrasound images, represents the state of the reservoir.

The findings indicate that the dynamics of human soft tissue have a positive impact on the computational task of emulating nonlinear dynamic systems. Specifically, our system outperformed a simple linear regression (LR) model for the task. Simple LR models based on raw inputs, which do not account for the dynamics of soft tissue, fail to emulate the target dynamical system (relative error on the order of $10^{-2}$). By contrast, the emulation results obtained using our system closely approximated the target dynamics (relative error on the order of $10^{-3}$). These results suggest that the soft tissue dynamics contribute to the successful emulation of the nonlinear equation. This study suggests that human soft tissues can be used as a potential computational resource. Soft tissues are found throughout the human body; therefore, if computational processing is delegated to biological tissues, it could lead to a distributed computation system for human-assisted devices.
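The core recipe, keeping a fixed nonlinear dynamical system and training only a linear readout on its state, can be sketched in a few lines. The Python toy below uses an assumed stand-in reservoir (a random leaky tanh network rather than ultrasound-imaged muscle deformation) and an invented benchmark task, purely to reproduce the qualitative comparison in the abstract between a reservoir readout and linear regression on raw inputs.

```python
# Simplified reservoir-computing comparison: a linear readout trained on the
# state of a fixed nonlinear dynamical system vs. linear regression on the raw
# input alone, for emulating a second-order nonlinear target system.
import numpy as np

rng = np.random.default_rng(1)
T = 2000
u = rng.uniform(0, 0.5, T)                      # random input sequence

# Target: nonlinear system with memory, driven by u (a common benchmark form).
y = np.zeros(T)
for t in range(1, T - 1):
    y[t + 1] = 0.4 * y[t] + 0.4 * y[t] * y[t - 1] + 0.6 * u[t] ** 3 + 0.1

# "Reservoir": fixed random recurrent network with leaky tanh dynamics.
N = 100
W = rng.normal(scale=0.1, size=(N, N))
W_in = rng.normal(scale=1.0, size=N)
x = np.zeros((T, N))
for t in range(T - 1):
    x[t + 1] = 0.7 * x[t] + 0.3 * np.tanh(W @ x[t] + W_in * u[t])

def ridge_fit(F, target, lam=1e-6):
    """Regularized least-squares readout weights."""
    return np.linalg.solve(F.T @ F + lam * np.eye(F.shape[1]), F.T @ target)

warm, split = 100, 1500
# (a) linear readout on reservoir states
w_res = ridge_fit(x[warm:split], y[warm:split])
err_res = np.mean((x[split:] @ w_res - y[split:]) ** 2)
# (b) plain linear regression on the raw input only
U = u.reshape(-1, 1)
w_raw = ridge_fit(U[warm:split], y[warm:split])
err_raw = np.mean((U[split:] @ w_raw - y[split:]) ** 2)
print(f"MSE, reservoir readout: {err_res:.2e}  vs raw-input LR: {err_raw:.2e}")
```

Because the raw-input model has neither memory nor nonlinearity, it cannot track the target's history dependence, while the reservoir readout can; this mirrors the gap between the two relative-error scales reported in the abstract, though the numbers here are from a toy system, not from tissue data.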

Algorithm streamlines vascular system design for 3D printed hearts

There are more than 100,000 people on organ transplant lists in the U.S., some of whom will wait years to receive one—and some may not survive the wait. Even with a good match, there is a chance that a person’s body will reject the organ. To shorten waiting periods and reduce the possibility of rejection, researchers in regenerative medicine are developing methods to use a patient’s own cells to fabricate personalized hearts, kidneys, livers, and other organs on demand.

Ensuring that oxygen and nutrients can reach every part of a newly grown organ is an ongoing challenge. Researchers at Stanford have created new tools to design and 3D print the incredibly complex vascular trees needed to carry blood throughout an organ. Their platform, published June 12 in Science, generates designs that resemble the vasculature we actually see in the human body, does so significantly faster than previous attempts, and can translate those designs into instructions for a 3D printer.

“The ability to scale up bioprinted tissues is currently limited by the ability to generate vasculature for them—you can’t scale up these tissues without providing a blood supply,” said Alison Marsden, the Douglas M. and Nola Leishman Professor of Cardiovascular Diseases, professor of pediatrics and of bioengineering at Stanford in the Schools of Engineering and Medicine and co-senior author on the paper. “We were able to make the algorithm for generating the vasculature run about 200 times faster than prior methods, and we can generate it for complex shapes, like organs.”
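To give a feel for what "generating a vascular tree" means algorithmically, here is a minimal, hypothetical Python sketch of recursive bifurcation with Murray's-law radii. The actual Stanford platform solves a much harder problem (space-filling trees that conform to organ geometry and remain printable), and none of the names or parameters below come from the paper.

```python
# Toy vascular-tree generator: each vessel bifurcates into two daughters whose
# radii follow Murray's law (r_parent^3 = r1^3 + r2^3), a classic rule for
# flow-efficient branching. Geometry, angles, and depth are assumed.
import math

def grow(x, y, angle, radius, length, depth, segments):
    """Recursively append (start, end, radius) tuples describing a 2-D tree."""
    if depth == 0 or radius < 0.05:
        return
    x2 = x + length * math.cos(angle)
    y2 = y + length * math.sin(angle)
    segments.append(((x, y), (x2, y2), radius))
    # Symmetric bifurcation: each daughter carries half the flow, so Murray's
    # law gives r_daughter = r_parent / 2**(1/3).
    r_child = radius / 2 ** (1 / 3)
    spread = 0.5  # branching half-angle in radians (assumed)
    grow(x2, y2, angle + spread, r_child, 0.8 * length, depth - 1, segments)
    grow(x2, y2, angle - spread, r_child, 0.8 * length, depth - 1, segments)

tree = []
grow(0.0, 0.0, math.pi / 2, radius=1.0, length=1.0, depth=8, segments=tree)
print(f"{len(tree)} vessel segments generated")
print(f"root radius 1.00 -> deepest radius {tree[-1][2]:.2f}")
```

Even this toy version shows why speed matters: the segment count grows exponentially with depth, so a realistic organ-scale tree quickly demands the kind of algorithmic acceleration the Stanford team reports.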