
Why the Future of Intelligence Is Already Here | Alex Wissner-Gross | TEDxBoston

Intelligence is evolving rapidly with advances in AI and is poised to transform many aspects of life, work, and existence, with exponential growth and sweeping change expected in the near term.

## Questions to inspire discussion

Strategic Investment & Career Focus

🎯 Q: Which companies should I prioritize for investment or career opportunities in the AI era?

A: Focus on companies with the strongest AI models and on those advancing energy abundance, since these will have the largest marginal impact on the innermost loop of robots building fabs, chips, and AI data centers, and on how fast that loop can accelerate.

Understanding Market Dynamics

Physicists challenge a 200-year-old law of thermodynamics at the atomic scale

A long-standing law of thermodynamics turns out to have a loophole at the smallest scales. Researchers have shown that quantum engines made of correlated particles can exceed the traditional efficiency limit set by Carnot nearly 200 years ago. By tapping into quantum correlations, these engines can produce extra work beyond what heat alone allows. This could reshape how scientists design future nanoscale machines.

Two physicists at the University of Stuttgart have demonstrated that the Carnot principle, a foundational rule of thermodynamics, does not fully apply at the atomic scale when particles are physically linked (so-called correlated objects). Their findings suggest that this long-standing limit on efficiency breaks down for tiny systems governed by quantum effects. The work could help accelerate progress toward extremely small and energy-efficient quantum motors. The team published its mathematical proof in the journal Science Advances.

Traditional heat engines, such as internal combustion engines and steam turbines, operate by converting thermal energy (heat) into mechanical motion. Over the past several years, advances in quantum mechanics have allowed researchers to shrink heat engines to microscopic dimensions.
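For reference, the classical bound in question is the Carnot efficiency, which caps the fraction of absorbed heat that a cyclic engine can convert into work. The standard statement is shown below; the quantum-correlation correction derived in the paper is not reproduced here.

```latex
% Classical Carnot bound for a cyclic heat engine operating between a hot
% reservoir at absolute temperature T_h and a cold reservoir at T_c.
\eta \;=\; \frac{W_{\text{out}}}{Q_{\text{in}}}
\;\le\; \eta_{\text{Carnot}} \;=\; 1 - \frac{T_c}{T_h}
```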

AI Discovers Geophysical Turbulence Model

One of the biggest challenges in climate science and weather forecasting is predicting the effects of turbulence at spatial scales smaller than the resolution of atmospheric and oceanic models. Simplified sets of equations known as closure models can predict the statistics of this “subgrid” turbulence, but existing closure models are prone to dynamic instabilities or fail to account for rare, high-energy events. Now Karan Jakhar at the University of Chicago and his colleagues have applied an artificial-intelligence (AI) tool to data generated by numerical simulations to uncover an improved closure model [1]. The finding, which the researchers subsequently verified with a mathematical derivation, offers insights into the multiscale dynamics of atmospheric and oceanic turbulence. It also illustrates that AI-generated prediction models need not be “black boxes,” but can be transparent and understandable.

The team trained their AI—a so-called equation-discovery tool—on “ground-truth” data that they generated by performing computationally costly, high-resolution numerical simulations of several 2D turbulent flows. The AI selected the smallest number of mathematical functions (from a library of 930 possibilities) that, in combination, could reproduce the statistical properties of the dataset. Previously, researchers have used this approach to reproduce only the spatial structure of small-scale turbulent flows. The tool used by Jakhar and collaborators filtered for functions that correctly represented not only the structure but also energy transfer between spatial scales.
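As a rough illustration of how such an equation-discovery tool works, the sketch below uses iterative thresholded least squares (SINDy-style sparse regression) to pick out a handful of active terms from a candidate library. This is a minimal, generic sketch on synthetic data: the function name, the 12-term toy library, and the threshold are illustrative assumptions, not the 930-term library or training procedure used by Jakhar and colleagues.

```python
import numpy as np

def discover_closure(Theta, target, threshold=0.1, iters=10):
    """Sparse fit: keep only the few library terms needed to match the target.

    Theta  : (n_samples, n_terms) candidate functions evaluated on the data.
    target : (n_samples,) quantity the discovered model should reproduce.
    """
    # Initial least-squares fit over the full candidate library.
    coeffs = np.linalg.lstsq(Theta, target, rcond=None)[0]
    for _ in range(iters):
        small = np.abs(coeffs) < threshold   # terms too weak to keep
        coeffs[small] = 0.0
        big = ~small
        if big.any():
            # Refit using only the surviving terms.
            coeffs[big] = np.linalg.lstsq(Theta[:, big], target, rcond=None)[0]
    return coeffs

# Toy usage: recover a known 2-term model from a 12-term candidate library.
rng = np.random.default_rng(0)
Theta = rng.normal(size=(2000, 12))          # candidate functions on sampled data
true = np.zeros(12)
true[[2, 7]] = [1.5, -0.8]                   # only two terms are truly active
target = Theta @ true + 0.01 * rng.normal(size=2000)
print(np.round(discover_closure(Theta, target), 2))
```

The design choice mirrored here is sparsity: only the few library terms whose coefficients survive repeated pruning and refitting remain, which is what makes the discovered model readable rather than a black box.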

They tested the performance of the resulting closure model by applying it to a computationally practical, low-resolution version of the dataset. The model accurately captured the detailed flow structures and energy transfers that appeared in the high-resolution ground-truth data. It also predicted statistically rare conditions corresponding to extreme-weather events, which have challenged previous models.

Michael Levin: Novel Embodiments of Mind: Natural, Bioengineered, and Hybrid Interfaces

This is an invited talk in BAMΞ’s Mathematical Phenomenology Sprint.
Cf. https://bamxi.org/research-activities/mathematical-phenomenology-sprint/

Organizing Institutions:
Bamberg Mathematical Consciousness Science Initiative (BAMΞ): https://bamxi.org
& Association for Mathematical Consciousness Science (AMCS) https://amcs-community.org

Seeing the Quantum Butterfly Effect

A combined experimental and theoretical study reveals the emergence of quantum chaos in a complex system, suggesting that it can be described with a universal theoretical framework.

Consider the following thought experiment: Take all the air molecules in a thunderstorm and evolve them backward in time for an hour, effectively rewinding a molecular movie. Then slightly perturb the velocity directions of a few molecules and evolve the system forward again to the current moment. Because such systems are chaotic, microscopic perturbations in the past will lead to dramatically different futures. This “butterfly effect” also occurs in quantum systems. To observe it, researchers measure a mathematical entity called the out-of-time-ordered correlator (OTOC). Loosely speaking, the OTOC measures how quickly a system “forgets” its initial state. Unfortunately, the OTOC is notoriously difficult to measure because it typically requires experimental protocols that implement an effective many-body time reversal.
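For readers who want the formula, a standard form of the OTOC for two operators W and V is shown below; the specific operators and measurement protocol used in the study may differ.

```latex
% A standard out-of-time-ordered correlator (OTOC) for operators W and V,
% with W(t) evolved in the Heisenberg picture under Hamiltonian H.
C(t) = \big\langle\, [\hat{W}(t), \hat{V}]^{\dagger}\,[\hat{W}(t), \hat{V}] \,\big\rangle,
\qquad
\hat{W}(t) = e^{i\hat{H}t/\hbar}\,\hat{W}\,e^{-i\hat{H}t/\hbar}.
% The backward evolution hidden inside W(t) is why measuring C(t) typically
% requires an effective many-body time reversal.
```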

Leading AI models struggle to solve original math problems

Mathematics, like many other scientific endeavors, is increasingly using artificial intelligence. Of course, math is the backbone of AI, but mathematicians are also turning to these tools for tasks like literature searches and checking manuscripts for errors. But how well can AI perform when it comes to solving genuine, high-level research problems?

To date, there is still no widely accepted, realistic methodology for assessing AI's ability to solve mathematics at this level. So a group of mathematicians decided to put the machines to the test, as they detail in a study available on the arXiv preprint server.

Previous attempts at testing AI have used math contest problems and questions already found in textbooks. What makes this study different is that the questions the programs faced were drawn from mathematicians' own research. They had never been posted or published online, which means the AI could not have memorized the answers from its training data.

Seeing the whole from a part: Revealing hidden turbulent structures from limited observations and equations

The irregular, swirling motion of fluids we call turbulence can be found everywhere, from stirring in a teacup to currents in the planetary atmosphere. This phenomenon is governed by the Navier-Stokes equations—a set of mathematical equations that describe how fluids move.
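For reference, the standard incompressible form of these equations is shown below; the forcing and boundary conditions of any particular flow are additional ingredients.

```latex
% Incompressible Navier-Stokes equations for velocity field u(x, t),
% pressure p, constant density rho, and kinematic viscosity nu.
\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\,\mathbf{u}
  = -\frac{1}{\rho}\nabla p + \nu\,\nabla^{2}\mathbf{u} + \mathbf{f},
\qquad
\nabla\cdot\mathbf{u} = 0.
% f is an external body force (e.g., stirring or gravity); it is often omitted.
```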

Despite being known for nearly two centuries, these equations still pose major challenges when it comes to making predictions. Turbulent flows are inherently chaotic, and tiny uncertainties can grow quickly over time.

In real-world situations, scientists can only observe part of a turbulent flow, usually its largest and slowest-moving features. A long-standing question in fluid physics has thus been whether these partial observations are enough to reconstruct the full motion of the fluid.

Mathematics for Computer Science

This course covers elementary discrete mathematics for computer science and engineering. It emphasizes mathematical definitions and proofs as well as applicable methods. Topics include formal logic notation, proof methods; induction, well-ordering; sets, relations; elementary graph theory; integer congruences; asymptotic notation and growth of functions; permutations and combinations, counting principles; discrete probability. Further selected topics may also be covered, such as recursive definition and structural induction; state machines and invariants; recurrences; generating functions.
