
Heninger and Johnson argue that chaos theory limits the capacity of even superintelligent systems to simulate and predict events within our universe, implying that Yudkowskian warnings may not accurately describe the nature of ASI risk. #aisafety #ai #tech


Completely eliminating the uncertainty would require making measurements with perfect precision, which does not seem to be possible in our universe. We can prove that fundamental sources of uncertainty make it impossible to know important things about the future, even with arbitrarily high intelligence. Atomic scale uncertainty, which is guaranteed to exist by Heisenberg’s Uncertainty Principle, can make macroscopic motion unpredictable in a surprisingly short amount of time. Superintelligence is not omniscience.

Chaos theory thus allows us to rigorously show that there are ceilings on particular abilities. If we can prove that a system is chaotic, we can conclude that it offers diminishing returns to intelligence: most predictions about the future of a chaotic system cannot be made reliably. Without the ability to make better predictions, and to plan on the basis of those predictions, intelligence becomes much less useful.
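The exponential amplification of tiny uncertainties can be seen in even the simplest chaotic system. As a minimal sketch (not from the article itself), the logistic map at r = 4 is a standard fully chaotic example: a perturbation of one part in a trillion, far coarser than atomic-scale uncertainty, grows until the two trajectories bear no relation to each other after a few dozen steps.

```python
# Sensitive dependence on initial conditions in the logistic map
# x_{n+1} = r * x_n * (1 - x_n), with r = 4 (the fully chaotic regime).
# The gap between two trajectories roughly doubles each step, so a
# 1e-12 initial difference saturates to order-1 within ~40 iterations.

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map, returning the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.400000000000)
b = logistic_trajectory(0.400000000001)  # initial difference: 1e-12

for n in (0, 10, 25, 50):
    print(f"step {n:2d}: |difference| = {abs(a[n] - b[n]):.3e}")
```

No intelligence, however great, recovers the lost digits: predicting step 50 to any accuracy requires knowing the initial condition to better precision than the universe allows us to measure.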

This does not mean that intelligence becomes useless, or that there is nothing about chaos which can be reliably predicted.
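One illustration of what remains predictable (my own sketch, not from the article): while individual chaotic trajectories diverge, their long-run statistics do not. The logistic map at r = 4 has a known invariant distribution (an arcsine law with mean 1/2), so time averages converge to the same value from any typical starting point.

```python
# Statistical properties of a chaotic system are reliably predictable
# even though individual trajectories are not. For the logistic map at
# r = 4, the long-run mean is 1/2 regardless of the starting point.

def time_average(x, r=4.0, burn_in=100, samples=100_000):
    """Long-run mean of a logistic-map trajectory after a burn-in."""
    for _ in range(burn_in):
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(samples):
        x = r * x * (1.0 - x)
        total += x
    return total / samples

# Two unrelated starting points give nearly identical time averages,
# even though their step-by-step trajectories are uncorrelated.
print(time_average(0.123))
print(time_average(0.777))
```

This is the sense in which intelligence retains value in a chaotic world: ensemble and statistical forecasts (climate rather than weather) stay within reach even when trajectory-level prediction does not.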
