The search for the chemical origins of life is a long-standing and continuously debated enigma. Despite its exceptional complexity, the field has experienced a revival in recent decades, owing in part to the exponential growth in computing power, which now allows the behavior of matter, including its quantum nature, to be simulated efficiently under the disparate conditions found, for example, on the primordial Earth and on Earth-like planetary systems (i.e., exoplanets). In this minireview, we focus on advanced computational methods, such as ab initio molecular dynamics, that efficiently solve the Schrödinger equation at different levels of approximation (e.g., density functional theory) and that can realistically simulate the behavior of matter under the action of the energy sources available in prebiotic contexts.
Whether this “complements or contradicts existing religious value systems depends largely on the interpretation of those systems by the people who have adopted them,” said Frank. “However, my interviews with astronauts of faith suggest that their religious perspective was strengthened, rather than being weakened.”
Frank notes that his cosmology has parallels with Yuval Harari's “dataism,” described by Harari as the “most interesting emerging religion.” Dataism, as defined by Harari, “says that the universe consists of data flows, and the value of any phenomenon or entity is determined by its contribution to data processing.” This may sound kind of cold and metallic, but if life is an algorithm and self-awareness is data processing, the parallels with Frank's ideas are evident.
At the MVA webinar there wasn't time to address all the points I wanted to discuss with Frank. So I had this new conversation with him. Please find below my comments and questions (all related to space philosophy, cosmic metaphysics, and religion), and listen to the conversation for Frank's thoughtful replies and the other points that came up.
Every time a person dies, writes Russian novelist Vasily Grossman in Life and Fate, the entire world that has been built in that individual’s consciousness dies as well: “The stars have disappeared from the night sky; the Milky Way has vanished; the sun has gone out… flowers have lost their color and fragrance; bread has vanished; water has vanished.” Elsewhere in the book, he writes that one day we may engineer a machine that can have human-like experiences; but if we do, it will have to be enormous—so vast is this space of consciousness, even within the most “average, inconspicuous human being.”
And, he adds, “Fascism annihilated tens of millions of people.” Trying to think those two thoughts together is a near-impossible feat, even for the immense capacities of our consciousness. But will machine minds ever acquire anything like our ability to have such thoughts, in all their seriousness and depth? Or to reflect morally on events, or to equal our artistic and imaginative reach? Some think that this question distracts us from a more urgent one: we should be asking what our close relationship with our machines is doing to us.
Jaron Lanier, himself a pioneer of computer technology, warns in You Are Not a Gadget that we are allowing ourselves to become ever more algorithmic and quantifiable, because this makes us easier for computers to deal with. Education, for example, becomes less about the unfolding of humanity, which cannot be measured in units, and more about tick boxes.
The data centers that help train ChatGPT-like AI are very ‘thirsty,’ finds a new study.
A new study has uncovered how much water is consumed when training large AI models like OpenAI's ChatGPT and Google's Bard. The estimates of AI water consumption were presented by researchers from the University of California, Riverside and the University of Texas at Arlington in a pre-print article titled “Making AI Less ‘Thirsty.’”
Of course, the water used to cool these data centers doesn't just appear out of thin air; it is usually drawn from watercourses such as rivers. The researchers distinguish between water “withdrawal” and “consumption” when estimating AI's water usage.
Withdrawal involves physically removing water from a river, lake, or other source, whereas consumption refers mainly to water lost to evaporation when it is used in data centers. The consumption component, where the study notes the water “cannot be recycled,” is where most of the study's analysis of AI's water use is concentrated.
In 1918, the American chemist Irving Langmuir published a paper examining the behavior of gas molecules sticking to a solid surface. Guided by the results of careful experiments, as well as his theory that solids offer discrete sites for the gas molecules to fill, he worked out a series of equations that describe how much gas will stick, given the pressure.
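To make the shape of that relationship concrete, here is a minimal sketch of the standard Langmuir isotherm, in which the fraction of occupied surface sites is theta = K*P / (1 + K*P); the equilibrium constant used below is purely illustrative, not a value from Langmuir's paper or from the AI study described next.

```python
def langmuir_coverage(pressure, k_eq):
    """Fractional surface coverage for the Langmuir isotherm:
    theta = K*P / (1 + K*P), where K is the adsorption equilibrium
    constant and P is the gas pressure."""
    return k_eq * pressure / (1.0 + k_eq * pressure)

if __name__ == "__main__":
    k_eq = 0.5  # illustrative equilibrium constant (inverse pressure units)
    for p in (0.1, 1.0, 10.0, 100.0):
        print(f"P = {p:6.1f}  ->  coverage = {langmuir_coverage(p, k_eq):.3f}")
```

At low pressure the coverage grows roughly linearly with pressure, and at high pressure it saturates toward full occupation of the discrete sites, which is exactly the behavior Langmuir's site-filling picture predicts.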
Now, about a hundred years later, an “AI scientist” developed by researchers at IBM Research, Samsung AI, and the University of Maryland, Baltimore County (UMBC) has reproduced a key part of Langmuir's Nobel Prize-winning work. The system—an artificial intelligence (AI) functioning as a scientist—also rediscovered Kepler's third law of planetary motion, which gives the time it takes one space object to orbit another from the distance separating them, and produced a good approximation of Einstein's relativistic time-dilation law, which shows that time slows down for fast-moving objects.
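The paper reports the symbolic expressions the system itself recovered; the snippet below simply writes down the textbook forms it was approximating, Kepler's third law for orbits around the Sun (period in years squared equals semi-major axis in astronomical units cubed) and special-relativistic time dilation, with illustrative numbers.

```python
import math

def orbital_period_years(a_au):
    """Kepler's third law for a body orbiting the Sun: with the semi-major
    axis a in astronomical units, the period in years is a**1.5."""
    return a_au ** 1.5

def dilated_time(proper_time, speed_fraction_of_c):
    """Special-relativistic time dilation: a moving clock's interval is
    stretched by the Lorentz factor 1 / sqrt(1 - (v/c)**2)."""
    return proper_time / math.sqrt(1.0 - speed_fraction_of_c ** 2)

if __name__ == "__main__":
    print(f"Mars (a ≈ 1.52 AU): T ≈ {orbital_period_years(1.52):.2f} years")
    print(f"1 s of proper time at 0.9c is observed as {dilated_time(1.0, 0.9):.2f} s")
```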
A paper describing the results was published in Nature Communications on April 12.
As quantum advantage has been demonstrated on different quantum computing platforms using Gaussian boson sampling,1–3 quantum computing is moving to the next stage, namely demonstrating quantum advantage in solving practical problems. Two typical problems of this kind are computer-aided material design and drug discovery, in which quantum chemistry plays a critical role in answering questions such as “Which one is the best?”. Many recent efforts have been devoted to developing advanced quantum algorithms for solving quantum chemistry problems on noisy intermediate-scale quantum (NISQ) devices,2,4–14 while implementing these algorithms for complex problems is limited by the available qubit counts, coherence times and gate fidelities. Specifically, without error correction, quantum simulations of quantum chemistry are viable only if low-depth quantum algorithms are used to suppress the total error rate. Recent advances in error mitigation techniques make it possible to model many-electron problems with a dozen qubits and circuit depths of a few tens of gates on NISQ devices,9 but such circuit sizes and depths are still a long way from practical applications.
The gap between the quantum resources that are available and those actually required in practical quantum simulations has renewed interest in divide-and-conquer (DC) based methods.15–19 Realistic material and (bio)chemical systems often involve complex environments, such as surfaces and interfaces. For these systems, the Schrödinger equation is much too complicated to be solved exactly, so it becomes desirable to develop approximate practical methods of applying quantum mechanics.20 One popular scheme is to divide the complex problem under consideration into as many parts as possible until these become simple enough to be solved adequately, namely the philosophy of DC.21 The DC approach is particularly suitable for NISQ devices since the sub-problem for each part can in principle be solved with fewer computational resources.15–18,22–25 One successful application of DC is the estimation of the ground-state potential energy surface of a ring of 10 hydrogen atoms using density matrix embedding theory (DMET) on a trapped-ion quantum computer, in which a 20-qubit problem is decomposed into ten 2-qubit problems.18
DC often treats all subsystems at the same computational level and estimates physical observables by summing the corresponding quantities over subsystems, whereas in practical simulations of complex systems the particle–particle interactions within and between subsystems may have completely different characteristics. Long-range Coulomb interactions can be well approximated as quasiclassical electrostatic interactions, so empirical methods, such as empirical force field (EFF) approaches,26 are adequate for describing them. As the distance between particles decreases, the repulsive exchange interactions between electrons of the same spin become important, so that quantum mean-field approaches, such as Hartree–Fock (HF), are necessary to characterize these electronic interactions.
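As a purely schematic illustration of this multi-level DC bookkeeping, the sketch below partitions a toy system into fragments, takes each fragment's internal energy from a placeholder “high-level” solver, and adds the inter-fragment interactions as a quasiclassical Coulomb sum. The fragment energies, point charges, and units are invented for illustration; a real DC or DMET calculation on a quantum device would replace the placeholder with an actual embedded quantum solver.

```python
import itertools
import math

def fragment_energy(fragment):
    """Stand-in for an expensive high-level (e.g., quantum) solver applied
    to one small fragment; here it just returns a stored toy value."""
    return fragment["energy"]

def interfragment_coulomb(frag_a, frag_b):
    """Quasiclassical electrostatics between the point charges of two
    fragments, standing in for a cheap treatment of long-range terms."""
    energy = 0.0
    for q_a, pos_a in frag_a["charges"]:
        for q_b, pos_b in frag_b["charges"]:
            energy += q_a * q_b / math.dist(pos_a, pos_b)
    return energy

def dc_total_energy(fragments):
    """DC-style estimate: intra-fragment energies summed at the high level
    plus pairwise inter-fragment interactions at the low level."""
    intra = sum(fragment_energy(f) for f in fragments)
    inter = sum(interfragment_coulomb(a, b)
                for a, b in itertools.combinations(fragments, 2))
    return intra + inter

# Two toy fragments with invented energies and point charges (arbitrary units).
fragments = [
    {"energy": -1.10, "charges": [(+0.2, (0.0, 0.0, 0.0)), (-0.2, (0.0, 0.0, 1.0))]},
    {"energy": -1.05, "charges": [(+0.2, (5.0, 0.0, 0.0)), (-0.2, (5.0, 0.0, 1.0))]},
]
print(f"DC total energy estimate: {dc_total_energy(fragments):.4f}")
```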
Space travel, exploration, and observation involve some of the most complex and dangerous scientific and technical operations ever carried out. This means the field tends to throw up the kinds of problems that artificial intelligence (AI) is proving itself to be outstandingly helpful with.
Because of this, astronauts, scientists, and others whose job it is to chart and explore the final frontier are increasingly turning to machine learning (ML) to tackle the everyday and extraordinary challenges they face.
AI is revolutionizing space exploration, from autonomous spaceflight to planetary exploration and charting the cosmos. ML algorithms help astronauts and scientists navigate and study space, avoid hazards, and classify features of celestial bodies.
Quantum computing promises to be a revolutionary tool, making short work of problems that classical computers would struggle to ever complete. Yet the workhorse of the quantum device, known as a qubit, is a delicate object prone to collapsing.
Keeping enough qubits in their ideal state long enough for computations has so far proved a challenge.
In a new experiment, scientists were able to keep a qubit in that state for twice as long as normal. Along the way, they demonstrated the practicality of quantum error correction (QEC), a process that keeps quantum information intact for longer by introducing room for redundancy and error removal.
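The redundancy idea behind QEC can be caricatured with a classical simulation of the three-qubit bit-flip repetition code: one logical bit is copied into three physical bits, random flips are applied, and a majority vote restores the logical value whenever at most one bit flipped. This toy omits everything genuinely quantum, such as superpositions and syndrome measurements that avoid collapsing the data, and the error rate used is invented, but it shows why adding redundancy can push the logical error rate below the physical one.

```python
import random

def encode(logical_bit):
    """Store one logical bit redundantly in three physical bits."""
    return [logical_bit] * 3

def apply_noise(bits, flip_probability):
    """Flip each physical bit independently with the given probability."""
    return [bit ^ 1 if random.random() < flip_probability else bit for bit in bits]

def correct_and_decode(bits):
    """Majority vote: recovers the logical bit if at most one bit flipped."""
    return 1 if sum(bits) >= 2 else 0

def logical_error_rate(flip_probability, trials=100_000):
    errors = 0
    for _ in range(trials):
        logical = random.randint(0, 1)
        noisy = apply_noise(encode(logical), flip_probability)
        errors += correct_and_decode(noisy) != logical
    return errors / trials

p = 0.05  # illustrative physical error rate
print(f"physical error rate:                 {p}")
print(f"logical error rate after correction: {logical_error_rate(p):.4f}")
```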
A quantum computational solution for engineering materials: researchers at Argonne explore the possibility of solving the electronic structures of complex molecules using a quantum computer. If you know the atoms that compose a particular molecule or solid material, the interactions between those atoms can be determined computationally by solving quantum mechanical equations — at least if the molecule is small and simple. However, solving these equations, which is critical for fields from materials engineering to drug design, requires a prohibitively long computational time for complex molecules and materials.
Models are scientific models, theories, hypotheses, formulas, equations, naïve models based on personal experiences, superstitions (!), and traditional computer programs. In a Reductionist paradigm, these Models are created by humans, ostensibly by scientists, and are then used, ostensibly by engineers, to solve real-world problems. Model creation and Model use both require that these humans Understand the problem domain, the problem at hand, the previously known shared Models available, and how to design and use Models. A Ph.D. degree could be seen as a formal license to create new Models[2]. Mathematics can be seen as a discipline for Model manipulation.
But now — by avoiding the use of human-made Models and switching to Holistic Methods — data scientists, programmers, and others do not themselves have to Understand the problems they are given. They are no longer asked to provide a computer program or to otherwise solve a problem in a traditional Reductionist or scientific way. Holistic Systems like DNNs can provide solutions to many problems by first learning about the domain from data and solved examples and then, in production, matching new situations to this gathered experience. These matches are guesses, but with sufficient learning the results can be highly reliable.
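A minimal caricature of this “gather experience, then match” workflow is a nearest-neighbor guesser: it never encodes an explicit Model of the domain, it simply memorizes solved examples and answers a new situation with the label of the closest remembered one. Real Holistic Systems such as DNNs are vastly more capable, and the features and labels below are invented purely for illustration.

```python
import math

# "Experience": previously solved examples as (features, answer) pairs.
solved_examples = [
    ((0.9, 0.1), "stop"),
    ((0.8, 0.2), "stop"),
    ((0.1, 0.9), "go"),
    ((0.2, 0.7), "go"),
]

def guess(new_situation):
    """Match the new situation to the closest remembered example and
    return that example's answer; no explicit domain Model is built."""
    closest = min(solved_examples,
                  key=lambda example: math.dist(example[0], new_situation))
    return closest[1]

print(guess((0.85, 0.15)))  # expected: "stop"
print(guess((0.15, 0.80)))  # expected: "go"
```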
We will initially use computer-based Holistic Methods to solve individual and specific problems, such as self-driving cars. Over time, increasing numbers of Artificial Understanders will be able to provide immediate answers — guesses — to wider and wider ranges of problems. We can expect to see cellphone apps with such good command of language that it feels like talking to a competent co-worker. Voice will become the preferred way to interact with our personal AIs.