Advisory Board

Michael Anissimov

Chapter six of The Singularity Is Near: When Humans Transcend Biology by Ray Kurzweil begins with the following quote from Michael Anissimov:

One of the biggest flaws in the common conception of the future is that the future is something that happens to us, not something we create.

He has also said:

I cannot emphasize this enough. If an existential disaster occurs, not only will the possibilities of extreme life extension, sophisticated nanotechnology, intelligence enhancement, and space expansion never bear fruit, but everyone will be dead, never to come back. This would be awful. Because we have so much to lose, existential risk is worth worrying about even if our estimated probability of occurrence is extremely low.
 
It is not the funding of life extension research projects that immortalists should be focusing on. It should be projects that decrease the risk of existential risk. By default, once the probability of existential risk is minimized, life extension technologies will be developed and applied. There are powerful economic and social imperatives in that direction, but few towards risk management. Existential risk creates a ‘loafer problem’ — we always expect someone else to do it. I assert that this is a dangerous strategy and should be discarded in favor of making prevention of such risks a central focus.

Michael Anissimov writes and speaks on futurist issues, especially the relationships between accelerating change, nanotechnology, existential risk, transhumanism, and the Singularity. His popular blog Accelerating Future discusses these issues regularly.
 
Michael was a founding director of the nonprofit Immortality Institute, the first organization focused on the abolition of nonconsensual death. He is a member of the World Transhumanist Association, an associate of the Institute for Accelerating Change, and a member of the Center for Responsible Nanotechnology’s Global Task Force. Michael was a cofounder and director of the Singularity Summit and was media director of the Machine Intelligence Research Institute.
 
His central fields of study outside of futurism are normative rationality, models for judgment under uncertainty, and heuristics and biases research. His Concise Introduction to Heuristics and Biases is the second Google result for that term.
 
He has authored “More Dangers From Molecular Nanotechnology”, “What Is Transhumanism?”, “10 Simple Ways You Can Help the Technological Singularity”, “Deconstructing Asimov’s Laws”, and “Fountains of Youth: Hacking the Maximum Lifespan”.
 
As a science and technology writer, Michael contributes to the Q&A website WiseGeek. He has also authored dozens of papers on transhumanism, futurism, and the Singularity. Over three hundred pages of his writing are online, much of which can be found at his personal website. Michael has been presenting on futurist issues since 2001 and has given talks at technology and philosophy conferences in San Francisco, Las Vegas, and Los Angeles, and at Yale University.

A leading voice on the technological Singularity, Michael was quoted multiple times in Ray Kurzweil’s 2005 book The Singularity Is Near: When Humans Transcend Biology. He lives in San Francisco, CA.
 
Read the transcript of his Future Blogger interview. Listen to Michael at Singularity University, on FastForward Radio, and on The Future And You.