Blog

Jan 10, 2008

Poll: Top 10 Existential Risks

Posted in category: existential risks

How would you allocate a hypothetical $100 million budget for a Lifeboat Foundation study of the top 10 existential risks… risks that are both global and terminal?

$?? Biological viruses…
$?? Environmental global warming…
$?? Extraterrestrial invasion…
$?? Governments abusive power…
$?? Nanotechnology gray goo…
$?? Nuclear holocaust…
$?? Simulation Shut Down if we live in one…
$?? Space Threats asteroids…
$?? Superintelligent AI un-friendly…
$?? Other
$100 million total

To vote, please reply below.

Results after 80 votes (updated Jan 13, 2008, 11 AM EST):

$23.9 Biological viruses…
$17.9 Space Threats asteroids…
$13.9 Governments abusive power…
$10.2 Nuclear holocaust…
$8.8 Nanotechnology gray goo…
$8.6 Other
$8.5 Superintelligent AI un-friendly…
$7.2 Environmental global warming…
$0.7 Extraterrestrial invasion…
$0.4 Simulation Shut Down if we live in one…
$100 million total
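
The post does not spell out how these figures are produced; the sketch below is an assumption rather than the actual tallying code. It treats each of the 80 ballots as a per-category dollar allocation summing to $100 million and reports the per-category mean; the two example ballots are copied from the JH and David Lawler comments further down.

```python
# Hypothetical sketch of the poll tally (assumption: the published figures are
# simple per-category means over the submitted ballots).

CATEGORIES = [
    "Biological viruses", "Environmental global warming", "Extraterrestrial invasion",
    "Governments abusive power", "Nanotechnology gray goo", "Nuclear holocaust",
    "Simulation Shut Down", "Space Threats asteroids", "Superintelligent AI", "Other",
]

def average_allocations(ballots):
    """ballots: list of dicts mapping category -> $ millions, each summing to 100."""
    totals = {c: 0.0 for c in CATEGORIES}
    for ballot in ballots:
        for category, dollars in ballot.items():
            totals[category] += dollars
    return {c: round(totals[c] / len(ballots), 1) for c in CATEGORIES}

# Two example ballots (taken from the JH and David Lawler comments below):
example_ballots = [
    {"Biological viruses": 40, "Nanotechnology gray goo": 10, "Nuclear holocaust": 5,
     "Space Threats asteroids": 5, "Other": 40},
    {"Biological viruses": 30, "Environmental global warming": 5, "Extraterrestrial invasion": 2,
     "Governments abusive power": 3, "Nanotechnology gray goo": 8, "Nuclear holocaust": 20,
     "Space Threats asteroids": 13, "Superintelligent AI": 8, "Other": 11},
]
print(average_allocations(example_ballots))  # e.g. {'Biological viruses': 35.0, ...}
```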

121 Comments — comments are now closed.

  • D on January 12, 2008 8:56 pm

    $100mil?

    You’re joking, right? That’s not really enough to do… anything. Except MAYBE print the materials and advertising to tell people not to worry.

    The CDC’s budget alone is $9 billion per year… They spend $1.5 billion per year on their terrorism research arm, and that doesn’t cover everything…
    CDC Budget Detail 2006-07

    $100mil just doesn’t get you very far…

  • JH on January 12, 2008 9:45 pm

    $40 Biological viruses…
    $00 Environmental global warming…
    $00 Extraterrestrial invasion…
    $00 Governments abusive power…
    $10 Nanotechnology gray goo…
    $05 Nuclear holocaust…
    $00 Simulation Shut Down if we live in one…
    $05 Space Threats asteroids…
    $00 Superintelligent AI un-friendly…
    $40 Other

  • David Lawler on January 13, 2008 2:48 am

    $30 Biological viruses…
    $5 Environmental global warming…
    $2 Extraterrestrial invasion…
    $3 Governments abusive power…
    $8 Nanotechnology gray goo…
    $20 Nuclear holocaust…
    $0 Simulation Shut Down if we live in one…
    $13 Space Threats asteroids…
    $8 Superintelligent AI un-friendly…
    $11 Other

  • […] I didn’t sleep at all last night, worrying about Existential Risks. […]

  • Bruce Klein on January 13, 2008 8:50 am

    100+ replies and 80 votes… thanks guys!

    my vote:

    70 — Superintelligent AI un-friendly…
    10 — Biological viruses…
    10 — Nanotechnology gray goo…
    4 — Nuclear holocaust…
    2 — Space Threats asteroids…
    1 — Environmental global warming…
    1 — Extraterrestrial invasion…
    1 — Governments abusive power…
    1 — Simulation Shut Down if we live in one…

  • Christopher Moore on January 13, 2008 4:15 pm

    Since the allocated $100 million will be used to research the problems instead of fixing them, my list does not reflect what I feel to be the greatest dangers facing mankind — only the dangers for which a proper current threat assessment is lacking.

    My list also takes into consideration the possibility of an attainable remedy given any amount of funding, no matter how large. (No amount of funding can prevent an alien attack. Also, if this research is partially government funded, then no amount of money will expose or expel its corrupted members — that is, if the initial research is even allowed.)

    Many of the available options could possibly fall under the heading of “governments’ abuse of power” (biological, technological, nuclear), and the research results could be corrupted if the threat is intentional.

    60 — Biological viruses
    10 — Cyber-terrorism
    10 — Nuclear holocaust
    5 — Environmental global warming
    5 — Space Threats asteroids
    5 — Nanotechnology gray goo
    3 — Other (Pole Shift or increase in solar activity)…
    2 — Superintelligent AI un-friendly

    We should guard against lower-tech man-made threats first, as those are the best understood & most abused. Then, guard against natural phenomena currently beyond our control and high-tech threats (not from abuse but from misuse).

    Governments’ abuse of power would be at the top of my list; however, that’s not only the most difficult to research & remedy, but it’s also impossible for some to take seriously. It is, by far, my highest personal concern given the current state of the world.

    If I were personally given $100 million, I would allocate the funds to the following two (non-listed) categories:

    50 — Find ways to realistically reduce the chaos & fear leading up to the Singularity. Many average people are confused and finding it hard to deal with the accelerating change.

    50 — Search for a clean, low-cost power source to replace oil.

    To cope with the coming whirlwind of changes, the economy, technology, society, and our individual & collective minds will need a serious overhaul. Prioritize and get prepared locally before the global change. The Singularity will be a brick wall for some and a stairway to heaven for others. It’s our choice, and failing to be prepared is the absolute greatest existential danger facing mankind.

  • Laurie Marquardt on January 13, 2008 5:43 pm

    $12 Biological
    $75 Environmental
    $00 Extraterrestrial
    $00 Governments
    $13 Nanotechnology
    $00 Nuclear
    $00 Simulation Shut Down
    $00 Space Threats
    $00 Superintelligent AI
    $00 Other
    $100 million total

    I think that those are the most immediate threats, and so they should be taken care of ASAP.

  • robomoon on January 16, 2008 1:00 pm

    $- Biological viruses…
    $- Environmental global warming…
    $- Extraterrestrial invasion…
    $- Governments abusive power…
    $- Nanotechnology gray goo…
    $- Nuclear holocaust…
    $- Simulation Shut Down if we live in one…
    $- Space Threats asteroids…
    $- Superintelligent AI un-friendly…
    $100 Other: Better living standards to enable better strategies for family planning. Overpopulation of humans is what made the very fast mass extinction of plants & animals during the 20th century, and these years of the 21st, possible. Humans who destroy the ecology of this planet in vast quantities are an actual problem which must be taken care of now!

  • Crispin on January 21, 2008 8:25 am

    $70 Governments abusive power… — politics is the perennial existential threat
    $8 Biological viruses… — monoculture and uniformity make this highly dangerous
    $8 Environmental global warming… — see above
    $8 Other — horizon scanning is important
    $4 Nuclear holocaust… — has the potential to occur
    $1 Space Threats asteroids… — not a lot we can do about it, but interesting
    $0.25 Nanotechnology gray goo… — highly unlikely
    $0.25 Superintelligent AI un-friendly… — highly unlikely
    $0.25 Extraterrestrial invasion… — we could get very little forward notice of this…
    $0.25 Simulation Shut Down if we live in one… — very poor sci-fi

  • Peter de Blanc on January 29, 2008 7:33 pm

    $33 Superintelligent AI — should be enough to fund SIAI indefinitely, train a team of FAI theorists, and support others such as Nick Hay and Rolf Nelson. Try to redirect people working on UFAI towards something else
    $23 Biological viruses
    $22 Nanotechnology — hire people currently working on MNT to do life extension research instead
    $16 Other — including general existential risks research (like Bostrom’s work), good ethics research, teaching people about rationality
    $2 Nuclear — health food, Yoga classes, tempur-pedic mattresses, and masseuses for the people in charge of launching missiles (reduce stress levels)
    $2 Governments — distract any politicians who are trying to hinder something we’re working on
    $1 Space Threats — mostly astronomy
    $1 Simulation Shut Down — not as silly as people think, and some good research might come out of it
    $0 Extraterrestrial invasion — this is as silly as people think
    $0 Environmental — lots of money is being spent on this already

  • Peter de Blanc on January 30, 2008 11:40 pm

    Let me revise the above. I had $23 left when I was done, so I threw it at the biological viruses. That was silly. Add the $23 to ‘Superintelligent AI’; SIAI could extend its talent search, perhaps by creating new courses at universities.

  • Edward Greisch on February 12, 2008 9:40 pm

    Hydrogen Sulfide gas will kill all people. Homo Sap will go EXTINCT unless drastic action is taken.

    October 2006 Scientific American

    “EARTH SCIENCE
    Impact from the Deep
    Strangling heat and gases emanating from the earth and sea, not asteroids, most likely caused several ancient mass extinctions. Could the same killer-greenhouse conditions build once again?
    By Peter D. Ward
    downloaded from: http://www.sciam.com/article.cfm?articleID=00037A5D-A938-150E-A93883414B7F0000&sc=I100322
    [Most of the article omitted.]
    But with atmospheric carbon climbing at an annual rate of 2 ppm and expected to accelerate to 3 ppm, levels could approach 900 ppm by the end of the next century, and conditions that bring about the beginnings of ocean anoxia may be in place. How soon after that could there be a new greenhouse extinction? That is something our society should never find out.”
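
    A quick back-of-the-envelope check of the quoted projection; the ~380 ppm baseline for 2006 and the reading of “end of the next century” as the year 2200 are assumptions, not figures from the article:

    ```python
    # Rough linear check of the quoted CO2 projection (assumed baseline, see above).
    baseline_ppm = 380             # assumed atmospheric CO2 around 2006
    years = 2200 - 2006            # "end of the next century" read as the year 2200
    low = baseline_ppm + 2 * years     # at 2 ppm per year -> 768 ppm
    high = baseline_ppm + 3 * years    # at 3 ppm per year -> 962 ppm
    print(low, high)               # brackets the article's "approach 900 ppm"
    ```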

    Press Release
    Pennsylvania State University
    FOR IMMEDIATE RELEASE
    Monday, Nov. 3, 2003
    downloaded from: http://www.geosociety.org/meetings/2003/prPennStateKump.htm
    “In the end-Permian, as the levels of atmospheric oxygen fell and the levels of hydrogen sulfide and carbon dioxide rose, the upper levels of the oceans could have become rich in hydrogen sulfide catastrophically. This would kill most of the oceanic plants and animals. The hydrogen sulfide dispersing in the atmosphere would kill most terrestrial life.”

    http://www.astrobio.net is a NASA web zine. See:

    http://www.astrobio.net/news/modules.php?op=modload&name=News&file=article&sid=672

    http://www.astrobio.net/news/modules.php?op=modload&name=News&file=article&sid=1535

    http://www.astrobio.net/news/article2509.html

    http://astrobio.net/news/modules.php?op=modload&name=News&file=article&sid=2429&mode=thread&order=0&thold=0

    These articles agree with the first 2. They all say 6 degrees C or 1000 parts per million CO2 is the extinction point.

    The global warming is already 1 degree Fahrenheit. 11 degrees Fahrenheit is about 6 degrees Celsius. The book “Six Degrees” by Mark Lynas agrees. If the global warming reaches 6 degrees centigrade, we humans go extinct. See:
    http://www.marklynas.org/2007/4/23/six-steps-to-hell-summary-of-six-degrees-as-published-in-the-guardian
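
    As a quick check of the conversion above (for temperature differences, the Fahrenheit-to-Celsius factor is 5/9):

    ```python
    # 11 degrees Fahrenheit of warming expressed in Celsius.
    delta_f = 11
    delta_c = delta_f * 5 / 9
    print(round(delta_c, 1))  # 6.1
    ```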

    “Under a Green Sky” by Peter D. Ward, Ph.D., 2007. Paleontologist discusses mass extinctions of the past and the one we are doing to ourselves.

    ALL COAL FIRED POWER PLANTS MUST BE CONVERTED TO NUCLEAR IMMEDIATELY TO AVOID THE EXTINCTION OF US HUMANS. 32 countries have nuclear power plants. Only 9 have the bomb. The top 3 producers of CO2 all have nuclear power plants, coal fired power plants, and nuclear bombs. They are the USA, China, and India. Reducing CO2 production by 90% by 2050 requires drastic action in the USA, China, and India. King Coal has to be demoted to a commoner. Coal must be left in the earth. If you own any coal stock, NOW is the time to dump it, regardless of loss, because it will soon be worthless.

    $100 million to teach people that nuclear power is safe and the only thing that works.

  • Robert Hunt Robinson on February 13, 2008 1:06 pm

    I could put these threats in any haphazard order and the list’s logic would be valid, but it still wouldn’t put humanity back squarely on the tracks. The number one existential threat is ignorance. Let’s put $100M into educating the population of the planet, and I’m sure we’ll then be able to face and solve any problem that confronts us.

  • Frank Sudia on February 13, 2008 2:58 pm

    $20 Biological viruses, ease of creation
    Not an existential risk, but not enough work is being done, and 30% of us could die in a few months.
    $20 Nuclear holocaust
    Nuclear winter IS an existential risk, and could actually happen tomorrow.
    $20 Space Threats, asteroids, gamma burst, etc.
    $20 Nanotechnology gray goo…
    $10 Other — earth-swallowing micro black holes from new particle accelerators. Maybe this is why SETI finds nothing.
    $10 Extraterrestrial invasion…
    Might be worth taking a sober look at our survival options.

    $0 Superintelligent AI un-friendly…
    This is a legal and political issue, not a technology problem. AIShield risks being total Luddite obstructionism! (Work on curbing Microsoft.)
    $0 Environmental global warming…
    Not an existential risk; we’ll survive.
    $0 Governments abusive power
    Not an existential risk; we’ll survive.
    $0 Simulation Shut Down if we live in one…
    You won’t care if the simulation ends, will you?

    Post the runner-up ideas in a second ranked list near the top, so future commenters can review and be inspired by them, not just your chosen few categories.

  • […] a hypothetical $100 million to counter 10 fatal risks to the planet? This question was asked through a poll in order to capture the perception of danger that people have in the face of […]

  • Press to Digitate on August 18, 2008 9:47 pm

    $20M Superintelligent AI un-friendly…
    $20M Nanotechnology gray goo…
    These threats deserve special consideration out of respect for their sheer inevitability. We *KNOW* that they will occur, that they will occur within the next 20 years, and that it is only a question of where and when. There are too many nucleation sites from which Strong AI or Carbon-fixing Nanoreplicators might emerge, and too many ‘good’ motivations prompting them to be developed for well-intentioned purposes. Every year that passes, the tools and enabling technologies for each become cheaper, more powerful, and more readily available. These are not occasional rarities like a Big Space Rock or Coronal Mass Ejection; these will happen ONCE, and then, It’s All Over — unless we can somehow prepare ahead of time.

    $10M Biological viruses…
    The technology for genetic manipulation and synthetic life also becomes exponentially cheaper, more powerful, and more readily available year by year, and it is likely inevitable that someone will create a virulent new synthetic pathogenic organism that defies common biological defenses. In this case, though, the tools to counter it also grow more potent with the passage of time. We live in an age when the DNA of any new organism can be sequenced in a matter of hours, a cure modelled by computer within days, and industrial quantities synthesized in biorefineries within a month. Nevertheless, its inevitability places it high on the list of priorities.

    $10M Space Threats asteroids…
    The Big Space Rock has to be on the list because it’s happened before, and will happen again. We have many ways of dealing with the problem should it occur, if we are prepared and a large enough detection effort finds it in time.

    $10M Environmental global warming…
    While the technologies to solve the Environmental Energy issue already exist, if they are not disseminated and implemented in time, on a broad enough scale, soon enough, the Clathrate Gun (Methane released from the melting permafrost) will render the planet uninhabitable within the century. 4,000 Billion Tonnes of Methane (>20x as potent a GHG as CO2) will dwarf anything man can produce as a driver of Climate Change; we must mine the atmosphere for the Carbon we need. Atmospheric Carbon Capture coupled with Algae-Based Biofuels can fix this problem.

    $10M Extraterrestrial invasion…
    The million-plus annual UFO sightings reported globally, hundreds of thousands of abductees, and hundreds of physical trace cases are absolutely convincing that “They” are real, and they are Here. Even the scientific mainstream has been forced to recognize that planets with water and conditions suitable for life are not only ‘not rare’, but are probably quite common in the universe. However, the apparent evidence indicates that this threat has already happened, but with no — or very subtle — negative effects from it. Even so, it is worthy of serious research and analysis, sooner rather than later.

    $05M Governments abusive power…
    The 2008 FISA revision failed to insert the Constitution between our government and our personal electronics, which are about to become far more ‘personal’ with the advent of Brain/Computer Interface technologies. Neurochips are a reality today, with more than 200,000 already implanted for Cochlear hearing, artificial retinas, and brain pacemakers for Parkinsonism, Epilepsy, OCD, Obesity, and Depression. Microwave-based Voice-to-Skull technology (“Government Mind-Control Rays”) is now also an acknowledged reality. Orwell was right, just 30 years or so off in his timing.

    $05M Nuclear holocaust…
    This ranks low because of its extremely low probability at the Existential Risk level. Israel/Iran or India/Pakistan nuclear exchanges would not pose such a risk, though they would be damned inconvenient for the participants and neighboring bystanders in the region. However, better dissemination of Anti-Ballistic Missile technology and methods for screening ships and containers at sea for onboard WMD could substantially eliminate this threat category.

    $00M Simulation Shut Down if we live in one…
    This should not be on the list, not because it’s impossible, but because, even if true, there is nothing that money could be spent on which would make any difference. What would you do? Build an electronic “prayer machine”, in hopes of contacting the Simulators directly, through the vacuum aether? If we live in a simulation, that aether itself is suspect, and they are reading these texts as we type them anyway.

    $10M Other
    (1) A Temporal Modem is presently under construction at the University of Connecticut by Dr. Ronald Mallett. Unless the prevailing models of Relativity and Quantum Mechanics are fundamentally WRONG, this device cannot fail to work. It will become cheap to replicate; the knowledge to do so is already in the public domain. It has the potential to destroy causality and trigger chronoplexy on a planetary scale. This cannot be stopped — even if Mallett were inhibited by whatever means, the design is already widely distributed, and some grad student, sooner or later, will inevitably build one that works. So, the best we can do, as with Strong AI and Grey Goo and Synthetic Plagues, is to prepare to somehow detect and deal with its consequences.

    (2) Just because the Relativistic Heavy Ion Collider didn’t create black holes, strangelets, or vacuum instability does not mean the Large Hadron Collider will not. Or the next biggest machine after that, or the next one, etc. Sooner or later, one of these Big Science Machines is going to produce something unexpected, at energies where unexpected most likely equals dangerous. Thus far, no research has been devoted to mitigation strategies, techniques, and technologies, which is odd given the very high probability of something eventually going wrong.

    (3) When (not ‘If’) room temperature superconductors are commercially introduced, one unexpected result will be the rapid development of a Thanatronic Interface; a device which perfects Electronic Voice Phenomena, enabling reliable, high-fidelity Instrumental Transcommunication with the “Dead”. In every other instance in scientific equipment where a Germanium Diode (used as a detector of subtle EM fields) has been replaced with a Superconducting Diode, a sensitivity increase and improvement of signal-to-noise ratio of Three Orders of Magnitude has been observed. There is no reason to think that EVP detection will be any different. While this may not pose an existential threat, the inevitable advent of reliable electronic communication with the dead will certainly change human society as profoundly as anything else one can imagine. It is worth serious study of the implications and consequences of its development.

    Of course, if a wavefront from the Gamma Ray Burst of Wolf-Rayet 104 strikes Earth in December, 2012, as is indicated, we may have bigger and more immediate problems to contend with than much of the above.

  • Bon Davis on December 31, 2008 3:12 am

    I will go out on a different limb here. I would allocate $50 million to unfriendly AI. As for most of the space-borne threats, there isn’t really much we could honestly do, especially near-term. I agree with earlier comments: space invasion, government abuse, and simulation shutdown don’t deserve to be mentioned (how do you fund against a government abuse of power?). AI is a more imminent threat and different from the others: it is the one technology that could actively turn against us, or pose a threat by indifference (not actually hostile; see Yudkowsky), whereas the others require a human to push the button or open the vial. Still — $20 million to nanotechnology, $10 million to nuclear, $10 million to biological (in both of these cases life will persist without help, even if in limited form), and $10 million to environmental threats.

  • A.H. Jessup on July 6, 2009 11:09 am

    There are two major vectors, energy and water, which define the complexities of most of these symptoms.
    The social inversions that lead to genocide and government abuse tend to go away where a sufficiency of these two elements is available, combined with enough education to use them positively.

    Global warming is an energy dysfunction; most of the political complexities are water-driven and energy-driven. Virus control is a lot easier, as well, when these are resolved.

    The third hard ingredient is the ability to effect sane social systems which bring about the blooming of individual potential, and which is chronically blunted when the fundamentals of the first two elements are inadequate.

  • DPirate on January 18, 2010 1:09 am

    Our only existential threat is humanity. Soon enough, what little there is to eat will be full of poisons. Maybe some elite cadre will be able to consolidate enough power to save us from ourselves, but who wants to live like that? Better the whole thing comes down and we start over in a few thousand years. False start.

  • DPirate on January 18, 2010 1:12 am

    “The third hard ingredient is the ability to effect sane social systems which bring about the blooming of individual potential, and which is chronically blunted when the fundamentals of the first two elements are inadequate.”

    Sounds like Lenin, lol.

  • Willard Wells on September 9, 2010 9:45 pm

    We can be sure that natural hazards are insignificant because humankind has survived 200,000 years of exposure. By contrast, we’ve had only about 60 years of adaptation to man-made existential threats, and new ones appear continually.

    Nuclear holocaust does not threaten extinction because it would happen in the Northern Hemisphere. New Zealand’s South Island and Tierra del Fuego are insulated from its effects by at least one complete Hadley cell of atmospheric circulation.

    The biggest risks are uncategorized surprises. For example, a mad billionaire (trillionaire?) ‘hears’ orders from God to exterminate the human race. Or pollutants in the ocean produce mutant phytoplankton that emit poison gas.