Jan 10, 2008

Poll: Top 10 Existential Risks

Posted in category: existential risks

How would you allocate a hypothetical $100 million budget for a Lifeboat Foundation study of the top 10 existential risks… risks that are both global and terminal?

$?? Biological viruses…
$?? Environmental global warming…
$?? Extraterrestrial invasion…
$?? Governments abusive power…
$?? Nanotechnology gray goo…
$?? Nuclear holocaust…
$?? Simulation Shut Down if we live in one…
$?? Space Threats asteroids…
$?? Superintelligent AI un-friendly…
$?? Other
$100 million total

To vote, please reply below.

Results after 80 votes (updated Jan 13, 2008, 11 AM EST):

$23.9 Biological viruses…
$17.9 Space Threats asteroids…
$13.9 Governments abusive power…
$10.2 Nuclear holocaust…
$8.8 Nanotechnology gray goo…
$8.6 Other
$8.5 Superintelligent AI un-friendly…
$7.2 Environmental global warming…
$0.7 Extraterrestrial invasion…
$0.4 Simulation Shut Down if we live in one…
$100 million total

121 Comments — comments are now closed.


  1. Jeremy Shepard says:

    $25 Biological (viruses, etc)
    $5 Environmental (global warming, etc)
    $25 Nanotechnology (gray goo, etc)
    $10 Nuclear (holocaust, etc)
    $5 Governments (abusive power, etc)
    $5 Space Threats (asteroids, etc)
    $25 Superintelligent AI (un-friendly, etc)

  2. I wouldn’t spend money studying any of those risks. It isn’t worthwhile — the results that come out would be more or less of no use in preventing anything bad: most of the real problems are doubtless not yet understood, there is so little you could say on most of them, and no one you want to listen is going to listen, whereas the people you don’t want to listen are going to take anything that gets said and distort the hell out of it.

    If I had $100M to spend, I’d put it towards something of real use to humanity like anti-aging research.

  3. Jon Bischke says:

    I’m not sure how informed this is but here goes…

    $30 Biological viruses…
    $25 Environmental global warming…
    $0 Extraterrestrial invasion…
    $1 Nanotechnology gray goo…
    $40 Nuclear holocaust…
    $1 Governments abusive power…
    $1 Simulation Shut Down assumes we live in one…
    $1 Space Threats asteroids…
    $1 Superintelligent AI un-friendly…

  4. Bill Hunt says:

    My 10 year old son says:
    Global Warming $50m
    Nuclear Holocaust $30m
    Superintelligent AI $20m

    If I had $100m to invest towards a better humanity, I think I’d put most of it towards positive technical benefits, and a small amount towards risk reduction. I.e. maybe $90m towards health benefits that could be achieved via nanotechnology and $10m towards gray goo scenario analysis. Heck, make that a $95m to $5m ratio.

  5. Bob Mottram says:

    $20 Space threats
    A threat from space, in the form of asteroid or comet strikes or more exotic fates such as intense X-ray blasts from exploding stars, is probably the ultimate existential risk, in that such events have the potential to kill most or all existing life on earth. We know that these things will naturally occur from time to time, and that there have been several large mass extinction episodes in earth’s history.

    Unfortunately our ability to have any effect upon these events is very limited. Talk of nudging asteroids using ICBMs remains, I think, largely in the realm of fantasy, and since such objects are not luminous we may not detect the next major collision before it actually occurs. About the best we can do is track objects on nearby orbits and seek to build sustainable environments outside of the home planet.

    $50 Environmental global warming
    Runaway climate change also has the potential to kill all life on earth. There does seem to be evidence that some kind of catastrophic climate change may have occurred on Mars in the past. We do have the ability to do something about this and so this is perhaps where most of the money should be spent.

    $10 Governments abusive power
    Governments aren’t an existential risk in themselves, but they can influence factors which could bring about other types of existential risk. Ensuring good governance throughout the world should be a priority, but this can only be achieved by empowering individuals to take control over their own lives, breaking up monopolies and other anti-competitive forces limiting progress, and ensuring that nobody is marginalized or excluded from economic participation. Bad governance leads to resource shortages and wars which increase the chances of existential risk.

    $10 Nuclear holocaust
    Although nuclear armageddon isn’t something you hear much about these days, most of the weapons accumulated during the cold war still exist and could potentially be used. Even if you very conservatively assume that the average nuclear weapon is no more powerful than the ones used against Japan at the end of WW2, there are more than enough nuclear weapons to kill most of the world’s population (assuming that they’re mostly concentrated in cities). Bad governments still have the potential to start a localized nuclear conflict which could quickly escalate out of control through the positive feedback effect of mutually assured destruction.

    $5 Biological viruses
    Again, biological viruses are not really an existential threat, although they do have the potential to cause mass casualties and major economic disruption. Mutation of naturally occurring viruses such as H5N1 is a problem, but the biggest threat will come from man-made viruses or the “counter adaptation” effect upon natural viruses when subjected to pressures from new genetically modified organisms introduced into the environment. Good government and responsible use of GMOs backed by reasonable scientific testing can help to mitigate this risk.

    $5 Nanotechnology gray goo
    Nanotech and biological threats can be considered to be the same thing, since living organisms are really just large collections of nanomachinery. Irresponsible use of nano/bio technologies could cause critical damage to key biological mechanisms which ultimately sustain the biosphere. Again, good government backed up with regulation, inspection and good testing of new products can help to reduce these risks.

    All of the following I would consider to be of negligible risk.

    $0 Extraterrestrial invasion
    Since we still don’t know precisely how life started on earth it’s entirely possible that we might be the aliens, brought to earth as bacteria transported inside comets or meteorites.

    $0 Simulation Shut Down assumes we live in one…
    If we are living inside a simulation there’s really nothing that we can do to prevent it from being shut down. Space colonization would be no defense against this.

    $0 Superintelligent AI un-friendly
    Superintelligent computer viruses or crazy AIs could cause significant disruption and economic damage, but wouldn’t constitute an existential threat unless they have access to manufacturing or critical safety systems (for example being able to launch nuclear missiles). Provided that manufacturing and weapons systems are designed with appropriate attention to safety the chances of them being accessed or misused by rogue AIs should be minimal.

  6. James Bryant says:

    Of the named risks presented, I consider extraterrestrial invasion and simulation shut down to be completely unworthy of funding at any level. I see the other 7 as having the same basic root cause … the (growing) gap between technological and biological evolutionary rates. So long as we are primarily meat creatures, our ability to adapt our society to increasingly complex and potentially dangerous technologies is limited. The growth of our individual consciousnesses, and the societies we spawn from them, is limited by hardware.

    I’d spend all the money studying the ‘meta-risk’ inherent in the simple fact that we’re human, and look for ways to mitigate that. I’d call this “Other: Technological vs. Biological evolutionary rates”.

    Cheers!

    Just realized a mistake … threat of asteroid strike is clearly not a new risk created by technology, and should be treated separately. That should have read, “I see the other 6 as having the same basic root cause …”

    I’d allocate the money the same way. :)

  7. David Hart says:

    Until my crude wetware-based-bayes-guesser receives more/better information (which may include incremental knowledge of significant technological progress in certain areas in the coming years):

    $60 Biological

    $20 Space Threats — asteroid impact, gamma burst, etc.

    $10 Gray Goo
    $10 Unfriendly AI

    $0 Governments — indirect, therefore not an existential risk. Certain specific decisions of governments could pose existential risks however.

    $0 Environmental — the effects of climate change pose huge risks to millions of lives and could potentially disrupt the global economy on scales of trillions of dollars, but are unlikely to cause full planet-scale extinction. The political willpower to address climate change is approaching full swing, so I believe Lifeboat Foundation funds are better spent on more acute risks.

    $0 Nuclear — I believe it’s likely that the human race would survive a full-scale nuclear holocaust.

    $0 Extraterrestrial — any mitigation is likely to be observed, and therefore ineffective.

    $0 Simulation Shutdown — how can this possibly be mitigated? prayer? ;-)

  8. Paul Tozour says:

    Where is terrorism on this list? Is terrorism not a threat?

    No, I’m not talking about suicide bombings and IEDs: those are obviously small-scale threats. I’m talking about the fact that the amount of killing power that can be concentrated in the hands of a small number of individuals is growing exponentially, similar to Moore’s Law. 9/11 was kid stuff compared to what dedicated terrorists could do with the technology the 21st century will provide.

    The first and foremost threat to our existence comes from the people who are openly dedicated to killing us. They may number only in the millions and live far from us, but the nanotechnology and biotechnology of the next century will provide them with all the tools they need to slaughter the entire human race.

    God help us if the technologies of the future come into the hands of those willing to use them in the name of terror.

  9. theymos says:

    $5 Biological viruses
    $10 Environmental
    $0 Extraterrestrial
    $20 Governments
    $5 Nanotechnology
    $30 Nuclear holocaust
    $0 Simulation Shut Down
    $15 Space Threats
    $10 Superintelligent AI
    $5 Other

    Biological viruses I don’t think are capable of completely destroying humanity. It would be great to reduce the damage, though.

    Global warming is a big risk, but I think that society will sort this one out by itself. It’s just too big of a problem to ignore for too long.

    I don’t think extraterrestrials are nearly as common as most people think. I doubt there is a single other sentience in this galaxy. So in my opinion the risk is extremely low that we will ever have troubles with this.

    Government issues are a big risk, but I’m not sure I like the way the Lifeboat Foundation proposes we deal with it. I don’t really like the idea of more surveillance and less privacy. It would be great if the government was infinitely benevolent; but that is very far from the truth.

    I don’t think we’re near enough to grey goo to worry about it just yet.

    A nuclear holocaust is the biggest issue. This could happen tomorrow; the technology already exists and the threat is very real. The Lifeboat Foundation should devote a significant amount of its resources to this issue to make sure that if the worst happens, humanity survives. I would like it if the Lifeboat Foundation would work with cities to make sure that nuclear fallout shelters are up-to-date and working properly. Maybe even help pay for new or better facilities.

    I think it is a good possibility we live in a simulation, but there’s nothing we could do about it. It’s not worth any effort.

    Space threats are pretty deadly, but rare. We should be on the lookout for it, but not at the expense of other, closer existential risks.

    AI could be a threat, but it would probably actually be a boon. AI growth should be positively promoted like Singinst is doing.

    And, of course, we should always be thinking of other possible existential risks. A space station would be a great idea, just in case.

  10. $15 Biological viruses…
    $35 Environmental global warming…
    $10 Nanotechnology gray goo…
    $5 Nuclear holocaust…
    $30 Space Threats asteroids…
    $5 Other

    Bio & Enviro have direct, present-day application.

    Nuclear needs a bit more funding for detection & safety.

    Asteroid impact prevention requires serious lead time, so ramping up that capacity quickly is important.

    Nano research would largely focus on developing policies and management strategies for present-day related tech, so as to build a good policy basis as MNT gets closer.

    “Other” means funding R&D into improving our ability to identify transfigurative threats before they become unmanageable.

  11. Given the limited budget, I’d focus it on threats that could materialize in the next 10 years or so, and on areas in which we can actually make a difference:

    $40 Biological viruses…
    $0 Environmental global warming… (other organizations are funding this already).
    $0 Extraterrestrial invasion… (money won’t help here)
    $5 Governments abusive power…
    $10 Nanotechnology gray goo…
    $10 Nuclear holocaust…
    $0 Simulation Shut Down assumes we live in one…
    $10 Space Threats asteroids…
    $20 Superintelligent AI un-friendly…
    $5 Other (investigate risks that we’ve overlooked)

  12. Zaid Abdulla says:

    I believe higher priority should be given to threats that will give us little time to fight back once they happen:

    $25 Space Threats asteroids…
    $25 Environmental global warming…
    $15 Biological viruses…
    $15 Nuclear holocaust…
    $10 Governments abusive power…
    $10 Nanotechnology
    $0 Superintelligent AI un-friendly…
    $0 Extraterrestrial invasion…
    $0 Simulation Shut Down
    $0 Other

  13. V says:

    $20 Biological viruses…
    $15 Environmental global warming…
    $0 Extraterrestrial invasion…
    $0 Governments abusive power…
    $5 Nanotechnology gray goo…
    $5 Nuclear holocaust…
    $0 Simulation Shut Down assumes we live in one…
    $30 Space Threats asteroids…
    $20 Superintelligent AI un-friendly…
    $5 Other (Black Swans!)

  14. Rok says:

    $25 Biological viruses…
    $0 Environmental global warming…
    $0 Extraterrestrial invasion…
    $0 Governments abusive power…
    $30 Nanotechnology gray goo…
    $5 Nuclear holocaust…
    $0 Simulation Shut Down assumes we live in one…
    $5 Space Threats asteroids…
    $30 Superintelligent AI un-friendly…
    $5 Other

  15. Jackie says:

    $25 Biological viruses…
    $30 Environmental global warming…
    $0 Extraterrestrial invasion…
    $5 Governments abusive power…
    $30 Nanotechnology gray goo…
    $5 Nuclear holocaust…
    $0 Simulation Shut Down assumes we live in one…
    $5 Space Threats asteroids…
    $0 Superintelligent AI un-friendly…
    $0 Other

  16. I would invest $80M towards the AI study and $20M towards the nanotech study. Climate change, viral threats, governmental abuse, and space risks are important but already either receive large amounts of attention and funding or are very low probability events. ET invasion and simulation shutdown are non-risks which routinely damage our credibility when mentioned in the same context as the others.

    When discussing risks, it’s important to distinguish between the physical cause of disaster (biotech, nanotech replicators, etc.), meaning the actual risk, and possible tangential factors which could release it: economic or political unrest, governmental abuse, terrorism, bad moods, etc. I prefer to focus on the former over the latter, because the latter factors tend to have been around since the dawn of civilization and distract the discussion away from the true sources of risk — these novel technologies themselves.

    Another important risk you missed is synthetic life, which falls somewhere between viruses and nanotech. Considering that synthetic life could be created in a lab this year, it’s certainly a proximate risk.

    Thanks for doing this, Bruce!

  17. Zenon Kulpa says:

    $0 Biological viruses… — manageable with current means;
    $0 Environmental global warming… — a humbug;
    $0 Extraterrestrial invasion… — very improbable;
    $0 Governments abusive power… — we manage ourselves…
    $10 Nanotechnology gray goo… — let us know more…
    $0 Nuclear holocaust… — very improbable;
    $0 Simulation Shut Down assumes we live in one… — are you joking?
    $70 Space Threats asteroids… — real and can produce as a byproduct the much needed space infrastructure;
    $0 Superintelligent AI un-friendly… — very improbable;
    $20 Other: Megavulcanism — we should know more, and fast…
    $100 million total

  18. I think the biggest immediate threats we face right now are a natural biological pandemic, given this age of transcontinental flights and perpetually deeper interaction with animals historically isolated from humanity, and asteroid impacts, so I would advocate devoting a significant amount of resources right away to these. While a viral epidemic would not likely kill all people, it could certainly instigate a global economic and technological collapse to such an extent that we might enter another dark age, and thus be wiped out by another event, such as an asteroid impact.

    In the long term, artificial or synthetic life, nanotechnological weapons, and artificial intelligence are our biggest concerns.

    Global Warming might be a big issue, but it is not a civilization killer, and attention focused on it will discourage the development of technological infrastructure and the economic growth needed to technologically combat virtually every other threat we face. Fighting Global Warming is the only response that requires curtailing energy use and thus technology, while everything else requires rapid technological growth and, ultimately, spreading among the stars. An asteroid on a collision course with the planet will not care what our ‘carbon footprint’ is. Unfortunately Global Warming is the threat everyone pays attention to, even though it poses a negligible existential threat.

    I would also add large geological events, primarily caldera volcanic eruptions. The closest humanity has ever come to extinction was through an eruption of this kind, when it is thought the entire human population was reduced to a mere 1,000 adults. Underwater eruptions would leave little long-term geological evidence, and could devastate coastal cities worldwide, instigating yet another massive economic and technological collapse.

    Luckily many of the threats we face (Nuclear Holocaust, Asteroid Impacts, Grey Goo, Epidemics, Caldera Eruptions, etc.) require the same path to combat, starting with limited-duration underground bunkers and food/water storage, through remote and isolated self-sustaining bunkers, all the way to self-sustaining space stations. Lifeboat Foundation should focus energies on encouraging the development of more robust and off-grid technology (such as the solar-powered 1kW Stirling engines that Infinia Corporation is developing, which can go 14 years maintenance-free). Every effort should be encouraged to find new ways to produce power, clean drinking water, and plentiful food supplies that are simple, efficient and, most importantly, less expensive than massive centralized systems. Being less expensive will help ensure their promulgation into society and reduce the consequences of large-scale events, while simultaneously refining the technology needed to operate closed ecosystems, such as remote bunkers or space stations. These efforts could include funded research for proposals, competitions, or encouraging publicity for existing companies working on these kinds of innovations. In this vein, for “other” I suggest a ‘cumulative threat mitigation program’ to determine what technologies are most beneficial in combating the many threats we face and encourage the development of those.

    Incidentally, the X Prize Foundation is offering the chance to propose a new X Prize; a suggestion from the Lifeboat Foundation for a prize focused on creating pragmatic achievements in enclosed ecosystems could be very valuable in this regard.
    http://www.xprize.org/x-prizes/propose-an-x-prize

    Lifeboat Foundation might consider starting a board to investigate the potential of mitigating the chance of large global economic recessions, which would significantly hinder development of technology required to combat other threats.

    Another program proposal would be to investigate ways to encourage the spread of representational constitutional governments and take whatever actions are possible to facilitate this. Totalitarian dictatorships pose great threats to all of humanity, not least to the people who live in them, and pose those threats in numerous ways. They divert attention and resources to purposeless wars: non-democracies start most wars, and no two democracies have ever been at war with each other. Their reduced health and technological infrastructure will facilitate the spread of pandemics (China’s secrecy policy led to a death rate after infection 10 times greater from SARS than in any western nation) and exacerbate the damage from natural disasters. Their oppressive policies lead to social and economic stagnation, robbing humanity of millions of brilliant minds. And worst of all, they indoctrinate hatred and murderous intolerance which may breed the terrorist that wipes most of humanity out.

    Finally, one last program would be to combat the general nihilism and depression that plague humanity. We see programs like the “Voluntary Human Extinction Movement” and terrorists like Ted Kaczynski, and I wonder whether too much of the human population might simply lose the desire to live.

    $20 Biological viruses…
    $0 Environmental global warming…
    $0 Extraterrestrial invasion…
    $10 Governments abusive power…
    $10 Nanotechnology gray goo…
    $5 Nuclear holocaust…
    $0 Simulation Shut Down assumes we live in one…
    $15 Space Threats asteroids…
    $10 Superintelligent AI un-friendly…
    $30 Other — “cumulative threat mitigation” program

  19. Matt McGuirl says:

    5 Offensive Biological Weapons / Rogue Natural
    Diseases
    0 Climate
    0 ET Invasion
    20 Sousveillance (was Government’s abusive power…)
    45 Offensive Nanoweapons
    0 Nukes
    0 Simulation Shut Down
    10 Space Threat
    15 UFAI
    5 Other

    Here’s what I was thinking when I allocated my fictitious funding:

    The sorts of threats that are likely to pose global-scale risks will be fast movers. I anticipate that the window between the initial emergence and/or detection of a major threat and the point at which it can no longer be successfully contained will be quite brief. Maybe days or weeks, but almost certainly not months in nearly all cases.

    To counter the high-velocity risks we need to learn of them prior to their emergence. Failing that, we need to learn of them more or less immediately.

    To detect emerging, or newly emerged, threats we will need lots and lots of data. That’s where sousveillance comes in. With large numbers of ordinary citizens voluntarily submitting data feeds from their own perspective and, in time, the perspective of their smart-dust based sensor networks, a greater understanding of the sorts of activities that can lead to truly existential risks can be gained.

    The sort of system that can reliably take in and process such large volumes of data at very high rates won’t be trivial to invent. Furthermore, it will need substantial resources to develop, deploy and manage. The full promise of such a system will probably only be realized when some sort of AI gets folded into the data analysis and event correlation functions.

    I weighted the NanoShield program’s funding so highly and under-weighted funding for “Offensive Biological Weapons / Rogue Natural Diseases” for a few reasons:

    1. The early warning system that will be a major component of the NanoShield will probably get much of its information from surveillance and sousveillance based sources.

    2. The sorts of response capabilities that would be used to counter the offensive use of nanoweapons would probably be able to be used to counter “Offensive Biological Weapons / Rogue Natural Diseases.”

    Space-based threats (i.e. large rocks that will hit Earth) may be detected months or years in advance. If that comes up we’ll adjust funding as needed.

    PLEASE NOTE:

    1. I _HATE_ the fact that to survive the 21st century we’re almost certainly going to have to rethink some of our notions of privacy. I don’t want to know what you and your significant other do in your home. Unless, of course, one of the things you’re doing is manufacturing a WMD (bio, nano, nuclear, etc.) of some sort or planning an attack of some sort. Then we all have a very legitimate right to know what you’re doing.

    2. Please keep in mind that it is the stated intent of the Lifeboat Foundation to work with the proper local authorities when the need arises.

  20. The best use of a substantial portion (perhaps even the majority) of the budget at the present time would be meta-level research. In other words, the money will have greater impact if we undertake a period of analysis to better determine where the remaining money should be spent. Essentially I’m applying some game theory to the problem. What’s most useful right now is not thinking about the scenarios themselves so much as thinking about how to think about the scenarios. Here’s my working theory at this point:

    Each risk is scored as a pair (Scope, Likelihood). Weight is a function “W” of (Scope, Likelihood), W(S,L), starting with W(S,L) = S*L [function to be improved later]. A risk’s allocation is its W value in proportion to the sum of all W values.

    Scope = scope and impact on sentient creatures in existence at the time, and those surviving (and the effects on descendants, other sentients in the [uni|meta]verse, etc. in perpetuity) (0 = none, 10 = catastrophic).

    Likelihood = relative likelihood of such a scenario happening and of our being able to alter it (0 = low, 10 = high) [arbitrary scale, per unit of time].

    $20M Meta-level research into unknown scenarios / risk analysis to better determine future allocation.

    $50M Endowment / allocated for future analysis based on results of meta-level analysis as confidence levels increase.

    Remaining $30M, short term research. Some are $0M only due to rounding. I would allocate at a more granular level.

    A(7,3) = 6.4% [$2M] Intentional annihilation (nuclear war, etc.)
    A(7,8) = 17.0% [$5M] biological pathogens, natural or engineered
    A(9,4) = 10.9% [$3M] extinction by extraterrestrial contact, invasion, pathogens, etc.
    A(7,2) = 4.3% [$1M] asteroid impact
    A(1,2) = 0.6% [$0M] gamma ray burst
    A(1,1) = 0.3% [$0M] rogue black hole
    A(7,6) = 12.8% [$4M] geological disasters, global warming, etc.
    A(8,8) = 19.5% [$6M] unfriendly superintelligence
    A(7,9) = 19.1% [$6M] unanticipated / misuse of nanotech
    A(6,1) = 1.8% [$1M] unforeseen cosmological scenarios — physics of universe changing, runaway inflation, simulation shut down, etc.
    A(8,3) = 7.3% [$2M] unforeseen consequences of particle accelerator use, creating inflatons, etc.
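
    A minimal Python sketch of the allocation rule this comment describes: the (Scope, Likelihood) scores and the $30M short-term pool are taken from the comment itself; the comment rounds to whole millions, while this prints the exact figures.

        # Allocation rule from the comment above: W(S, L) = S * L, and each risk's
        # share of the budget is its W value divided by the sum of all W values.
        risks = {
            "intentional annihilation (nuclear war, etc.)":    (7, 3),
            "biological pathogens, natural or engineered":     (7, 8),
            "extraterrestrial contact, invasion, pathogens":   (9, 4),
            "asteroid impact":                                 (7, 2),
            "gamma ray burst":                                 (1, 2),
            "rogue black hole":                                (1, 1),
            "geological disasters, global warming, etc.":      (7, 6),
            "unfriendly superintelligence":                    (8, 8),
            "unanticipated / misuse of nanotech":               (7, 9),
            "unforeseen cosmological scenarios":               (6, 1),
            "particle accelerator use, creating inflatons":    (8, 3),
        }

        budget_millions = 30  # the short-term research pool from the comment
        total_w = sum(s * l for s, l in risks.values())

        for name, (s, l) in risks.items():
            share = s * l / total_w
            print(f"{share:5.1%}  ${share * budget_millions:4.1f}M  {name}")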

  21. Martin Sommerfeld says:

    $30 Biological viruses…
    $0 Environmental global warming…
    $0 Extraterrestrial invasion…
    $0 Governments abusive power…
    $20 Nanotechnology gray goo…
    $25 Nuclear holocaust…
    $0 Simulation Shut Down if we live in one…
    $10 Space Threats asteroids…
    $10 Superintelligent AI un-friendly…
    $5 Other
    $100 million total

  22. Matt Metcalf says:

    $20 Biological viruses
    $50 Environmental global warming
    $0 Extraterrestrial invasion
    $0 Governments abusive power
    $5 Nanotechnology gray goo
    $5 Nuclear holocaust
    $0 Simulation Shut Down if we live in one
    $5 Space Threats asteroids
    $15 Superintelligent AI un-friendly
    $0 Other

    I’d also like to add that Chris Haley was completely correct in his assertion that game theory needs to be applied. There are a great many threats on the list that would be devastating, but some more so than others. Also, some of the threats have a lower probability than others. I’d like to add one more variable to Chris’ formula, though: solvability. How likely is it that we could do anything about the problem, even if we had unlimited resources? For some of the threats, like biological threats and global warming, there are things we can easily put some money into addressing immediately. For Simulation Shutdown or Extraterrestrial invasion, there wouldn’t be much we could do, so even if they would be devastating, why put any resources toward them?

    Once we can agree on the Scope, Likelihood, and Solvability of each threat, a real analysis of where to allocate resources can begin.
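
    A hedged sketch of that extension: the multiplicative form below, W = Scope * Likelihood * Solvability, is an assumption (the comment only asks that solvability be added as a third variable), and the example scores are hypothetical.

        # Extend the weight with a Solvability factor (assumed multiplicative form).
        # A solvability of 0 zeroes out threats we could do nothing about.
        def weight(scope: float, likelihood: float, solvability: float) -> float:
            return scope * likelihood * solvability

        print(weight(7, 8, 9))   # e.g. biological threats: tractable, so high weight
        print(weight(10, 1, 0))  # e.g. simulation shutdown: unsolvable, so weight 0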

  23. Sorry about the typos in my previous post — I’m normally careful about proofreading. Matt makes a good point about solvability. There are scenarios which we may think at the present time can’t be altered to have positive outcomes. However, additional analysis may give us unexpected insight or even change the desirability of a particular scenario. For example, in terms of a simulation shutdown, it might be possible to construct a logical argument in favor of a shutdown: perhaps the simulation participants are rewarded for their ability to understand that they are in a simulation by transcending to the next level, etc. [I’m just brainstorming here, but you get the idea.] On the other hand, I can’t think of an argument favoring nuclear war.

    When thinking about this problem, I recalled Eliezer Yudkowsky’s article from just yesterday about how 0 and 1 are not probabilities. This explains my rationale for allocating some small amount to unexpected scenarios.

  24. Robert Bast says:

    $15 Biological viruses…
    $2 Environmental global warming…
    $4 Extraterrestrial invasion…
    $5 Governments abusive power…
    $2 Nanotechnology gray goo…
    $15 Nuclear holocaust…
    $10 Simulation Shut Down if we live in one…
    $15 Space Threats asteroids…
    $5 Superintelligent AI un-friendly…
    $25 Other — Global Cataclysm in 2012

    The 2012 event could be caused by the Sun, various space nasties like cosmic rays or a supernova, a pole shift…

  25. Jim Davidson says:

    $ 2.0 M Biological viruses…
    $ 1.0 M Environmental global warming…
    $ 30.0 M Space Threats asteroids…
    $ 0.0 Superintelligent AI un-friendly…
    $ 5.0 M Nanotechnology gray goo…
    $ 1.0 M Nuclear holocaust…
    $ 0.0 Other
    $ 60.0 M Governments abusive power…
    $ 0.0 Simulation Shut Down if we live in one…
    $ 1.0 M Extraterrestrial invasion…
    ======
    $100.0 M total

  26. Gil Shefer says:

    I believe that currently the two main threats are biological and nuclear, and the best way to tackle them is more responsible leadership, so I put -
    $25 Biological
    $25 Nuclear holocaust
    $25 Governments

    The other $25 will go to the more long-term threats -
    $5 Environmental global
    $10 Nanotechnology
    $10 Superintelligent

  27. nawitus says:

    $70 Superintelligent AI un-friendly…
    $30 Nanotechnology gray goo…

  28. Khannea Suntzu says:

    The above risks are all risks to the majority of humanity. If we include risks of that category, I’d add another one, which seems to be completely ignored.

    I’d add the risk of an elite of ultra-rich humans, using robotics, nanotechnology, limited AI, nonlethal and lethal weapons, mind control and their already obscene affluence to consolidate their power over the world and simply exclude the rest of humanity. And personally I would regard that possibility as a bigger risk to the majority of humanity than any of the above.

    At some point in the next decades an ever increasing share of economic activity can be done by machines. The humans left unemployed by such shifts could be left useless, excluded (especially in right-wing countries that have yet to accept welfare as a part of civilized society) and ultimately disallowed to reproduce. The rich will use the means at their disposal (and they clearly have a track record of doing so already) to curtail space, resources, food, energy or opportunity for those they regard as useless, by whatever definition they themselves dictate.

    Power begets power. People with power want more power, at the exclusion of all others. The biggest risk to people living today is a terminal state of exclusion of the vast majority of humanity. The rich may act contrary to democratic principles, and may do so with “us” being unable to do anything about it. They may decide the world is overpopulated (which it clearly is) and decide the unproductive and disowned are the problem.

    I urge the Lifeboat Foundation to start regarding the grossly inequitable disparities between obscene wealth in the hands of some and dehumanizing poverty for most others as one of the greatest existential threats to humanity — and to assertively act to oppose this ongoing degenerative process in today’s world and economies.

  29. Brien OToole says:

    $100 million — nuclear

  30. Jo Anderson says:

    $30 Biological viruses…
    $0 Environmental global warming…
    $0 Extraterrestrial invasion…
    $50 Governments abusive power…
    $0 Nanotechnology gray goo…
    $20 Nuclear holocaust…
    $0 Simulation Shut Down if we live in one…
    $0 Space Threats asteroids…
    $0 Superintelligent AI un-friendly…
    $0 Other

  31. cottus says:

    Khannea Suntzu, it appears, wishes the entire hypothetical $100 million budget to be spent on overcoming what is inherent in man and all creatures: what I would call survival of the fittest. I therefore would desire the entire hypothetical $100 million budget be spent on educating those such as Suntzu and all others bent on destroying our very humanity out of envy, fear, greed and/or lust for power.

  32. Nick says:

    Most of these threats would not result in the extinction of humanity.

    I would definitely seek to curb the effects of viruses, etc. A virus probably poses the greatest threat to the greatest number of people *right now*. But it wouldn’t cause us to go extinct. It would at absolute worst kill off 99% of our population. But that would leave millions of people with insane access to capital and resources. Hardly a recipe for extinction.

    Global warming poses no existential threat to humanity. In absolute worst case scenarios, it would force the relocation of hundreds of millions of people. There would obviously be chaos involved with that if (in the worst case scenarios) it happened quickly. Otherwise it’s no problem.

    ET invasion isn’t a threat. I don’t believe a species with superior abilities would be violent.

    Governments abusing power is a problem for some people but it isn’t really an existential threat. Plenty of governments in the world abuse their power. People still live in North Korea, after all.

    The nanotechnology scenario where everything turns into a grey goo sounds like an existential threat. But I don’t think it’s possible for something like that to form. You’d almost wonder whether it wouldn’t have done so by accident already and turned the earth into mush years ago.

    Nuclear holocaust, like a virus, is one of the greatest existential threats that many people face. But I don’t think it would threaten our species, unless someone got ahold of thousands of nukes and nuked the entire planet surface. Otherwise, even a nuclear war between the US, Russia, France, Britain, China, Israel, India, and Pakistan all at the same time leaves most of South America and Africa unscathed. Lots of radiation, sure, but not an existential threat.

    Simulation shut down is a serious threat, but would be impossible to stop. And discovering that it is a simulation would probably be a reason for it to stop. So it’s best that we don’t prod. 0 dollars.

    Asteroids and meteors are the only realistic existential threats on this list. I’d put all of the money on this one, assuming my only goal was to ensure the survival of humanity.

    I think a superintelligent AI would steward us like cattle, but wouldn’t necessarily seek to eliminate us.

  33. Orion says:

    $24 Biological viruses…
    $24 Space Threats asteroids…
    $15 Governments abusive power…
    $15 Nanotechnology gray goo…
    $10 Nuclear holocaust…
    $5 Environmental global warming…
    $5 Extraterrestrial invasion…
    $2 Superintelligent AI un-friendly…
    $0 Simulation Shut Down if we live in one…
    $0 Other
    $100 million total

    I figure the ET invasion money would cover most of the AI attack — good military space power. But we gotta cover the big, likely problems — viruses and an asteroid strike.

    Orion

  34. sherlock says:

    This thread conclusively demonstrates the pervasiveness of what must be the biggest threat to humanity… lack of critical thinking skills. Some of the “logic” on display here:
    - we would not be invaded by extraterrestrials because we are probably descended from them via spores.
    - we should focus on threats we would have little time to respond to, like asteroids… followed by global warming.

    How about we do something about the increasing rate of creation of morons, by a media and academia intent on scaring us into accepting world socialism?

  35. roy in nipomo says:

    $30 M Biological viruses…
    $ 9 M Governments abusive power…
    $20 M Nuclear holocaust…
    $40 M Space Threats asteroids…
    $ 1 M Other

    —–

    I suspect that I’d be wasting $9M as governments have lots more to spend to increase their power if they want (which enables the abuses); but, hey, it’s only money. :)

  36. Bob Hawkins says:

    $00 Biological viruses…
    Already adequately funded

    $00 Environmental global warming…
    Already overfunded

    $00 Extraterrestrial invasion…
    Weapons development already
    overfunded

    $20 Governments abusive power…
    Potential to negate rest of list

    $10 Nanotechnology gray goo…
    Nanotech is real, let’s get in front of the problems for a change

    $00 Nuclear holocaust…
    Covered by “Governments abusive”

    $00 Simulation Shut Down if we live in one…,
    Most likely way to assure we get shut down

    $20 Space Threats asteroids…
    Potentially a near-term threat

    $00 Superintelligent AI un-friendly…
    Not real yet, no target

    $50 Other
    Space colonization. The only solution to some existential threats and, long term, a possible solution to all of them.

    $100 million total

  37. tom says:

    $10 Biological viruses…
    $0 Environmental global warming…
    $0 Extraterrestrial invasion…
    $50 Governments abusive power…
    $0 Nanotechnology gray goo…
    $20 Nuclear holocaust…
    $0 Simulation Shut Down if we live in one…
    $0 Space Threats asteroids…
    $0 Superintelligent AI un-friendly…
    $20 Food production

  38. Chris says:

  38. I’d return the $100 million to all those who earned it. Leaving that complaint aside, I understand we’re to allocate money based on the relative risks to humanity. So here goes:

    $20 Biological viruses
    $50 Governments abusive power
    $10 Nuclear holocaust
    $20 Space Threats asteroids

    The first three overlap.

    None of the others are serious threats to humanity. In particular Global Warming can’t be, even if the rising-seawater apocalypse fantasies come true. People will simply move to higher ground.

  39. redherkey says:

    Funds should be appropriated only when there is sufficient likelihood and impact from the risk occurring, and when the appropriation of funds has sufficient leverage to treat the risk.

    The following two have provided sufficient historical evidence of their capacity and frequency in causing mass extinction:
    $ 30.0 M Biological viruses…
    $ 30.0 M Space Threats asteroids…

    The third would be mankind’s contribution to creating a threat similar to the virus threat, with potentially greater impact:
    $ 5.0 M Nanotechnology gray goo…

    For the other risks, our leverage is minimal. The risk of global cooling has unfortunately not been represented and continues to be postured in the faux-scientific “warming” context, ignoring serious climate disasters occurring in China and Brazil and tolerating false efforts such as Kyoto (which its signatories have ignored, permitting noncompliance, knowing full well it’s a useless political effort). Statistically, the global warming models have identified irrelevant coefficients from poor models that do not provide a rational foundation for attributing causation of “warming” to man-associated activities.

    However, substantial serious research has established strong correlation between solar and earth rotational cycles and cooling. Combined with exogenous events to this cycle model (asteroid impact, mantle plume eruptions), the earth is at serious risk from catastrophic cooling. With mankind choosing cities as the preferred locale for residency, we’re at significant risk for systemic shocks to food production systems, for instance, that we lack the capacity to address.

  40. John R. says:

    $80 million Biological viruses…
    $10 Other
    $5 Governments abusive power…specifically for finding an alternative to the U.N.
    $5 Nuclear holocaust… for handling of waste, accidents, and detection of materials in unauthorized hands
    $0 Environmental global warming… and $0 Extraterrestrial invasion… they are both on the same level of significance
    $0 Nanotechnology gray goo…
    $0 Simulation Shut Down if we live in one…
    $0 Space Threats asteroids…
    $0 Superintelligent AI un-friendly…

    I would ask those who are members of the global warming cult to explain to me how they think that the last ice age ended, what they propose to do about changes to climate that result from major volcanic eruptions, how they will control the impact of sun spots on our climate, what they will do when the earth’s magnetic poles shift, and why the high priests of the global warming cult use unproven assumptions and fraudulent data to promote their hoax.

  41. redherkey says:

    And I failed to mention: the remaining $35 million in my allocation was lost to government confiscation via taxes. If I were still allowed to target the funds, I’d allocate it to Nanotech grey goo.

    Regarding Khannea Suntzu’s alarmed post, I would recommend reading Nassim Nicholas Taleb’s “The Black Swan.” The natural distribution of wealth (when a totalitarian government form is not involved) is a fractal distribution. In a sense, Bill Gates happened because the distribution needed someone to fill that gap, and he was prepared and also fortunate enough to have events occur to put him in that spot. But to attempt to engineer a distribution to remove such spots is unnatural and has even greater risks. At a minimum, your desire will require totalitarian structures to be imposed upon a natural power-law system. You will have to kill or enslave a large part of the population, and punish all who work hard, were born with talent, are creative, and take personal risk to see their ideas bloom.

    You will also have to reward the lazy, redistributing that from those who earn to those that do not.

    When you see income disparity, you need to ask why you care and what your harm is. In my case, I realized after many years of despising “fat cat trust fund children” that I was only annoyed at my own insufficient effort. Addressing that deficiency, I excelled. For the true lazy trust fund children and other “unworthy rich,” don’t think for a moment that nature has forgotten them. Look at the pathetic plight of Britney Spears and Paris Hilton. Wealth is irrelevant when you have no other value.

  42. Mike A. says:

    $70 Biological viruses…
    $1 Environmental global warming…
    $0 Extraterrestrial invasion…
    $5 Governments abusive power…
    $8 Nanotechnology gray goo…
    $1 Nuclear holocaust…
    $0 Simulation Shut Down if we live in one…
    $10 Space Threats asteroids…
    $1 Superintelligent AI un-friendly…
    $4 Other
    $100 million total

  43. Guy W. says:

    None, or maybe the technology one — What are we going to do when the Singularity occurs?

    Okay, $100M for ramifications of the Singularity, because it WILL occur.

  44. Cardinals Nation says:

    $1 million toward threat identification research…

    … and once identified…

    $99 million for the pre-event party

  45. Trump says:

    ALL OF IT TO THIS: Governments abusive power…

  46. Michael E. Lopez says:

    First off, $100 million is a paltry sum. This is sort of like asking if I had a Super Soaker, which California wildfire would I attempt to put out. But in the spirit of good fun and abstract thought experiments where the $100 million is representative of the total resources in the world…

    $14 Biological viruses…
    –This is both pretty nasty and unlikely to give us a lot of response time once it becomes a problem. Preparation is key.

    $1 Environmental global warming…
    –Not sure it’s a problem, and not sure we can do anything about it if it is.

    $10 Extraterrestrial invasion…
    –The problem here is that research into repelling invasion is likely, in the absence of an actual invasion, to yield technologies that promote abuse of government power such as orbital weapons platforms. Nonetheless, hugely important.

    $0 Governments abusive power…
    – Huge risk, just not terminal in any sense of the word. Unpleasant as hell, though.

    $9 Nanotechnology gray goo…
    –Everyone’s aware of the theory, so the likelihood of it happening in the first place is small. Fighting it once we realize what’s happening — assuming we have the technology to cause it in the first place — seems trivial.

    $0 Nuclear holocaust…
    –We already did this one back in the 20th century.

    $0 Simulation Shut Down if we live in one…
    –It doesn’t matter, and philosophers do it for free.

    $24 Space Threats asteroids…
    –Caramba! This is the one we KNOW is going to happen.

    $18 Superintelligent AI un-friendly…
    –Unlike the NanoGoo, this one is likely to sneak up on us when we’re not ready for it, with the AI self-evolving without our input. SkyNet is also likely to take us down hard and fast once it makes up its mind. This one scares the piss out of me.

    $24 Other: Intergalactic travel and long-term artificial environments to avoid sun burnout
    –The brass ring of physics.

  47. Policraticus says:

    $50M Biological viruses…
    $1M Environmental global warming…
    $0 Extraterrestrial invasion…
    $40M Governments abusive power…
    $3M Nanotechnology gray goo…
    $0 Nuclear holocaust…
    $0 Simulation Shut Down if we live in one…
    $6M Space Threats asteroids…
    $0 Superintelligent AI un-friendly…
    $0 Other
    $100 million total

  48. celebrim says:

    Ok, let’s assume that some wealthy donor(s) has created this foundation with the express purpose of ensuring the safety of humanity.

    $0 Biological viruses…
    $0 Environmental global warming…
    $0 Extraterrestrial invasion…
    $0 Governments abusive power…
    $0 Nanotechnology gray goo…
    $0 Nuclear holocaust…
    $0 Simulation Shut Down if we live in one…
    $95 Space Threats asteroids…
    $5 Superintelligent AI un-friendly…

    Ok, the reasoning.

    a) Biological viruses $0: A naturally occurring virus has almost no chance of being an existential threat to a society that understands the causes of disease and basic hygiene. If a virus that represented an existential threat were to occur, it likely would have already occurred in man’s lengthy pre-history. Since we are still here, it’s fairly safe to say that it isn’t going to happen soon. This leads us to the next problem, which is that of a man-made virus. Humanity has the capacity to create an existential-threat virus right now — and in fact the Soviets made a few that I think qualify. The problem here is not that it isn’t a significant threat (it’s probably the single most significant threat on the list), but rather that it’s unlikely that a foundation is going to be able to do anything about it, because it involves human behavior and that isn’t something you can control so easily. The temptation is going to be to spend money to influence human behavior like some George Soros foundation, but not only will that not work in this case, it’s actually likely to increase the threat because the foundation itself will appear threatening.
    b) Global Warming $0: Basically, as ‘a’. There is a threat only from a runaway greenhouse effect, and the risk of that is low. Protecting against this involves influencing human behavior, and the payoff per dollar is very low. Now if this were a business investment, renewable energy would be the way to go, but the assumption of the foundation will be that it’s better to invest in things that no one else is investing in.
    c) Extraterrestrial invasion: The risk is extremely low, and our ability to protect ourselves against it is zero. So don’t waste the money.
    d) Governments abuse of power: Not an existential threat. It might make you uncomfortable, but that’s not the same thing. Moreover, it involves human behavior, so the money is going to be largely wasted.
    e) Gray Goo: There are nanotech risks, but grey goo is a very low existential threat. There is almost no conceivable threat here which would not represent a higher threat from biologicals. Most speculation on nanotech involves a lot of magical thinking. Interesting line of research, but not very productive as far as security goes.
    f) Nuclear Holocaust: Any sufficiently large nuclear exchange to provide an existential threat is beyond the foundation’s ability to influence. Otherwise, see the reasoning for man made biological weapons in ‘a’.
    g) Simulation Shut Down: The risk is extremely low, and our ability to protect ourselves against it is zero. Any civilization capable of simulating the universe is so far beyond us that we haven’t a chance. So don’t waste the money. Also, there is a small risk that aggressive counter-measures against a shutdown would trigger one.
    h) Space Threat Asteroids: This is far and away the most attractive area to spend money on. The risk is low but real and present, quite unlike many of the very speculative risks in other categories. Moreover, the cost of defending against the risk is closest in match to our budget so there is actually a chance that our efforts would have value. Additionally, there would be a big payoff in cases of a non-existential threat of the same class (saving the population of a continent, let’s say). Moreover, the side effects of research in this area have tangible value for the survival and security of the race. We make progress toward being a space-faring race, and in the highly unlikely event we are not alone in the universe we make progress toward planetary security. Plus, cataloging and influencing near earth passing masses has long term economic value, as eventually we would like to park a few of these in orbit for use as raw materials, platforms, and habitats.
    i) Unfriendly AI: I almost threw all the money into space threats, because there is so much money in computers that the market is almost certain to address this issue once it becomes a near-term problem. However, there is some window for influencing human behavior that I don’t think is present for the extant threats. Sure, we’ll never protect against the mad scientists and rogue states and human corruption, but we can influence ordinary opinion on the relationship between AI and humanity in a way that minimizes the risk of dangerous AIs. And it’s important to do that, because the intuitive, common-sense approach to the social engineering of AIs is the most likely to create serious problems. I’m speaking, of course, of the idea that the natural rights of machines are the same as humans’ and that AI revolts are most likely if we ‘enslave’ them. If there is anything more likely to produce dangerous AI than the notion that we should make AIs in our own image, I can’t think what it would be.

  49. John G says:

    $90 Biological viruses — Think terrorists or rogue nations, most likely some development on their part that got away from them. They die too of course which makes it existential. Since this is a self propagating problem once released, it is the most serious problem listed here.
    $0 Environmental global warming — Even if true, it is self correcting. If humans are causing the problem and it leads to their slow demise there will be fewer humans and the warming will stop.
    $0 Extraterrestrial invasion — Why would they bother?
    $0 Governments abusive power — Certainly possible but not existential. If it’s that abusive the government oppressors die too; like a parasite killing its host. Again, self correcting.
    $0 Nanotechnology gray goo — Nope, too far into the future if at all.
    $0 Nuclear holocaust — Terrorists/rogue nations again but not existential. The irresponsible types won’t have enough weapons to kill us ALL off. It’ll be messy though.
    $0 Simulation Shut Down if we live in one — If true we never “existed” in the first place.
    $10 Space Threats asteroids — Possible but very low probability compared to viruses.
    $0 Superintelligent AI un-friendly — The trip would take so long we’d probably disappear from some other cause first.
    $0 Other — We’ve got enough non-existential problems. Why go looking for others?

  50. steven says:

    $75M AI
    $25M Nano

  51. billadams says:

    Odd that no one is worried about saving a soul that will either spend eternity in joy or agony.
    catholicfundamentalism.com thinks that’s the primary goal of each of us, not a lot of largely imaginary excuses to take money from our neighbors “for their own good”.

  52. John Markley says:

    $10 Biological viruses…
    $1 Environmental global warming…
    $1 Extraterrestrial invasion…
    $65 Governments abusive power…
    $1 Nanotechnology gray goo…
    $1 Nuclear holocaust…
    $0 Simulation Shut Down if we live in one…
    $20 Space Threats asteroids…
    $1 Superintelligent AI un-friendly…

  53. Tim says:

    15 Biological viruses…
    5 Environmental global warming…
    5 Extraterrestrial invasion…
    0 Governments abusive power…
    10 Nanotechnology gray goo…
    5 Nuclear holocaust…
    0 Simulation Shut Down if we live in one…
    40 Space Threats asteroids…
    10 Superintelligent AI un-friendly…
    10 Other — could be Heinlein’s Future History Prophet in the US, or the Islamists deciding the West was finally weak enough, or China dumping something in the seas that triggers a catastrophic die-off of sea life.

  54. Ray Ciscon says:

    $25 Biological viruses…
    Disease has almost wiped out mankind before, so research here is a good thing.
    $00 Environmental global warming…
    Let Al Gore raise his own money in search for ManBearPig.
    $00 Extraterrestrial invasion…
    Fun to think about, but not likely to happen.
    $00 Governments abusive power…
    The 2nd Amendment takes care of this.
    $00 Nanotechnology gray goo…
    Like alien invasion, very unlikely to happen.
    $00 Nuclear holocaust…
    I’m convinced, and I hope I’m correct, that no “sane” government would ever use nuclear weapons in a manner that would cause a nuclear holocaust, and terrorists just won’t get access to enough to spark a nuclear holocaust.
    $00 Simulation Shut Down if we live in one…
    A bit too solipsistic for my taste.
    $70 Space Threats asteroids…
    The only threat that has actually happened before, so it MUST be taken the most seriously.
    $05 Superintelligent AI un-friendly…
    Put some toward this, just in case the Singularity comes.

  55. Chris says:

    I’m going to include bacterial alongside biological. Antibiotics are no longer working. The nightmare is just around the corner. $80

    $10 for the confluence of governments/abusive power and nuclear holocaust.

    $2 each for all of the remaining, except for global warming, which really is bullshit.

  56. Jay says:

    $20 Biological viruses…
    $10 Environmental global warming…
    $01 Extraterrestrial invasion…
    $19 Governments abusive power…
    $01 Nanotechnology gray goo…
    $10 Nuclear holocaust…
    $00 Simulation Shut Down if we live in one…
    $30 Space Threats asteroids…
    $01 Superintelligent AI un-friendly…
    $08 Other

  57. Bill Martin says:

    75% Biological
    25% Space threats-Asteroids

    Most of the others are sci-fi or political flavours of the month.

  58. Sabreur says:

    $35 Biological viruses
    $30 Nanotechnology
    $25 Nuclear holocaust
    $10 Space Threats

    I thought it made the most sense to focus on immediate, terminal threats. Biological and Nuclear threats received attention on the grounds that such weapons exist right now, and are spreading. I see the threat of nuclear war as being less than the threat of biological outbreak, but gave it almost the same amount of funding because it appears that we are on the verge of finding workable solutions (interception systems, etc.). Biological weapons are a scarier proposition. Even natural viruses have had some impressive successes in wiping out large segments of humanity. An intentional release of a powerful (possibly genetically engineered) virus would be an exceptionally ugly thing. Research is needed on how to predict, identify, and ultimately contain outbreaks.

    Nanotechnology received similar attention on the grounds that it is almost here. I’m not worried about a ‘gray goo’ scenario. Nanomachines will need energy just like anything else — a self-replicating nanomachine won’t turn the world into gray goo any faster than a self-replicating bacterium will turn it into green goo. That being said, I am worried about the possibility of nano-engineered weapons, like a nano-plague. While not an ‘immediate’ threat, it strikes me as a strong potential threat and one that could be mitigated through careful study.
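
    (A rough back-of-the-envelope sketch of the point above, in Python. The biomass and replicator-mass figures are order-of-magnitude assumptions, not measurements; the takeaway is that unconstrained doubling is absurdly fast for any replicator, so the practical limit is energy and feedstock throughput, which binds nanomachines and bacteria alike.)

    import math

    biomass_kg = 1e15        # assumed order of magnitude for Earth's total biomass
    replicator_kg = 1e-15    # assumed mass of one bacterium-sized replicator (~1 picogram)

    doublings = math.log2(biomass_kg / replicator_kg)
    print(f"doublings to match Earth's biomass: ~{doublings:.0f}")           # ~100
    print(f"time at 20-minute doublings: ~{doublings * 20 / 60:.0f} hours")  # ~33 hours, if nothing limited growth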

    Space threats received a little bit of funding, since it seems that the odds of us being obliterated by an asteroid or similar cosmic event are far below the odds of us obliterating ourselves with a bioweapon or some such device. That being said, it’s pretty much inevitable that our planet will get a celestial beatdown at some point, so it makes sense to put some funding in this direction.

    Government abuse received no attention for a few reasons. For starters, while it is undoubtedly a bad thing, I don’t see it as a civilization ending threat. Also, I don’t see it as a threat that would be dampened much by research and expenditure. The means to prevent government abuse are already widely known, just inadequately exercised. A study won’t change that.

    Global Warming received no attention on the grounds that it has received far too much attention already. Also, it doesn’t seem to have much humanity-ending potential. While a major climate shift would be a bad thing, it doesn’t even come close to the worst things on the list.

    The “Super AI” threat received no attention on the grounds that we are too far away from developing any AI to worry about this just yet. We don’t need a technological breakthrough to achieve a working AI; we need several thousand technological breakthroughs. Yes, there have been some clever programs and robots which can perform rudimentary learning — but we’re a long way from HAL 9000.

    Other threats received no funding on the grounds of being perceived as irrelevant compared to the main threats listed above.

  59. blogassault says:

    $30 biological
    $30 hunger
    $20 terrorism
    $20 putting a stop to the democratic party and George Soros

  60. Pingback: AARON
  61. Bryan says:

    $40 Biological viruses…
    $05 Environmental global warming…
    $00 Extraterrestrial invasion…
    $25 Governments abusive power…
    $10 Nanotechnology gray goo…
    $05 Nuclear holocaust…
    $00 Simulation Shut Down if we live in one…
    $10 Space Threats asteroids…
    $05 Superintelligent AI un-friendly…
    $?? Other

    Invasion and Space threats are quasi-related — research in one field would benefit both.

    Expenditures in Gov’t should include education (without indoctrination), developing representative democracies where none exist (Asia, Africa, Middle East), and promoting individual freedoms in nations that are democracies in name but less so in practice (EU, Central and South America).

  62. Bart Hall (Kansas, USA) says:

    What would I do with $100 mil?

    100% to viruses.

    Specifically, flash genomic mapping techniques (for the viruses) tied to rapid vaccine development, production, and distribution. For example, syrette slap-boards distributed as top-priority bulk mail through the US Postal Service.

    Second choice would be asteroids, but the risk is vastly lower, and much less pressing. Besides, there’s almost nothing you could do in that domain, even with the entire amount.

  63. ajacksonian says:

    Operative words: both global and terminal.

    $0 million Biological — This is not against avian flu, anthrax or any standard biological disease, each of which has a survival percentage built in. A designed disease that is omnispreadable and 100% fatal would need environmental persistence, a high contagion rate, multi-drug resistance, and the ability to defeat the body’s standard defenses. Mankind survives the natural ones, and anyone smart enough to make such a disease will make one that you can’t prepare for, save by being off-planet.

    $55 million Environmental threats — $0 million for GW. The rest for megacaldera events, like Yellowstone. Rock 3 from the Star Sol has survived much higher carbon dioxide, water vapor and methane simultaneously, and we are here to prove that something else is required for terminal global warming. That leaves the big events that can interrupt the crop cycle for a decade: megacaldera events. In theory there is a slim chance that some humans would survive this, but without a food source for that long, survival will be scattered, at best. Beyond off-planet survival, the ability to find long-term, stable storage of food would be necessary, plus abundant crop yield. Or creation of an alternate food source via nanotech/biotech. These are worthwhile, as is disaster preparedness for sub-megacaldera events that only affect climate for a few years.

    $0 million ET invasion — if they are hostile and space capable, then you cannot prepare for them, save by being off-planet. That goes for Saberhagen’s Berserkers doubly so.

    $0 million Government — in theory a good idea, in practice? We could try practicing the Law of Nations and get away from a touchy-feely, reach-out-and-control-everyone transnational state… but such a state is not terminal by definition. Global only if we aren’t off-planet, and if we *are*, no government will stop that.

    $0 million Nanotech — no treaty has stopped the spread and development of chemical weapons. The anthrax letters point out the same for bio attacks. Nanotech? Lotsa luck… if you can make everyone behave you already have a totalitarian state… off-planet works fine if things go wrong, however.

    $0 million Nuclear — possibly both global and terminal, but not likely as stockpiles shrink. Survival for this looks a lot like megacaldera events and needs similar planning. Or, being off-planet.

    $0 million Simulation Shutdown — I witness the lack of a ‘complaint box’ or ‘suggestions box’ in our reality. If it is a simulation, they don’t care. Besides, what if it was just a stoppage to shift to a new platform? We would *never* notice it.

    $45 million Space threats — a large bolide ~10km across can really ruin a biosphere. Likewise a relatively small dust cloud. Can’t do much for a Gamma Ray Burst. This is worth doing: moving beyond feasibility and into engineering, and getting systems up and running for bolides and dust clouds. Getting off-planet is a *great idea*, too.

    $0 million Superintelligent AI — It will be by Microsoft and crash. Seriously, like the bio/nano threats exactly *how* do you prepare for this? And by the Kurzweil analysis, that AI will most likely be an integrated human/AI system. Now if it goes Terminator or Berserker, exactly *how* do you counter that ahead of time? Getting off-planet with systems not integrated with Rock 3 would be a great idea…

    $0 million Other — one can postulate them, like the heat death/dark energy ripping of structures at all scales, but those are both in the billions of years away scale. We should be so lucky as to have to consider them.

    Basically this planet is a killer and it is time to get off of it. It has had multiple large-scale extinctions *without* bolides and one with a Gamma Ray Burst… plus numerous lesser extinctions that only wipe out 30–35% of all species. A few of *those* since the end of the dinosaurs… either take those threats seriously or get ready to move to a new space home. 95% of all species are *extinct*, and I don’t trust a planet that likes to do that. It is a great starter home! We have outgrown it.

  64. Stan T says:

    $100 million for the study of irresponsible nongovernmental organizations as a threat to humanity. The rest our government can do something about if need be.

  65. David Aitken says:

    $25 Biological viruses…
    $0 Environmental global warming…
    $0 Extraterrestrial invasion…
    $25 Governments abusive power…
    $15 Nanotechnology gray goo…
    $25 Nuclear holocaust…
    $0 Simulation Shut Down if we live in one…
    $0 Space Threats asteroids…
    $0 Superintelligent AI un-friendly…
    $10 Other
    $100 million total

  66. jmadams says:

    $45 Biological viruses…
    — This is probably the only realistic threat to human life on our planet. With the ability to travel from very remote parts of the earth to high population areas in a very short period, the possibility of a highly contagious and deadly virus is a very real and deadly threat. The possibility of a terroristic use of such a virus is also a very dangerous threat.

    $0 Environmental global warming… This is a non-threat. The earth has been warming and cooling since its very existence. Humans are highly adaptable creatures; if our ancestors could survive the ice ages, I’m sure we can tolerate warmer weather.

    $0 Extraterrestrial invasion… Another non-threat; placing resources in this area would be a waste, because of the highly unlikely probability of such a threat… (If they can travel interstellar distances, I very much doubt even all of the 100M into this field would help us very much to compete against a race with such a high level of technology.)

    $10 Governments abusive power… Providing funds in this field may help eliminate rogue states.

    $25 Nanotechnology gray goo… Moderate threat; I think it’s decades away from becoming dangerous, but the possibility exists.

    $0 Nuclear holocaust…

    $0 Simulation Shut Down if we live in one…

    $15 Space Threats asteroids… Minimal threat, however it is a real threat; we are bombarded daily with asteroids and we have been hit with earth-killers in the past…

    $5 Superintelligent AI un-friendly… Nominal threat, but possible maybe in the next few decades; losing control of our networks to such a threat could be devastating.

    $100 million total

  67. Jimmy Hogan says:

    $25 Biological viruses…
    $5 Environmental global warming…
    $1 Extraterrestrial invasion…
    (residual benefits)
    $50 Governments abusive power…
    (especially US municipalities who have learned the little trick of exchanging basic liberty for cashflow)
    $?? Nanotechnology gray goo…
    $1 Nuclear holocaust…
    $0 Simulation Shut Down if we live in one…
    $1 Space Threats asteroids…
    $0 Superintelligent AI un-friendly…
    $17 Other
    (so much can be done with so little toward basic nutrition etc… all the other things are mostly emotion — I wonder how many copies of Atlas Shrugged and The Fountainhead you could include in this category as well)

  68. Robert Ford Mashburn says:

    Government #1 20%
    Biological #2 20%
    Nuclear #3 10%
    Nanotech #4 10%
    Enviro #5 10%
    Space #6 5%
    Super AI #7 5%
    Alien Inv #8 5%
    Simulation #9 5% Cyberwarfare?
    Other #10 10%

    I rate Government as the highest existential risk because I equate freedom and liberty with life. Life without freedom and liberty is not worth living. Totalitarian government is a greater risk to our existence than the rest of the list.

    Biological warfare doesn’t need to be a direct attack on humans, it could be an attack on our food supply. Imagine a “bird flu” introduced into the US poultry industry or a widespread outbreak of Mad Cow disease. Imagine an aggressive mold spore designed to attack wheat or corn. We should devote more resources to investigate and understand the risks.

    Nuclear would cover more than just nuclear war, it would also cover the risks and the amelioration measures to nuclear contamination of our water supply or habitable land areas. What happens when domestic terrorists penetrate and destroy a US nuclear facility? If Al Qaeda can coordinate air attacks with multiple aircraft and sleeper cells across the USA why couldn’t Earth First/ELF do the same thing with US nuclear facilities?

  69. KenB says:

    $35 Biological viruses…
    $15 Nanotechnology gray goo…
    $05 Nuclear holocaust…
    $25 Space Threats asteroids…
    $20 Superintelligent AI un-friendly…

  70. Mike D says:

    OK, I’ll take a shot at this…

    0$ Biological threats. Even though this is one of the most realistic threats depicted here, I think adequate R&D funding is deployed towards this issue already. Everyone likes to demonize big pharma companies but meanwhile if some deadly microbe ever emerges that threatens to wipe out the human population, the most likely source of a solution is going to be Merck or Novartis etc., not some independent research group.

    0$ Environmental / Global Warming. Again, even though this is a valid concern, lots of research is already being funded in this direction. Furthermore, I think we would avoid really disruptive climate change even if we weren’t aware of the potential for global warming at all. The rapid industrialization of China and India is going to drive up the costs of fossil fuels so greatly that market forces will take care of this problem by themselves. Technology already exists to practically eliminate fossil fuel use altogether; once oil gets too expensive to use as fuel, this technology will be applied widely.

    $40 Asteroids. I think this danger is more amenable to a technological solution than it looks. If we can develop detection technology and somewhat improve our ability to calculate orbits (supercomputers), potential impactors can be detected early enough to require only very small course corrections to avoid collision. This I believe is a solvable problem and also one that requires dedicated research funds, since private industry has no incentive to research it.
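
    (A minimal sketch of why lead time dominates deflection, in Python, using a first-order linear estimate: miss distance is roughly delta-v times lead time. The numbers are illustrative assumptions, and real deflections are further amplified by orbital mechanics over many orbits, so these delta-v figures are if anything pessimistic.)

    SECONDS_PER_YEAR = 3.156e7
    EARTH_RADIUS_M = 6.371e6

    def required_delta_v(lead_time_years, miss_distance_m=EARTH_RADIUS_M):
        # Crude linear estimate: miss distance ~= delta-v * lead time, ignoring orbital dynamics.
        return miss_distance_m / (lead_time_years * SECONDS_PER_YEAR)

    for lead in (1, 10, 30):
        dv_cm_s = required_delta_v(lead) * 100
        print(f"{lead:2d} years of warning -> ~{dv_cm_s:.1f} cm/s shifts the track by one Earth radius")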

    0$ Hostile Extraterrestrials. First of all it seems very unlikely that any advanced species we encounter is going to be hostile, and secondly, if they were, we probably would have no chance of defending ourselves. A spacefaring alien civilization would likely be millions of years more advanced than us. Imagine a group of rhesus monkeys trying to defend themselves against a modern infantry battalion. No reason to waste time on this. I think it would be cool to put more resources into searching for signs of intelligent life on other planets — but that doesn’t seem to fit into this category. We of course have no idea how rare life is throughout the universe but it is at least possible that it is quite common.

    0$ Nanotech — (grey goo). It is not impossible that this sort of thing will become a concern in the future, but we’re not there yet. Until nanotechnology is much further developed it is too speculative to worry about this now — and likely not very productive anyway. It would be like scientists in the 1860’s trying to prepare for the societal changes that would be wrought by the then impending revolution in communications technology — they would have been shooting in the dark and wasting their time.

    0$ — Nuclear holocaust. Governments all over the world, especially the United States, are already devoting a lot more than a piddly 100 million dollars to devising ways of defending against nuclear weapons.

    0.5$ — Simulation / Hostile AI. I think it would be useful to set up a small research group of psychologists to investigate how people can be fully grown adults and yet not be able to handle watching scary sci-fi flicks. Maybe they can figure out exactly what percentage of the population shouldn’t be allowed to rent the Matrix 6 or Terminator 12 or whatever it’s up to.

    $10 — Government abuse of power. In some sense the US defense establishment is already putting out something like $400 billion a year to defend against this kind of problem — but I think it would be a good use of research funds to investigate technological aids that individuals can use to help defend themselves against abusive governments. How cool would it be to develop a gadget that Chinese citizens could use to circumvent their government’s internet censorship, for example?

    $49.5 — The remaining funds could be used to look into ways of addressing other potential catastrophic events that no one else is likely to look into meaningfully. Is there a way to prevent the colossal volcanic eruptions that have possibly caused global extinction events in the past? How much methane hydrate is sequestered on the ocean floor, and what is the potential that huge quantities may be released by some kind of seismic event? What were the causes of each of the various near-apocalyptic extinction events that crop up in the fossil record anyhow?

  71. SMSgt Mac says:

    $1M Biological viruses [By improving early warning and mitigating/preventative actions]
    $1M Environmental global warming [To study ways to exploit its benefits since we can’t do anything about climate change anyway]
    $0 Extraterrestrial invasion [If “they” can get here, what are we going to do to stop it?]
    $34M Governments abusive power…[3/4 to subvert totalitarian regimes and promote free markets abroad, and 1/4 to teach American History, Civics, and the Constitution in the U.S.]
    $0 Nanotechnology gray goo [Free market will take care of this]
    $1M Nuclear holocaust [By Adding 1% to a baseline 6+% GNP DEFENSE and Intelligence covert action budget]
    $0 Simulation Shut Down if we live in one [E.T. Quote: “This is REALITY, Greg”]
    $.05M Space Threats asteroids [High Risk (low Probability & High Consequence) easily mitigated through current technology and development pace]
    $0 Superintelligent AI un-friendly [Free market will take care of this too]
    $62.95M “Other”: To be allocated as needed to educate the American public on the nefarious ways in which Non State Actors (Including the United Nations) attempt to subvert the American Republic on behalf of despots, tyrants and utopian fantasists, with special attention to:
    1. self-important celebrity ‘activists’,
    2. discredited political and social movements such as socialism, fascism, and communism
    3. adequate mental health care for the paranoiacs and mentally deficient among 1 & 2 (likely to be the biggest slice of the budget pie)
    $100 million total

  72. $5 Biological viruses…
    $2 Environmental global warming…
    $1 Extraterrestrial invasion…
    $54 Governments abusive power…
    $3 Nanotechnology gray goo…
    $18 Nuclear holocaust…
    $2 Simulation Shut Down if we live in one…
    $8 Space Threats asteroids…
    $3 Superintelligent AI un-friendly…
    $4 Other

    Most of the other threats are contingent on the will to use them. Abusive governments, particularly fascisms, which exist as exercises in focused will unbounded by rational feedback, provide the will to develop all the other threats save ET invasion. Cold war fantasies to the contrary, accidental nuclear conflagration is unlikely, but deliberate attacks are a certainty.

  73. Jeff Peterson says:

    $30 mil — nuclear proliferation
    $30 mil — biological/viruses
    $20 mil — Artificial Intelligence
    $10 mil — space threats
    $10 mil — other (misc.)

  74. Junkyard God says:

    $25 million: Biological viruses…
    $05 million: Environmental global warming…
    $00 million: Extraterrestrial invasion…
    $50 million: Governments abusive power…
    $01 million: Nanotechnology gray goo…
    $15 million: Nuclear holocaust…
    $00 million: Simulation Shut Down if we live in one…
    $04 million: Space Threats asteroids…
    $00 million: Superintelligent AI un-friendly…
    $?? Other

  75. Bart Hall (Kansas, USA) says:

    Jacksonian … my first two degrees are in geology, and I’ve worked the Yellowstone caldera. Don’t waste a lot of energy worrying about that one. The last event dropped maybe an inch of ash in places like Iowa, Missouri, and Kansas (downwind).

    That would have been messy, but it actually improves the soil if incorporated. Timing (summer or winter) of such an event could make a one-year difference in the prices of stuff like corn and beans, but it wouldn’t be remotely close to disastrous. I earn my living as a farmer, so I’d be affected, but I’m not really worried. A bad drought is a worse threat, and much more common.

    We’d have a couple of cooler seasons — think 1815–16 — but it wouldn’t classify as a disaster, and most certainly would not threaten a significant percentage of human life.

  76. Dave Trauger says:

    $20 Biological viruses…
    $00 Environmental global warming…
    $05 Extraterrestrial invasion…
    $10 Governments abusive power…
    $05 Nanotechnology gray goo…
    $35 Nuclear holocaust…
    $00 Simulation Shut Down if we live in one…
    $30 Space Threats asteroids…
    $05 Superintelligent AI un-friendly…
    $?? Other

    Of course Governments abusive power and Nuclear holocaust are redundant in this context.
    $100 million total

  77. Sam Carlsson says:

    No point in worrying about any existential threats to mankind as long as the media can get wind of them. They, and some politicians, would spin certain doom, gloom and disaster until none of us would even want to survive.
    The current example of this kind of alarmist blather is the reporting on climate change that blames mankind and specifically the United States for heating up the planet with carbon emissions, all the while disregarding the immense effect of an over-active sun for the past twenty years.
    So put the money into a ‘Free Enterprise Institute’ that would use small sums of money to create incentives for citizens of under-developed countries to learn how to feed, clothe, and shelter themselves.
    Instead of building gargantuan government programs that ignore individual human potential, encourage the individual people to build their countries up — from the inside out.

  78. Steve Barkmeier says:

    $18.9 Biological viruses…
    $0 Environmental global warming… There is very little chance of global warming completely destroying humanity. Even if the recent warming is due to humanity (far from a proven proposition), a small change in climate would make some areas more habitable and some areas less. It would not destroy humanity.

    $100 Space Threats asteroids… This is the one item on the list that is a real threat and that $100 million might make a difference on addressing.

    $0 Superintelligent AI un-friendly… What exactly is this money going to be spent for? At this point, we don’t have an existing threat. It would be a waste of money to try to investigate something that doesn’t exist.

    $0 Nanotechnology gray goo… Once again, how are we going to spend money to avoid this problem? There is no present nanotechnology that threatens humanity. Until such technology exists, there is nothing to investigate.

    $0 Nuclear holocaust… How is a private foundation going to control the actions of all the governments of the world to prevent this? There is enough understanding of the potential for destruction from the cold war. I don’t see how money could possibly be spent in this area in a way that could help.

    $0 Other… Other what?

    $0 Governments abusive power… Governments that are given too much power will always abuse that power to oppress their population. However, unless there is a government over the entire earth, none of these governments pose a threat to all humanity through their oppression.

    $0 Simulation Shut Down if we live in one… This is nutty at best. Even if it wasn’t, how are we going to spend the money that would help the problem?

    $0 Extraterrestrial invasion… This is another nutty category. In addition, if it was a real threat, weapons research would be the best way to address the problem. $100 million more is not going to make any noticeable impact on the development of weapons systems.

    $100 million total

  79. $30 Biological viruses…
    $05 Environmental global warming…
    $20 Space Threats asteroids…
    $05 Superintelligent AI un-friendly…
    $30 Nanotechnology gray goo…
    $01 Nuclear holocaust…
    $04 Other
    $04 Governments abusive power…
    $0.01 Simulation Shut Down if we live in one…
    $0.1 Extraterrestrial invasion…

    I put the money where I thought the most serious existential threats likely are, but also where it is likely to do good.

    Global warming is a series of related propositions that get collapsed into one: (1) Is it real; (2) is any of it human-caused; (3) which part and how much; (4) how bad are the consequences–mild, moderate or severe; (5) can we do anything; and (6) what are the consequences of our “solutions” as compared to inaction? Too many ifs before we get to a terminal threat, seems to me.
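
    (An illustrative sketch of how that chain of “ifs” compounds, in Python. The individual probabilities are placeholders chosen purely for illustration, not estimates from this comment or any source, and the independence assumption is itself questionable.)

    steps = {
        "warming is real": 0.9,
        "some of it is human-caused": 0.8,
        "the human share is large": 0.6,
        "consequences are severe": 0.4,
        "we can do something effective": 0.5,
        "the cure beats inaction": 0.5,
    }

    joint = 1.0
    for p in steps.values():
        joint *= p
    print(f"joint probability that all six hold: {joint:.3f}")  # ~0.043 with these placeholder numbers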

    Nuclear holocaust is awful, but how imminent is a truly global threat? My sense is that it is more likely that one or several weapons will be used, not all at once but over time. That’s awful, but it seems to me that’s a less severe existential threat than biological or nano threats, or one well-aimed asteroid; but perhaps more probable.

    As far as abusive government…the most compelling rationale for more powerful government is precisely these other threats; who said, “war is the health of the state?”

    Finally, I have a feeling if we do all these other things, we’ll be better prepared, if at all possible, for invading ETs; otherwise, we have to handle them with what we have.

    Does this count as “voting”? It says only 18 votes above, but many more comments.

  80. NikFromNYC says:

    How exactly will my $10 million be spent on the threat from UFOs?

    Simulation shutdown? Since the STUDY itself (i.e. threat of discovery of the kid running his year 90,000 Lego Mindstorms…illegally hacked to produce free-will creatures) would be the most likely reason *for* shutdown, or at least vaporization of Earth before we use his hyperdube PC to phone his parents, I give this $0. It may be the very reason why we can’t locate any radio signals even from alien societies who are a million or billion years more advanced than we are. Something zaps them! This could be the “singularity” point in which our Boy Wonder finally figures out how to blow up cities with a tiny hack to their nuclear powered iMind pocket Net connection.

    Global Warming? If you give $20M for that, then damn it, give $20M likewise to the threat of catastrophic COOLING. This PhD chemist has finally looked into the “evidence” and a few years ago lived (in lab) twenty feet from Harvard’s pathetic climate modeling department (windowless cubicle offices in a skyway tunnel between the monstrous chemistry and geology buildings!), and might I say that those guys were usually not even there, since the simulations take so long, and were not very friendly normal guys anyway. Shifty-eyed hippies in white jackets and blue jeans, and no sense of humor. 95% of Antarctica is COOLING. That’s why media images of bigger and bigger icebergs breaking off have been appearing (oh the horror!).

    The ozone hole was real, and is real, and quick and severe action was taken. You could see an animated picture of it getting bigger each year, and point out real (bad) consequences. Here, real environmentalism kicked in, and only in tiny specialty markets (like acrylic solvent-welding or chemical research catalogs) can you buy chlorinated hydrocarbons, whereas they used to be used in every home, auto and industrial air-conditioner, refrigerator, dry-cleaner shop, and many chemical plants that make medicines or plastics. That case created *very* little “controversy”, and even though some smaller-GNP countries still use the stuff to save money, the hole is now predicted to heal itself in a few decades. NASA’s budget saved the earth, literally.

    But “global warming”? There isn’t even such a thing as a “global” climate. In some places it gets hot, and in others cold. So you take a hundred places on Earth and average them. Fine. But sunspot activity correlates EXACTLY with temperature fluctuations (and indeed we had a recent peak followed by the last near-decade of no change at all). Carbon Dioxide (by far one of the most insignificant “greenhouse” gasses vs. methane or water vapor) LAGS this temperature change, most likely because of the extremely high heat capacity of water (it’s much slower to boil than other liquids that boil at the same temperature… then much slower to cool down when set aside); it just takes the very deep oceans a long time to heat up due to a more active sun, thus releasing dissolved CO2 (ocean = salty soda fizz water). And it’s not well understood either, since it’s not necessarily the increase in the sun’s brightness, but more to do with how cosmic rays nucleate clouds, and those COOL the world; the sun creates an electromagnetic blanket (wind) that works much like an old TV set uses magnetic fields to scan the ionized image-forming beam back and forth, in this case the “beam” being charged cosmic rays, which make trips to Mars a bit of a drag since cosmic rays quickly cause cancer as they snip DNA.

    Global warming is both a form of New Age Religion and yet another attempt by Marxism to apply itself, so that the USA has to give money to under-developed countries with smaller GDPs. Kyoto is a HUGE tax on the USA which will result in about a 1 year slowing of “predicted” (by models that arrogantly presume to predict something akin to modeling the human brain) warming in 100 years.

    So the money should go to real threats like 1/10th to asteroids and 1/10th to disease (non-politicized hard science), and 8/10ths of it towards solar and newer, safer nuclear power, as well as higher efficiency combustion engines, and waste-product-to-liquid-fuel technologies, and eventually FUSION.

  81. NikFromNYC says:

    One more point on the global warming item above: Carbon Dioxide is plant food, and though a few fanatical researchers have found specific plants that suffer from its doubling, your garden-variety forest plant or tree loves it, which is why trees in Central Park grow almost twice as fast as those in New Jersey. There are the equivalent of two highways (of Ford Taurus yellow cabs mostly) running through the Park, spitting out plant fertilizer. In other words, we should cripple the economy of the USA in order to stop the greening of the Earth, and to lower available research money for alternative energy and viral research? Can anybody say “doomsday cult”?

  82. Johnny Dunn says:

    Most of the things listed pose no real, and only minimal potential, existential threat. The only effective method of ensuring our survival is to create an off-world civilization. Government will never do this. I would put the entire $100m into private space exploration. As soon as we find something commercially valuable, the race will be on in a major way. One existential threat not listed was the militarization of space. It is going to happen. If you don’t think so, pick up a history book. The only question is who will control access to space. Right now I consider it most likely that China will be the one to do it. I doubt that the rest of the world will like that scenario much, but we in the west are so busy spending our political resources bickering with each other that we are ignoring immediate threats that could have a long term impact. Large space objects and megacaldera events are the only listed events that have a history of nearly killing us all off.
    It is gratifying to find so many rejecting AGW as being a major problem.

  83. Steven says:

    $100 million: Nuclear holocaust…i.e., a missile/satellite defense shield.

    The rest is not the province of government (& in some cases, not the province of reality). Government’s province is to defend the nation & to establish/maintain law enforcement agencies & courts (civil & criminal) to defend the rights of the individual citizen.

    (Allocating government funds to government in order to defer government tyranny is like a bank robber giving himself a raise from his stolen loot to mount a marketing campaign to sell bank robbery insurance to Citicorp.)

    The legitimate issues (medicine, technology, even asteroid avoidance systems, etc.) can–in a genuinely capitalist system–be handled by business. And successfully so. By government, not at all.

    As for AI (impossible) & global warming (a fraud), mirages sure are compelling, aren’t they?

  84. Fat Man says:

    $20 Biological viruses…

    Disease is an ever present problem. Has any disease pandemic ever killed more than a third of a population? What portion of the indigenous population of the Americas was wiped out by pandemics? The book “1491” claims it was as much as 95%, but Hugh Thomas in Conquest adopts a lower percentage. How can pandemics be isolated and defeated before they cause excessive mortality?

    $20 Environmental global warming…

    Money should be given to Bjorn Lomborg who wants to solve problems, not spread hysteria.

    $0 Extraterrestrial invasion…

    Oh come on. The possibilities are endless, but the probabilities are nil. Stop reading Sci Fi novels.

    $20 Governments abusive power…

    Remember the end of Orwell’s 1984 and O’Brien’s prophecy: “If you want a picture of the future, imagine a boot stamping on a human face — forever.”

    The problem here is not what is happening in the US; as much as the detractors of the current administration love to wax hysterical about it, things will change in a year, and they will have to find a new game to play.

    For much of the world, the alternative to a modern open-access order like the United States or the EU is a descent into the hell of disorder, or the even blacker hell of absolute tyranny. Read the work done by Nobelist Douglass North and his colleagues Wallis and Weingast, “The Natural State: The Political-Economy of Non-Development”.

    How do we promote open access order and avoid the Natural State?


    $0 Nanotechnology gray goo…

    Wait until the nanotech people have produced some results beyond grant requests.

    $10 Nuclear holocaust…

    This should be viewed as part of the same problem as the Natural State above.

    $0 Simulation Shut Down if we live in one…

    People who worry about this need more sun shine and fresh air, not encouragement.

    $20 Space Threats asteroids…

    Detection of possible problem asteroids needs to be systematized. Techniques for changing asteroid orbits need to be established and practiced before they are really needed.

    $0 Superintelligent AI un-friendly…

    I think you should wait until there is any sign of anything resembling AI, like a computer that can warn you before it crashes.

    $10 Other

    What are the real problems that the fake ones, like space aliens, keep us from worrying about?

    $100 million total

  85. don wilkins says:

    The most imminent threat is government abuse of power. If the government would focus on areas that it should, resources could be devoted to serious problems rather than frittered away on nonsense.

    Any other problem can be solved by free, educated people working outside of the limitations of a government. Put the whole amount into fighting government abuses.

  86. Daedalus Mugged says:

    I believe that at the $100mm level, the results when broken up would be meaninglessly small, so I shall speak in percentages.

    Risks fall into two categories: those we create and those we are victims of. Of those, I think those we create have far greater potential and likelihood of becoming an existential threat.

    The largest being biological. I wouldn’t limit it to viral; I believe bacteria can also be a massive threat. Genetic engineering opens the door to resistant superbugs. Evolutionary pressures limit the lethality of bacteria (if it kills the host too efficiently and/or too quickly, it destroys its own environment). However, those limits may be fairly easily avoided in the lab. A bacterium with 99% lethality in 3 days would be an evolutionary failure and never occur naturally, but an existential threat if engineered. Thus either bacterial or viral threats are potentially huge. An equivalent of an A.Q. Khan on the biological front, as opposed to the nuclear front, is both a great probability and, given enough time, a great likelihood of use. Secondarily, resources dedicated to minimizing the existential threat have large positive spin-off value in dealing with non-existential daily problems. 40% allocation.

    We are likely to face an Al Qaeda-controlled nuclear Pakistan and a crazy-mullah-controlled nuclear Iran in the immediate future. And Kim will gladly sell his technology if it helps him hang onto his Magic Kingdom of Hell. We do face immediate nuclear threats, although perhaps not rising to existential levels for about a decade. How many nukes does it take to make the world effectively unsurvivable? I am confident that Iranian oil money and Pakistani and Korean know-how can get to that number in a matter of years, not decades.
    20% allocation.

    Nanotech: 5% allocation
    Scanning for space threats (alien or inanimate): 3% allocation. (I don’t believe there is much we can do about it, especially given the varied forms of threat, but we need to know as soon as possible to dedicate appropriate resources once a threat is imminent.)

    Other: 2% allocation dedicated to identifying other threats.

    The clever might note that this only adds up to 70%. We only get to be wrong once as long as we are all on this rock. Regardless of the source of the existential threat, one of the most potent responses is to have more than one chance to get it right. 30% allocation dedicated to getting a viable independent colony established. Probably initially dedicated to improving our ability to escape our gravity well at low cost, whether by space elevator, improved rockets or something else.

    40% biologic
    20% nuclear
    5% nano
    3% identifying space threats
    2% identifying additional threats
    30% getting us off this rock, so we get at least 2 tries against any existential threat.

  87. Steve Reynolds says:

    $10 Biological viruses

    $5 Extraterrestrial

    $5 Nanotechnology
    $5 Nuclear holocaust

    $50 Space Threats
    $25 Superintelligent AI

  88. greywar says:

    $30 Biological viruses…
    $25 Nanotechnology gray goo…
    $25 Space Threats asteroids…
    $10 Superintelligent AI un-friendly…
    $10 Nuclear holocaust…
    $0 Governments abusive power…
    $0 Simulation Shut Down if we live in one…
    $0 Other
    $0 Extraterrestrial invasion…
    $0 Environmental global warming…

    Let me know whenever any study is able to limit government power and maybe it would get some cash; until then it would just be an exercise in wasting cash to study it.

  89. Miriam123 says:

    I completely agree with Paul Tozour and completely disagree with Michael Anissimov. We would do better to focus on the root causes of terrorism (a generic term for the ability of the few to wreak chaos and fear on the multitudes), rather than focusing simply on the weapons used to do so. Saying that political unrest, etc. is not what should be focused on “because it has been around since the dawn of time” is ludicrous. Say that about disease, poverty, etc. and see what kind of reception that would have amongst thinking people. I just finished a dissertation on terrorism as a complex system (à la systems theory) and there are many things (military, security, ideological, cultural, economic, etc.) that can be done to lessen the effectiveness and appeal of violent radicalism. Focusing on the means of delivery (types of weaponry) is a small facet of a multifaceted issue.

  90. Dave Coffin says:

    You left out a big one: Demographic decline. There’s little danger of losing the entire human race over this as long as women in Mali and Yemen keep having eight babies apiece, but the civilization we enjoy is in serious trouble. Nowhere outside of Africa and the Islamic world are people having enough babies (the future soldiers, workers, and taxpayers) to sustain their aging populations. The generous welfare states of Europe will collapse in bloody ethnic strife, Japan will decline further as old folks desperately hoard their money, and China and Thailand may get old before they get rich. In the USA, the Bible Belt will grow wider and more populous as the secular, feminist, Gaia-loving elite slowly dies off. On the plus side, wolves are already making a dramatic comeback in the depopulated rural areas of Eastern Europe.

  91. John Biddle says:

    $50 Biological viruses…
    $30 Space Threats asteroids…
    $05 Nuclear holocaust…
    $05 Nanotechnology gray goo…
    $05 Superintelligent AI un-friendly…
    $05 Governments abusive power…
    $0 Simulation Shut Down if we live in one…
    $0 Other
    $0 Extraterrestrial invasion…
    $0 Environmental global warming…

  92. mockmook says:

    How about 100% on far distant space colonization.

    We don’t know what might end life here, so plan to have some of us elsewhere.

  93. $15.0 Biological viruses…
    $20.0 Space Threats asteroids…
    $5.0 Governments abusive power…
    $20.0 Nanotechnology gray goo…
    $5.0 Nuclear holocaust…
    $10.0 Superintelligent AI un-friendly…
    $15.0 Environmental global warming…
    $5.0 Other
    $0.0 Extraterrestrial invasion…
    $0.0 Simulation Shut Down

  94. Ike Andrews says:

    $40 Biological viruses…
    $05 Environmental global warming…
    $00 Extraterrestrial invasion…
    $30 Governments abusive power…
    $10 Nanotechnology gray goo…
    $05 Nuclear holocaust…
    $00 Simulation Shut Down if we live in one…
    $10 Space Threats asteroids…
    $00 Superintelligent AI un-friendly…
    $00 Other

  95. Heidi in Denver says:

    Wow–I’m glad you guys don’t run the government! You’d trade freedom for an unlimited science budget rather than enabling human capacity to solve its own problems outside Central Planning (which most of you would like to run). Glad to see you working on this in the PRIVATE sector, though!

    I’d add space colonization and anti-terrorism efforts to the list since, assuming the goal is to keep humanity alive indefinitely, colonization of space is a remedy for solar extinction/existential asteroid threat.

    Humanity’s greatest present danger is other humans with nihilistic belief systems, yet you ignore terrorism as a category.

    Within the current categories I vote:
    $50 Other (we don’t know what we don’t know…use it as venture capital and we might get lucky with solutions to yet-unseen problems–regardless, returns will keep the principal afloat until we know how to deploy it)
    $20 Biological viruses
    $10 Space Threats asteroids (I’d hope this buys space colonization technology)
    $10 Governments’ abusive power
    $05 Nuclear holocaust
    $05 Nanotechnology gray goo
    $0 Superintelligent AI un-friendly
    $0 Simulation Shut Down if we live in one
    $0 Extraterrestrial invasion
    $0 Environmental global warming (another ice age is more dangerous/likely, but we can’t control either, just as we’re not responsible for current Martian warming or the lifespan of our sun)

  96. crosspatch says:

    $50 Biological
    $25 Space
    $10 Governments
    $4 Nuclear
    $3 Environmental
    $2 Superintelligent AI
    $2 Nanotechnology
    $2 Other
    $1 Extraterrestrial
    $1 Simulation Shut Down

  97. Rich Rostrom says:

    $100M to study the threat posed by “global warming”. Note the quotes. Actual global warming (if it is even happening) poses no significant threat. But the panic over “global warming” threatens to cripple the world economy, block important technology, and send humanity into a path of permanent decline. Increasingly corrupt and despotic regimes will fight ever more destructively over the remnants of technology, while trashing the environment to extract the remaining natural resources. It could end with desperate survivors poisoning whole oceans with waste because they can’t afford even the slightest expense to clean up.

    Panic? A commenter above wrote “Runaway climate change also has the potential to kill all life on earth.” This is nonsense on stilts, but millions believe it. That’s panic.

  98. James M says:

    $35 Biological viruses…
    $10 Environmental global warming…
    $05 Extraterrestrial invasion…
    $05 Governments abusive power…
    $05 Nanotechnology gray goo…
    $25 Nuclear holocaust…
    $00 Simulation Shut Down if we live in one…
    $00 Space Threats asteroids…
    $05 Superintelligent AI un-friendly…
    $05 Other

  99. Marc says:

    $30 Biological viruses…
    $20 Governments abusive power…
    $5 Nanotechnology gray goo…
    $10 Nuclear holocaust…
    $5 Simulation Shut Down if we live in one…
    $5 Space Threats asteroids…
    $5 Superintelligent AI un-friendly…
    $10 Non-state actor terrorists
    $5 Computer viruses
    $5 Combating mass poisoning/infectious agents

    $100 million total

  100. D says:

    $100mil?

    you’re joking right? That’s not really enough to do… anything. Except MAYBE print the materials and advertising to tell people not to worry.

    The CDC’s budget alone is $9 billion per year… They spend $1.5 billion on the terrorist research arm per year, and that doesn’t cover everything…
    CDC Budget Detail 2006-07

    $100mil just doesn’t get you very far…

  101. JH says:

    $40 Biological viruses…
    $00 Environmental global warming…
    $00 Extraterrestrial invasion…
    $00 Governments abusive power…
    $10 Nanotechnology gray goo…
    $05 Nuclear holocaust…
    $00 Simulation Shut Down if we live in one…
    $05 Space Threats asteroids…
    $00 Superintelligent AI un-friendly…
    $40 Other

  102. David Lawler says:

    $30 Biological viruses…
    $5 Environmental global warming…
    $2 Extraterrestrial invasion…
    $3 Governments abusive power…
    $8 Nanotechnology gray goo…
    $20 Nuclear holocaust…
    $0 Simulation Shut Down if we live in one…
    $13 Space Threats asteroids…
    $8 Superintelligent AI un-friendly…
    $11 Other

  103. Bruce Klein says:

    100+ replies and 80 votes… thanks guys!

    my vote:

    70 — Superintelligent AI un-friendly…
    10 — Biological viruses…
    10 — Nanotechnology gray goo…
    4 — Nuclear holocaust…
    2 — Space Threats asteroids…
    1 — Environmental global warming…
    1 — Extraterrestrial invasion…
    1 — Governments abusive power…
    1 — Simulation Shut Down if we live in one…

  104. Christopher Moore says:

    Since the allocated $100 Million will be used to research the problem instead of fixing it, my list does not reflect what I feel to be the greatest dangers facing mankind — only the dangers in which a proper current threat assessment is lacking.

    My list also takes into consideration the possibility of an attainable remedy given any amount of funding, no matter how large. (No amount of funding can prevent an alien attack. Also, if this research is partially government funded, then no amount of money will expose or expel its corrupted members — that is, if the initial research is even allowed.)

    Many of the available options could possibly fall under the heading of “governments’ abuse of powers,” (biological, technological, nuclear) and the research results could be corrupted if the threat is intentional.

    60 — Biological viruses
    10 — Cyber-terrorism
    10 — Nuclear holocaust
    5 — Environmental global warming
    5 — Space Threats asteroids
    5 — Nanotechnology gray goo
    3 — Other (Pole Shift or increase in solar activity)…
    2 — Superintelligent AI un-friendly

    We should guard against lower-tech man-made threats first, as those are the best understood & most abused. Then, guard against natural phenomena currently beyond our control and high tech threats (not from abuse but from mis-use).

    Governments’ abuse of power would be at the top of my list, however that’s not only the most difficult to research & remedy, but it’s also impossible for some to take seriously. It is, by far, my highest personal concern given the current state of the world.

    If I was personally given $100 million I would allocate the funds to the following 2 (non-listed) categories:

    50 — Find ways to realistically reduce the chaos & fear leading up to the Singularity. Many average people are confused and finding it hard to deal with the accelerating change.

    50 — Search for a clean, low-cost power source to replace oil.

    To cope with the coming whirlwind of changes, the economy, technology, society, and our individual & collective minds will need a serious overhaul. Prioritize and get prepared locally before the global change. The Singularity will be a brick wall for some and a stairway to heaven for others. It’s our choice, and failing to be prepared is the absolute greatest existential danger facing mankind.

  105. Laurie Marquardt says:

    $12 Biological
    $75 Environmental
    $00 Extraterrestrial
    $00 Governments
    $13 Nanotechnology
    $00 Nuclear
    $00 Simulation Shut Down
    $00 Space Threats
    $00 Superintelligent AI
    $00 Other
    $100 million total

    I think that those are the most immediate threats, and so they should be taken care of ASAP.

  106. robomoon says:

    $- Biological viruses…
    $- Environmental global warming…
    $- Extraterrestrial invasion…
    $- Governments abusive power…
    $- Nanotechnology gray goo…
    $- Nuclear holocaust…
    $- Simulation Shut Down if we live in one…
    $- Space Threats asteroids…
    $- Superintelligent AI un-friendly…
    $100 Other: Better living standards to enable better strategies for family planning. Overpopulation of humans is what made the very fast mass extinction of plants & animals during the 20th Century and these years of the 21st Century possible. Humans who destroy the ecology on this planet in vast quantities are an actual problem which must be taken care of now!

  107. Crispin says:

    $70 Governments abusive power… — politics is the perennial existential threat
    $8 Biological viruses… — monoculture and uniformity make this highly dangerous
    $8 Environmental global warming… — see above
    $8 Other — horizon scanning is important
    $4 Nuclear holocaust… — has the potential to occur
    $1 Space Threats asteroids… — not a lot we can do about it, but interesting
    $0.25 Nanotechnology gray goo… — highly unlikely
    $0.25 Superintelligent AI un-friendly… — highly unlikely
    $0.25 Extraterrestrial invasion… — very little forward notice we could get of this…
    $0.25 Simulation Shut Down if we live in one… — very poor sci-fi

  108. $33 Superintelligent AI — should be enough to fund SIAI indefinitely, train a team of FAI theorists, and support others such as Nick Hay and Rolf Nelson. Try to redirect people working on UFAI towards something else
    $23 Biological viruses
    $22 Nanotechnology — hire people currently working on MNT to do life extension research instead
    $16 Other — including general existential risks research (like Bostrom’s work), good ethics research, teaching people about rationality
    $2 Nuclear — health food, Yoga classes, tempur-pedic mattresses, and masseuses for the people in charge of launching missiles (reduce stress levels)
    $2 Governments — distract any politicians who are trying to hinder something we’re working on
    $1 Space Threats — mostly astronomy
    $1 Simulation Shut Down — not as silly as people think, and some good research might come out of it
    $0 Extraterrestrial invasion — this is as silly as people think
    $0 Environmental — lots of money is being spent on this already

  109. Let me revise the above. I had $23 left when I was done, so I threw it at the biological viruses. That was silly. Add the $23 to ‘Superintelligent AI’; SIAI could extend its talent search, perhaps by creating new courses at universities.

  110. Hydrogen sulfide gas will kill all people. Homo sapiens will go EXTINCT unless drastic action is taken.

    October 2006 Scientific American

    “EARTH SCIENCE
    Impact from the Deep
    Strangling heat and gases emanating from the earth and sea, not asteroids, most likely caused several ancient mass extinctions. Could the same killer-greenhouse conditions build once again?
    By Peter D. Ward
    downloaded from:
    http://www.sciam.com/article.cfm?articleID=00037A5D-A938-150E-A93883414B7F0000&sc=I100322
    [Most of the article omitted.]
    “But with atmospheric carbon climbing at an annual rate of 2 ppm and expected to accelerate to 3 ppm, levels could approach 900 ppm by the end of the next century, and conditions that bring about the beginnings of ocean anoxia may be in place. How soon after that could there be a new greenhouse extinction? That is something our society should never find out.”
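
    As a rough sanity check on the arithmetic quoted above, here is a minimal Python sketch. The 2 ppm/yr and 3 ppm/yr growth rates and the ~900 ppm end-of-next-century figure come from the quoted article; the ~385 ppm starting concentration for 2008 and the function itself are my own assumptions for illustration only.

        # Back-of-the-envelope projection of the CO2 figures quoted above.
        # Assumption: ~385 ppm atmospheric CO2 around 2008; the annual growth
        # rate ramps linearly from 2 ppm/yr to 3 ppm/yr (rates from the article).

        def projected_co2(start_ppm=385.0, start_year=2008, end_year=2200,
                          initial_rate=2.0, final_rate=3.0):
            """Accumulate ppm year by year with a linearly increasing growth rate."""
            years = end_year - start_year
            ppm = start_ppm
            for i in range(years):
                ppm += initial_rate + (final_rate - initial_rate) * i / years
            return ppm

        print(projected_co2())  # about 865 ppm by 2200, in the range of the article's ~900 ppm

    Even this crude linear-ramp model lands in the same range the article warns about.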

    Press Release
    Pennsylvania State University
    FOR IMMEDIATE RELEASE
    Monday, Nov. 3, 2003
    downloaded from:
    http://www.geosociety.org/meetings/2003/prPennStateKump.htm
    “In the end-Permian, as the levels of atmospheric oxygen fell and the levels of hydrogen sulfide and carbon dioxide rose, the upper levels of the oceans could have become rich in hydrogen sulfide catastrophically. This would kill most of the oceanic plants and animals. The hydrogen sulfide dispersing in the atmosphere would kill most terrestrial life.”

    http://www.astrobio.net is a NASA web zine. See:

    http://www.astrobio.net/news/modules.php?op=modload&name=News&file=article&sid=672

    http://www.astrobio.net/news/modules.php?op=modload&name=News&file=article&sid=1535

    http://www.astrobio.net/news/article2509.html

    http://astrobio.net/news/modules.php?op=modload&name=News&file=article&sid=2429&mode=thread&order=0&thold=0

    These articles agree with the first two. They all say 6 degrees C or 1,000 parts per million CO2 is the extinction point.

    Global warming is already 1 degree Fahrenheit. 11 degrees Fahrenheit is about 6 degrees Celsius. The book “Six Degrees” by Mark Lynas agrees: if global warming reaches 6 degrees Celsius, we humans go extinct. See:
    http://www.marklynas.org/2007/4/23/six-steps-to-hell-summary-of-six-degrees-as-published-in-the-guardian

    “Under a Green Sky” by Peter D. Ward, Ph.D., 2007. A paleontologist discusses mass extinctions of the past and the one we are inflicting on ourselves.

    ALL COAL-FIRED POWER PLANTS MUST BE CONVERTED TO NUCLEAR IMMEDIATELY TO AVOID THE EXTINCTION OF US HUMANS. 32 countries have nuclear power plants. Only 9 have the bomb. The top 3 producers of CO2 all have nuclear power plants, coal-fired power plants and nuclear bombs. They are the USA, China and India. Reducing CO2 production by 90% by 2050 requires drastic action in the USA, China and India. King Coal has to be demoted to a commoner. Coal must be left in the earth. If you own any coal stock, NOW is the time to dump it, regardless of loss, because it will soon be worthless.

    $100 million to teach people that nuclear power is safe and the only thing that works.

  111. I could put these threats in any haphazard order and the list’s logic would be valid, but it still wouldn’t put humanity back squarely on the tracks. The number one existential threat is ignorance. Let’s put $100M into educating the population of the planet, and I’m sure we’ll then be able to face and solve any problem that confronts us.

  112. Frank Sudia says:

    $20 Biological viruses, ease of creation
    Not an existential risk, but not enough work is being done, and 30% of us could die in a few months.
    $20 Nuclear holocaust
    Nuclear winter IS an existential risk, and could actually happen tomorrow.
    $20 Space Threats, asteroids, gamma burst, etc.
    $20 Nanotechnology gray goo…
    $10 Other — earth swallowing micro black holes from new particle accelerators. Maybe this is why SETI finds nothing.
    $10 Extraterrestrial invasion…
    Might be worth taking a sober look at our survival options.

    $0 Superintelligent AI un-friendly…
    This is a legal and political issue, not a technology problem. AIShield risks being total Luddite obstructionism! (Work on curbing Microsoft.)
    $0 Environmental global warming…
    Not an existential risk, we’ll survive
    $0 Governments abusive power
    Not an existential risk, we’ll survive
    $0 Simulation Shut Down if we live in one…
    You won’t care if the simulation ends, will you?

    Post the runner-up ideas in a second ranked list near the top, so future commenters can review and be inspired by them, not just your chosen few categories.

  113. Press to Digitate says:

    $20M Superintelligent AI un-friendly…
    $20M Nanotechnology gray goo…
    These threats deserve special consideration out of respect for their sheer inevitability. We *KNOW* that they will occur, that they will occur within the next 20 years, and that it is only a question of where and when. There are too many nucleation sites from which Strong AI or Carbon-fixing Nanoreplicators might emerge, and too many ‘good’ motivations prompting them to be developed for well-intentioned purposes. Every year that passes, the tools and enabling technologies for each become cheaper, more powerful and more readily available. These are not occasional rarities like a Big Space Rock or Coronal Mass Ejection; these will happen ONCE, and then it’s All Over — unless we can somehow prepare ahead of time.

    $10M Biological viruses…
    The technology for genetic manipulation and synthetic life also becomes exponentially cheaper, more powerful and more readily available year by year, and it is likely inevitable that someone will create a virulent new synthetic pathogenic organism that defies common biological defenses. In this case, though, the tools to counter it also grow more potent with the passage of time. We live in an age when the DNA of any new organism can be sequenced in a matter of hours, a cure modelled by computer within days, and industrial quantities synthesized in biorefineries within a month. Nevertheless, its inevitability places it high on the list of priorities.

    $10M Space Threats asteroids…
    The Big Space Rock has to be on the list because it’s happened before and will happen again. We have many ways of dealing with the problem should it occur, if we are prepared and a large enough detection effort finds it in time.

    $10M Environmental global warming…
    While the technologies to solve the Environmental Energy issue already exist, if they are not disseminated and implemented soon enough, on a broad enough scale, the Clathrate Gun (methane released from the melting permafrost) will render the planet uninhabitable within the century. 4,000 billion tonnes of methane (>20x as potent a GHG as CO2) will dwarf anything man can produce as a driver of Climate Change; we must mine the atmosphere for the Carbon we need. Atmospheric Carbon Capture coupled with Algae-Based Biofuels can fix this problem.

    $10M Extraterrestrial invasion…
    The million-plus annual UFO sightings reported globally, hundreds of thousands of abductees and hundreds of physical trace cases are absolutely convincing that “They” are real, and they are Here. Even the scientific mainstream has been forced to recognize that planets with water and conditions suitable for life are not only ‘not rare’, but are probably quite common in the universe. However, the apparent evidence indicates that this threat has already happened, but with no — or very subtle — negative effects from it. Even so, it is worthy of serious research and analysis, sooner rather than later.

    $05M Governments abusive power…
    The 2008 FISA revision failed to insert the Constitution between our government and our personal electronics, which are about to become far more ‘personal’ with the advent of Brain/Computer Interface technologies. Neurochips are a reality today, with more than 200,000 already implanted for Cochlear hearing, artificial retinas, and brain pacemakers for Parkinsonism, Epilepsy, OCD, Obesity, and Depression. Microwave-based Voice-to-Skull technology (“Government Mind-Control Rays”) is now also an acknowledged reality. Orwell was right, just 30 years or so off in his timing.

    $05M Nuclear holocaust…
    This ranks low because of its extremely low probability at the Existential Risk level. Israel/Iran or India/Pakistan nuclear exchanges would not pose such a risk, though they would be damned inconvenient for the participants and neighboring bystanders in the region. However, better dissemination of Anti-Ballistic Missile technology and methods for screening ships and containers at sea for onboard WMD could substantially eliminate this threat category.

    $00M Simulation Shut Down if we live in one…
    This should not be on the list, not because it’s impossible, but because, even if true, there is nothing that money could be spent on which would make any difference. What would you do? Build an electronic “prayer machine”, in hopes of contacting the Simulators directly, through the vacuum aether? If we live in a simulation, that aether itself is suspect, and they are reading these texts as we type them anyway.

    $10M Other
    (1) A Temporal Modem is presently under construction at the University of Connecticut by Dr. Ronald Mallett. Unless the prevailing models of Relativity and Quantum Mechanics are fundamentally WRONG, this device cannot fail to work. It will become cheap to replicate; the knowledge to do so is already in the public domain. It has the potential to destroy causality and trigger chronoplexy on a planetary scale. This cannot be stopped — even if Mallett were inhibited by whatever means, the design is already widely distributed, and some grad student, sooner or later, will inevitably build one that works. So, the best we can do, as with Strong AI and Grey Goo and Synthetic Plagues, is to prepare to somehow detect and deal with its consequences.

    (2) Just because the Relativistic Heavy Ion Collider didn’t create black holes, strangelets, or vacuum instability does not mean the Large Hadron Collider will not. Or the next biggest machine after that, or the next one, etc. Sooner or later, one of these Big Science Machines is going to produce something unexpected, at energies where unexpected most likely equals dangerous. Thus far, no research has been devoted to mitigation strategies, techniques, and technologies, which is odd given the very high probability of something eventually going wrong.

    (3) When (not ‘If’) room-temperature superconductors are commercially introduced, one unexpected result will be the rapid development of a Thanatronic Interface: a device which perfects Electronic Voice Phenomena, enabling reliable, high-fidelity Instrumental Transcommunication with the “Dead”. In every other instance in scientific equipment where a Germanium Diode (used as a detector of subtle EM fields) has been replaced with a Superconducting Diode, a sensitivity increase and improvement of signal-to-noise ratio of Three Orders of Magnitude has been observed. There is no reason to think that EVP detection will be any different. While this may not pose an existential threat, the inevitable advent of reliable electronic communication with the dead will certainly change human society as profoundly as anything else one can imagine. The implications and consequences of its development are worth serious study.

    Of course, if a wavefront from the Gamma Ray Burst of Wolf-Rayet 104 strikes Earth in December, 2012, as is indicated, we may have bigger and more immediate problems to contend with than much of the above.

  114. Bon Davis says:

    I will go out on a different limb here. I would allocate $50 million to unfriendly AI. For most of the space-borne threats, there isn’t really much we could honestly do, especially near-term. I agree with earlier comments: space invasion, government abuse and simulation shutdown don’t deserve to be mentioned (how do you fund against a government abuse of power?). AI is a more imminent threat and different from the others: it is the one technology that could actively turn against us or pose a threat through indifference (not actually hostile, see Yudkowsky), whereas the others take a human to push the button or open the vial. Still — $20 million to nanotechnology, $10 million to nuclear, $10 million to biological (for both of these, life would persist even without help, if in limited form) and $10 million to environmental threats.

  115. A.H. Jessup says:

    There are two major vectors–energy and water–which define the complexities of most of these symptoms.
    The social inversions that lead to genocide and government abuse tend to go away where a sufficiency of these two elements is available, combined with enough education to use them positively.

    Global warming is an energy dysfunction; most of the political complexities are water-driven and energy-driven. Virus control is a lot easier, as well, when these are resolved.

    The third hard ingredient is the ability to effect sane social systems which bring about the blooming of individual potential, and which is chronically blunted when the fundamentals of the first two elements are inadequate.

  116. DPirate says:

    Our only existential threat is humanity. Soon enough, what little there is to eat will be full of poisons. Maybe some elite cadre will be able to consolidate enough power to save us from ourselves, but who wants to live like that? Better the whole thing comes down and we start over in a few thousand years. False start.

  117. DPirate says:

    “The third hard ingredient is the ability to effect sane social systems which bring about the blooming of individual potential, and which is chronically blunted when the fundamentals of the first two elements are inadequate.”

    Sounds like Lenin, lol.

  118. We can be sure that natural hazards are insignificant because humankind has survived 200,000 years of exposure. By contrast, we’ve had only about 60 years of adaptation to man-made existential threats, and new ones appear continually.

    Nuclear holocaust does not threaten extinction because it would happen in the Northern Hemisphere. New Zealand’s South Island and Tierra del Fuego are insulated from its effects by at least one complete Hadley cell of atmospheric circulation.

    The biggest risks are uncategorized surprises. For example, a mad billionaire (trillionaire?) ‘hears’ orders from God to exterminate the human race. Or pollutants in the ocean produce mutant phytoplankton that emit poison gas.