Read some inspirational quotes!
9/11 Commission
an independent, bipartisan commission created by congressional legislation and the signature of President George W. Bush in late 2002. It is chartered to prepare a full and complete account of the circumstances surrounding the September 11, 2001 terrorist attacks, including preparedness for and the immediate response to the attacks.
Frank W. Abagnale
was the master criminal whose autobiography Catch Me If You Can was turned into a film by Steven Spielberg starring Leonardo DiCaprio and Tom Hanks.
“It is important to remember that technology breeds crime, it always has … it always will.”
Jamal Ahmidan
was a principal figure behind the 3/11 attacks in Spain, which left about 2,000 people dead or injured and led to the election of a new Spanish government that withdrew from Iraq.
Dale Amon
writes for the Samizdata blog.
Abdullah Ahmad Badawi
Prime Minister of Malaysia and Chairman of the 57-nation Organization of the Islamic Conference.
Jeff Bezos
founder, president, chief executive officer (CEO), and chairman of the board of Amazon.com.
Scott Borg
Director and Chief Economist of the U.S. Cyber Consequences Unit, a Department of Homeland Security advisory group. He is also a member of our Scientific Advisory Board.
Nick Bostrom
winner of a Templeton Foundation grant, cofounder of the World Transhumanist Association, and former director of the Future of Humanity Institute at the University of Oxford.
“The technology to produce a destructive nanobot seems considerably easier to develop than the technology to create an effective defense against such an attack (a global nanotech immune system, an ‘active shield’).”
“Our approach to existential risks cannot be one of trial-and-error. There is no opportunity to learn from errors. The reactive approach — see what happens, limit damages, and learn from experience — is unworkable. Rather, we must take a proactive approach. This requires foresight to anticipate new types of threats and a willingness to take decisive preventive action and to bear the costs (moral and economic) of such actions.”
“The Fermi Paradox refers to the question mark that hovers over the data point that we have seen no signs of extraterrestrial life. This tells us that it is not the case that life evolves on a significant fraction of Earth-like planets and proceeds to develop advanced technology, using it to colonize the universe in ways that would have been detected with our current instrumentation. There must be (at least) one Great Filter — an evolutionary step that is extremely improbable — somewhere on the line between Earth-like planet and colonizing-in-detectable-ways civilization. If the Great Filter isn’t in our past, we must fear it in our (near) future. Maybe nearly every civilization that develops a certain level of technology causes its own extinction.”
BT (British Telecommunications) white paper
“It is clear from this section that we are rapidly inventing new ways of destroying ourselves, and that the risk to mankind is increasing exponentially.”
“As early as 2005, there will be a ‘deliberate biotech self-destruct by a malicious biotech researcher’ and ‘terrorism will rise beyond the capability of government systems’.”
Joe Buff
bestselling author of Straits of Power, Tidal Rip, Crush Depth, Thunder in the Deep, and Deep Sound Channel. He is a regular columnist for military.com and is the winner of the 1999 and 2000 Literary Awards from the Naval Submarine League.
Warren Buffett
our 2002 Guardian Award winner, is the world’s second-wealthiest man and is known as the ‘Oracle of Omaha’ for his astute investments.
“Predicting rain doesn’t count, building arks does.”
“Fear may recede with time, but the danger won’t — the war against terrorism can never be won.”
William E. Burrows
cofounder of the Alliance to Rescue Civilization (ARC), which was merged into the Lifeboat Foundation, and a member of our Advisory Board until his death.
George W. Bush
U.S. President.
“Our generation faces new and grave threats to liberty, to the safety of our people and to civilization itself. We face an aggressive force that glorifies death, that targets the innocent and seeks the means to murder on a massive scale.”
“Wishful thinking might bring comfort, but not security.”
“The gravest danger… lies at the perilous crossroads of radicalism and technology.”
Charles M. Chafer
has been involved in launching rockets for the past 25 years, is a cofounder of Celestis, Inc., and is a member of our Scientific Advisory Board.
Arthur C. Clarke
was the prophetic SF author who in 1945 predicted a world linked by geostationary satellites.
Michael Crichton
was the author of The Andromeda Strain, Jurassic Park, and Prey. He was also the creator of the television series ER.
“Sometime in the twenty-first century, our self-deluded recklessness will collide with our growing technological power. One area where this will occur is in the meeting point of nanotechnology, biotechnology, and computer technology. What all three have in common is the ability to release self-replicating entities into the environment.”
“Nobody does anything until it’s too late. We put the stoplight at the intersection after the kid is killed.”
“‘They didn’t understand what they were doing.’ I’m afraid that will be on the tombstone of the human race.”
DEBKAfile
Its War Diary is included in the US Library of Congress’s historic Internet archive on the 2003 Iraq War. This online news source contains in-depth coverage of terrorism, security, political analysis, and espionage, and is available in English and Hebrew.
Retired Army General Wayne A. Downing
was U.S. President George W. Bush’s deputy national security adviser for counterterrorism until July 8, 2002.
“The United States may have to declare martial law someday in the case of a devastating attack with weapons of mass destruction causing tens of thousands of casualties.”
“Most sobering to me was [the terrorists’] research on chemical weapons, radiological dispersion devices, and their fascination with nuclear weapons. They are obsessed with them.”
Eric Drexler
founder of the Foresight Institute and of the nanotechnology movement.
“Foresight’s concern for the long-term potential abuse of nanotechnology has been confirmed and strengthened. Those who abuse technology — from airliners to anthrax — for destructive ends do exist and are unlikely to stop before full nanotech arrives, with all its power for both good and ill.”
“It would be easy to say, ‘let government or industry figure out how to prevent nanotech misuse,’ but the events of Sept. 11 and afterwards show this to be naive. (The current attempt to make airliners safer by keeping all sharp objects off the plane is laughable — a pair of glass eyeglasses is easily broken and used instead. The authorities dealing with the anthrax attacks expressed surprise that anthrax could leak from “sealed” envelopes — when anyone who’s ever licked one can see that the adhesive doesn’t extend to the flap’s edges.) Outside perhaps the military, government doesn’t do too well at anticipating emergencies and planning policies for them — their incentives are too political, and their time horizons are too short…”
“If extraterrestrial civilizations exist, and if even a small fraction were to behave as all life on Earth does, then they should by now have spread across space.”
“By now, after hundreds of millions of years, even widely scattered civilizations would have spread far enough to meet each other, dividing all of space among them.”
“An advanced civilization pushing its ecological limits would, almost by definition, not waste both matter and energy. Yet we see such waste in all directions, as far as we can see spiral galaxies: their spiral arms hold dust clouds made of wasted matter, backlit by wasted starlight… The idea that humanity is alone in the visible universe is consistent with what we see in the sky… Thus for now, and perhaps forever, we can make plans for our future without concern for limits imposed by other civilizations.”
Robert A. Freitas Jr.
was a research scientist at Zyvex LLC, the Earth’s first molecular nanotechnology company, and is the author of Nanomedicine, the first book-length technical discussion of the medical applications of nanotechnology and medical nanorobotics. He is a 2006 Lifeboat Foundation Guardian Award winner and a member of our Scientific Advisory Board.
“Specific public policy recommendations suggested by the results of the present analysis include:
An immediate international moratorium on all artificial life experiments implemented as nonbiological hardware. In this context, ‘artificial life’ is defined as autonomous foraging replicators, excluding purely biological implementations (already covered by NIH guidelines tacitly accepted worldwide) and also excluding software simulations which are essential preparatory work and should continue. Alternative ‘inherently safe’ replication strategies such as the broadcast architecture are already well-known.”
Bill Frist
U.S. Senate Majority Leader.
“Like everyone else, politicians tend to look away from danger, to hope for the best, and pray that disaster will not arrive on their watch even as they sleep through it. This is so much a part of human nature that it often goes unchallenged.
But we will not be able to sleep through what is likely coming soon — a front of unchecked and virulent epidemics, the potential of which should rise above your every other concern. For what the world now faces, it has not seen even in the most harrowing episodes of the Middle Ages or the great wars of the last century…
No intelligence agency, no matter how astute, and no military, no matter how powerful and dedicated, can assure that a few technicians of middling skill using a few thousand dollars worth of readily available equipment in a small and apparently innocuous setting cannot mount a first-order biological attack.
It’s possible today to synthesize virulent pathogens from scratch, or to engineer and manufacture prions that, introduced undetectably over time into a nation’s food supply, would after a long delay afflict millions with a terrible and often fatal disease. It’s a new world…
So what must we do?
I propose an unprecedented effort — a “Manhattan Project for the 21st Century” — not with the goal of creating a destructive new weapon, but to defend against destruction wreaked by infectious disease and biological weapons…
This is a bold vision. But it is the kind of thing that, once accomplished, is done. And it is the kind of thing that calls out to be done — and that, if not done, will indict us forever in the eyes of history.
In diverting a portion of our vast resources to protect nothing less than our lives, the lives of our children, and the life of our civilization, many benefits other than survival would follow in train — not least the satisfaction of having done right.”
Bill Gates
cofounder of Microsoft, which became the world’s largest PC software company. He is also the 2015 Lifeboat Foundation Guardian Award winner.
“This is like earthquakes, you should think in order of magnitudes. If you can kill 10 people that’s a 1, 100 people that’s a two… Bioterrorism is the thing that can give you not just sixes, but sevens, eights, and nines.
With nuclear war, once you have got a six, or a seven, or eight, you’d think it would probably stop. With bioterrorism it’s just unbounded if you are not there to stop the spread of it.”
Rudolph Giuliani
mayor of New York City when it was attacked on 9/11.
“The most dangerous situation is where you’re facing peril but you’re not aware of it.”
Alan H. Goldstein
Professor of Biomaterials, Fierer Chair of Molecular Cell Biology, and Biomedical Materials Engineering and Science Program Chair at Alfred University. He is also a member of our Scientific Advisory Board.
“…because of nanobiotechnology, we have never been closer to a Grey Goo scenario.”
NASA Administrator Michael D. Griffin
Julian Haight
president of SpamCop.net, the premier spam reporting service.
“I’m getting bombed off the face of the Earth and no one cares.”
Stephen Hawking
was a famous cosmologist who discovered that black holes are not completely black, but emit radiation and eventually evaporate and disappear.
“In the long term, I am more worried about biology. Nuclear weapons need large facilities, but genetic engineering can be done in a small lab. You can’t regulate every lab in the world. The danger is that either by accident or design, we create a virus that destroys us.”
“I don’t think the human race will survive the next thousand years, unless we spread into space. There are too many accidents that can befall life on a single planet.”
Robert A. Heinlein
was an influential and controversial science fiction author. The English language absorbed several words from his fiction, including “grok”, meaning “to understand so thoroughly that the observer becomes part of the observed.”
“The Earth is just too small and fragile a basket for the human race to keep all its eggs in.”
Dr. Barbara Marx Hubbard
author, public speaker, social innovator, and President of the Foundation for Conscious Evolution. She was also a member of our Advisory Board until her death.
Admiral David E. Jeremiah
US Navy (Ret.), former Vice Chairman of the Joint Chiefs of Staff.
Bill Joy
the “Edison of the Internet”, is the creator of the Unix text editor vi, a cofounder of Sun Microsystems, and a 2006 Lifeboat Foundation Guardian Award winner.
“We are being propelled into this new century with no plan, no control, no brakes.”
“But many other people who know about the dangers still seem strangely silent. When pressed, they trot out the ‘this is nothing new’ riposte — as if awareness of what could happen is response enough.”
“I think it is no exaggeration to say we are on the cusp of the further perfection of extreme evil, an evil whose possibility spreads well beyond that which weapons of mass destruction bequeathed to the nation-states, on to a surprising and terrible empowerment of extreme individuals.”
“An immediate consequence of the Faustian bargain in obtaining the great power of nanotechnology is that we run a grave risk – the risk that we might destroy the biosphere on which all life depends.”
“…if our own extinction is a likely, or even possible, outcome of our technological development, shouldn’t we proceed with great caution?”
Michio Kaku
co-creator of string field theory.
“Of all the generations of humans that have walked the surface of the Earth — for 100,000 years, going back when we first left Africa — the generation now alive is the most important.
The generation now alive, the generation that you see, looking around you, for the first time in history, is the generation that controls the destiny of the planet itself.”
Garry Kasparov
chairman of the United Civil Front, a democratic activist group based in Russia. He was the world chess champion from 1985 to 2000.
Mickey Kaus
author of the blog Kausfiles, published in Microsoft’s Slate magazine, and of the book The End of Equality.
Ed Koch
former Mayor of New York City.
Charles Krauthammer
was a syndicated columnist who appeared in the Washington Post and other publications and was a commentator on various TV programs. He earned his M.D. from Harvard University’s medical school in 1975 and won the Pulitzer Prize in 1987.
“Resurrection of the [1918 Flu] virus and publication of its structure open the gates of hell. Anybody, bad guys included, can now create it. Biological knowledge is far easier to acquire for Osama bin Laden and friends than nuclear knowledge.
And if you can’t make this stuff yourself, you can simply order up DNA sequences from commercial laboratories around the world that will make it and ship it to you on demand… And if the bad guys can’t make the flu themselves, they could try to steal it. That’s not easy. But the incentive to do so from a secure facility could not be greater. Nature, which published the full genome sequence, cites Rutgers bacteriologist Richard Ebright as warning that there is a significant risk “verging on inevitability” of accidental release into the human population or of theft by a ‘disgruntled, disturbed or extremist laboratory employee.’
Why try to steal loose nukes in Russia? A nuke can only destroy a city. The flu virus, properly evolved, is potentially a destroyer of civilizations.
We might have just given it to our enemies.
Have a nice day.”
Ray Kurzweil
prophetic author of the 1990 book The Age of Intelligent Machines where he correctly predicted advancements in AI. He was also the principal developer of the first omni-font optical character recognition, the first print-to-speech reading machine for the blind, the first CCD flat-bed scanner, and the first commercially marketed large-vocabulary speech recognition. He is a member of the U.S. Army Science Advisory Group, our 2005 Guardian Award winner, and is on our Scientific Advisory Board.
“A self-replicating pathogen, whether biological or nanotechnology based, could destroy our civilization in a matter of days or weeks.”
“We can envision a more insidious possibility. In a two-phased attack, the nanobots take several weeks to spread throughout the biomass but use up an insignificant portion of the carbon atoms, say one out of every thousand trillion (10^15). At this extremely low level of concentration, the nanobots would be as stealthy as possible. Then, at an ‘optimal’ point, the second phase would begin with the seed nanobots expanding rapidly in place to destroy the biomass. For each seed nanobot to multiply itself a thousand trillionfold would require only about 50 binary replications, or about 90 minutes.”
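A quick sketch of the replication arithmetic in the passage above (an editorial check, not part of the quotation): fifty doublings of a single seed yield roughly a thousand trillion copies, and the quoted 90 minutes implies a replication cycle of a little under two minutes.

\[ 2^{50} \approx 1.13 \times 10^{15}, \qquad \frac{90 \times 60\ \text{s}}{50\ \text{doublings}} = 108\ \text{s per doubling.} \]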
“Recall that biological evolution is measured in millions and billions of years. So if there are other civilizations out there, they would be spread out in terms of development by huge spans of time. The SETI assumption implies that there should be billions of ETIs (among all the galaxies), so there should be billions that lie far ahead of us in their technological progress. Yet it takes only a few centuries at most from the advent of computation for such civilizations to expand outward at at least light speed. Given this, how can it be that we have not noticed them? The conclusion I reach is that it is likely (although not certain) that there are no such other civilizations.”
“To this day, I remain convinced of this basic philosophy: no matter what quandaries we face — business problems, health issues, relationship difficulties, as well as the great scientific, social, and cultural challenges of our time — there is an idea that can enable us to prevail. Furthermore, we can find that idea. And when we find it, we need to implement it. My life has been shaped by this imperative. The power of an idea — this is itself an idea.”
John Leslie
author of The End of the World: The Science and Ethics of Human Extinction and a member of our Scientific Advisory Board.
“Our failure to detect intelligent extraterrestrials may indicate not so much how rarely these have evolved, but rather how rapidly they have destroyed themselves after developing technological civilizations.”
“What is surprising is that so little has been done to develop Earth-based artificial biospheres… If one-hundredth as much had been spent on developing artificial biospheres as on making nuclear weapons, a lengthy future for humankind might by now be virtually assured.”
Ken Livingstone
mayor of London, said the following after Al Qaeda attacked Spain (and before they attacked London itself, in the city’s worst attack since World War II).
András Lörincz
Head Senior Researcher of the Neural Information Processing Group at Eötvös Loránd University, Budapest, Hungary, and a member of our Scientific Advisory Board.
“I subsign the following opinion:
The future and well-being of the Nation depend on the effective integration of Information Technologies into its various enterprises, and social fabric.
Information Technologies are designed, used, and have consequences in a number of social, economic, legal, ethical, and cultural contexts. With the rise of unprecedented new technologies … and their increasing ubiquity in our social and economic lives, large-scale social, economic, and scientific transformations are predicted.
While these transformations are expected to be positive … there is general agreement among leading researchers that we have insufficient scientific understanding of the actual scope and trajectory of these socio-technical transformations.”
Richard G. Lugar
United States Senator for the state of Indiana. He is also the U.S. Senate Foreign Relations Committee Chairman.
Kelvin G. Lynn
Director of the Center for Materials Research at Washington State University. Dr. Lynn has developed an “antimatter trap” that the U.S. Air Force is considering as the basis of an antimatter bomb which would be over 1,000 times as powerful as an H-bomb.
“I think we need to get off this planet, because I’m afraid we’re going to destroy it.”
John Robert Marlow
author of Nanosecurity and the Future (if Any) and a member of our Advisory Board.
MIT Technology Review
Elon Musk
often likened to a real-life Tony Stark from Marvel’s Iron Man comics for his role in cutting-edge companies including SpaceX, a private space exploration company that holds the first private contracts from NASA for resupply of the International Space Station, and the electric car company Tesla Motors. Watch Elon in Iron Man 2! He is winner of the 2014 Guardian Award.
“Sooner or later, we must expand life beyond this green and blue ball — or go extinct.”
Jonathan Nolan
cowrote The Dark Knight, The Dark Knight Rises, and Interstellar. He is co-creator of Westworld and creator of Person of Interest.
Peggy Noonan
contributing editor of The Wall Street Journal and author of A Heart, a Cross, and a Flag.
David R. Obey
U.S. House of Representatives (Democrat – Wisconsin).
Tara O’Toole
physician and director of the Center for Biosecurity at the University of Pittsburgh Medical Center.
Ian Pearson
in-house futurologist for Futurizon and an advisor on our Scientific Advisory Board (one of the first to join!).
“In 1900 there were only a few ways for the planet to be wiped out: comet, disease etc. But in the last few decades we have amassed a whole plethora of possibilities: nuclear, environmental, biological, and a lot of future threats will come from computing.”
“We’ve managed to get ourselves into a position where the statistical chances of extinction will soon exceed one percent [per year]. It means that sometime in the next 100 years the human race will be wiped out somehow.”
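For readers who want to check the arithmetic behind this estimate (an editorial sketch, not part of the quote): a constant one-percent annual risk compounds over a century to roughly a two-in-three chance of extinction.

\[ 1 - (1 - 0.01)^{100} \approx 0.63 \]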
“Given this and the rate of technological advancement, I think the human race could be extinct within the next 30 to 40 years.”
Chris Phoenix
cofounder of the Center for Responsible Nanotechnology.
James P. Pinkerton
fellow at the New America Foundation, a columnist for Newsday and TechCentralStation.com, and a contributor to the Fox News Channel. He authored What Comes Next: The End of Big Government and the New Paradigm Ahead and is a member of our Advisory Board.
“But the continuing advance of technology has brought a new dilemma: Increasingly, any single individual or small group can wield great destructive power. If one were to draw a line over the course of history, from the first tomahawk, through the invention of gunpowder, all the way to the A-bomb, one would see a steeply upsloping curve.”
“Thanks to computers, that upslope is likely to stay steep for a long time to come, as artificial brain power doubles and redoubles. Techno-progress will be spread out across the full spectrum of human activity, but if history is any guide, then much ‘progress’ will come in the form of more lethal weapons, including nano-weapons. Thus, the ‘suitcase nuke’ that we fear today could be superseded by future mass-killers that fit inside a thimble — or a single strand of DNA.”
Baron Martin Rees
Royal Society Professor at Cambridge University, a Fellow of King’s College, and the U.K.’s Astronomer Royal. The winner of the 2001 Cosmology Prize of the Peter Gruber Foundation and our 2004 Guardian Award, he has published numerous academic papers and books including Our Final Hour: A Scientist’s Warning: How Terror, Error, and Environmental Disaster Threaten Humankind’s Future In This Century — On Earth and Beyond. He is a member of our Scientific Advisory Board.
“Science is advancing faster than ever, and on a broader front… But there is a dark side: new science can have unintended consequences; it empowers individuals to perpetrate acts of megaterror; even innocent errors could be catastrophic. The ‘downside’ from twenty-first century technology could be graver and more intractable than the threat of nuclear devastation that we have faced for decades.”
“If there were millions of independent fingers on the button of a Doomsday machine, then one person’s act of irrationality, or even one person’s error, could do us all in.”
“Even if all the world’s scientific academies agreed that some specific lines of inquiry had a disquieting ‘downside’ and all countries, in unison, imposed a formal prohibition, then how effectively could it be enforced? An international moratorium could certainly slow down particular lines of research, even if they couldn’t be stopped completely. When experiments are disallowed for ethical reasons, enforcement with ninety-nine percent effectiveness, or even just ninety percent, is far better than having no prohibition at all; but when experiments are exceedingly risky, enforcement would need to be close to one hundred percent effective to be reassuring: even one release of a lethal virus could be catastrophic, as could a nanotechnology disaster.
Despite all the efforts of law enforcers, millions of people use illicit drugs; thousands peddle them. In view of the failure to control drug smuggling or homicides, it is unrealistic to expect that when the genie is out of the bottle, we can ever be fully secure against bioerror and bioterror: risk would still remain that could not be eliminated except by measures that are themselves unpalatable, such as intrusive universal surveillance.”
“It is not inconceivable that physics could be dangerous too. Some experiments are designed to generate conditions more extreme than ever occur naturally. Nobody then knows exactly what will happen. Indeed, there would be no point in doing any experiments if their outcomes could be fully predicted in advance. Some theorists have conjectured that certain types of experiment could conceivably unleash a runaway process that destroyed not just us but Earth itself.”
“More ominously, there could be a crucial hurdle at our own present evolutionary stage, the state when intelligent life starts to develop technology. If so, the future development of life depends on whether humans survive this phase.”
“Suppose that we had a fateful decision that would determine whether the species might soon be extinguished, or else whether it would survive almost indefinitely. For instance, this might be the choice of whether to foster the first community away from Earth, which, once established, would spawn so many others that one would be guaranteed to survive.”
“Even a few pioneering groups, living independently of Earth, would offer a safeguard against the worst possible disaster — the foreclosure of intelligent life’s future through the extinction of all humankind.
The ever-present slight risk of a global catastrophe with a ‘natural’ cause will be greatly augmented by the risks stemming from twenty-first-century technology. Humankind will remain vulnerable so long as it stays confined here on Earth. Is it worth insuring against not just natural disasters but the probably much larger (and certainly growing) risk of human-induced catastrophes? Once self-sustaining communities exist away from Earth — on the Moon, on Mars, or freely floating in space — our species would be invulnerable to even the worst global disasters.”
“Once the threshold is crossed when there is a self-sustaining level of life in space, then life’s long-range future will be secure irrespective of any of the risks on Earth. Will this happen before our technical civilization disintegrates, leaving this as a might-have-been? Will the self-sustaining space communities be established before a catastrophe sets back the prospect of any such enterprise, perhaps foreclosing it forever? We live at what could be a defining moment for the cosmos, not just for our Earth.”
“What happens here on Earth, in this century, could conceivably make the difference between a near eternity filled with ever more complex and subtle forms of life and one filled with nothing but base matter.”
John Reid
Home Secretary for the United Kingdom.
“We are probably in the most sustained period of severe threat since the end of World War II.
While I am confident that the security services and police will deliver 100% effort and 100% dedication, they cannot guarantee 100% success.
Our security forces and the apparatus of the state provide a very necessary condition for defeating terrorism but can never be sufficient to do so on their own. Our common security will only be assured by a common effort from all sections of society.”
Adeo Ressi
Founding Executive Partner of Sophos Partners, LLC.
“There’s just no way to guarantee human survival unless we move off this planet.”
Glenn Reynolds
contributing editor of Tech Central Station, where his special feature on technology and public policy called “Reynolds’ Wrap” appears each week. He is also the creator of the popular blog Instapundit and author of An Army of Davids: How Markets and Technology Empower Ordinary People to Beat Big Media, Big Government, and Other Goliaths. He is a member of our Advisory Board.
“Over the long term, by which I mean the next century, not the next millennium, disaster may hold the edge over prevention: a nasty biological agent only has to get out once to devastate humanity, no matter how many times other such agents were contained previously.
In the short term, prevention and defense strategies make sense. But such strategies take you only so far. As Robert Heinlein once said, Earth is too fragile a basket to hold all of our eggs. We need to diversify, to create more baskets. Colonies on the moon, on Mars, in orbit, perhaps on asteroids and beyond…”
Condoleezza Rice
U.S. Secretary of State.
Tom Ridge
first U.S. Homeland Security Director.
“The general theme of it’s-not-a-matter-of-if-but-when is legitimate.”
Donald H. Rumsfeld
was U.S. Secretary of Defense.
Carl Sagan
was an American astronomer, planetologist, biologist, and popularizer of science and space research.
Marshall T. Savage
author of The Millennial Project: Colonizing the Galaxy in Eight Easy Steps.
“Perhaps advanced civilizations don’t use radio, or radar, or microwaves. Advanced technology can be invoked as an explanation for the absence of extraterrestrial radio signals. But it seems unlikely that their technology would leave no imprint anywhere in the electromagnetic spectrum. We have been compared to the aborigine who remains blissfully unaware of the storm of radio and TV saturating the airwaves around him. Presumably, the aliens use advanced means of communications which we cannot detect. What these means might be is, by definition, unknown, but they must be extremely exotic. We don’t detect K2 signals in the form of laser pulses, gamma rays, cosmic rays, or even neutrinos. Therefore the aliens must use some system we haven’t even imagined.
This argument, appealing though it is, cannot survive contact with Occam’s razor — in this case Occam’s machete. The evidence in hand is simply nothing — no signals. To explain the absence of signals in the presence of aliens, demands recourse to what is essentially magic. Unfortunately, the iron laws of logic demand that we reject such wishful thinking in favor of the simplest explanation which fits the data: No signals; no aliens.
The skies are thunderous in their silence; the Moon eloquent in its blankness; the aliens are conclusive in their absence. The extraterrestrials aren’t here. They’ve never been here. They’re never coming here. They aren’t coming because they don’t exist. We are alone.”
“Now is the watershed of Cosmic history. We stand at the threshold of the New Millennium. Behind us yawn the chasms of the primordial past, when this universe was a dead and silent place; before us rise the broad sunlit uplands of a living cosmos. In the next few galactic seconds, the fate of the universe will be decided. Life — the ultimate experiment — will either explode into space and engulf the star-clouds in a fire storm of children, trees, and butterfly wings; or Life will fail, fizzle, and gutter out, leaving the universe shrouded forever in impenetrable blankness, devoid of hope.
Teetering here on the fulcrum of destiny stands our own bemused species. The future of the universe hinges on what we do next. If we take up the sacred fire, and stride forth into space as the torchbearers of Life, this universe will be aborning. If we carry the green fire-brand from star to star, and ignite around each a conflagration of vitality, we can trigger a Universal metamorphosis. Because of us, the barren dusts of a million billion worlds will coil up into the pulsing magic forms of animate matter. Because of us, landscapes of radiation blasted waste, will be miraculously transmuted: Slag will become soil, grass will sprout, flowers will bloom, and forests will spring up in once sterile places. Ice, hard as iron, will melt and trickle into pools where starfish, anemones, and seashells dwell — a whole frozen universe will thaw and transmogrify, from howling desolation to blossoming paradise. Dust into Life; the very alchemy of God.
If we deny our awesome challenge; turn our backs on the living universe, and forsake our cosmic destiny, we will commit a crime of unutterable magnitude. Mankind alone has the power to carry out this fundamental change in the universe. Our failure would lead to consequences unthinkable. This is perhaps the first and only chance the universe will ever have to awaken from its long night and live. We are the caretakers of this delicate spark of Life. To let it flicker and die through ignorance, neglect, or lack of imagination is a horror too great to contemplate.”
Robert J. Sawyer
“the dean of Canadian science fiction” and a consultant to the Canadian federal government’s Department of Justice on what Canadian law should be in relation to biotechnology, stem-cell research, cloning, and the privacy of personal genetic information. He is a member of our Advisory Board.
“There’s a long-standing problem in astronomy called the Fermi Paradox, named for physicist Enrico Fermi who first proposed it in 1950. If the universe should be teeming with life, asked Fermi, then where are all the aliens? The question is even more vexing today: SETI, the search for extraterrestrial intelligence with radio telescopes, has utterly failed to turn up any sign of alien life forms. Why?
One chillingly likely possibility is that, as the ability to wreak damage on a grand scale becomes more readily available to individuals, soon enough just one malcontent, or one lunatic, will be able to destroy an entire world. Perhaps countless alien civilizations have already been wiped out by single terrorists who’d been left alone to work unmonitored in their private laboratories.”
NATO Secretary General Jaap de Hoop Scheffer
Brad Sherman
U.S. House of Representatives (Democrat – California).
“This technology [nanotechnology] is every bit as explosive as nuclear weapons.”
Ray Solomonoff
founder of the branch of Artificial Intelligence based on machine learning, prediction, and probability. He was on our Scientific Advisory Board until his death.
“The Lifeboat problem becomes more and more critical as our technology ‘progresses’.”
StrategyPage
offers comprehensive bite-size summaries of military news and affairs on the Internet. It provides inside data on how and why things happen.
Jill Tarter
an American astronomer best known for her work on the search for extraterrestrial intelligence (SETI). She is the former director of the Center for SETI Research and held the Bernard M. Oliver Chair for SETI at the SETI Institute. In 2002, Discover magazine recognized her as one of the 50 most important women in science. She is now a member of our Advisory Board.
“Your organization is inspiring and essential for all life on this planet.”
Ted Turner
American media visionary, philanthropist, and statesman.
“Hurricane Katrina drove home the staggering devastation that disasters — natural or man-made — can inflict. Meanwhile, July’s attacks on the London Underground reminded us terrorists can still strike major world cities. Now imagine the two joined together: terrorists, armed with weapons of mass destruction, unleashing Katrina-scale chaos and death in the heart of a U.S. city.”
“The risk of a Katrina-scale terrorist attack with Russian weapons is too critical to tolerate any delays to these crucial efforts. Congress must act and free us to meet what President Bush calls ‘the greatest threat before humanity today’.”
Neil deGrasse Tyson
Chairman of the Board of The Planetary Society.
US National Academy of Sciences
Upon the authority of the charter granted to it by the Congress in 1863, the Academy has a mandate that requires it to advise the federal government on scientific and technical matters.
Vernor Vinge
was a mathematician, computer scientist, and prophetic SF writer who predicted the Internet in 1981 and the Singularity in 1993.
Ken Wear
authored the site Rationallink.org and was on our Advisory Board until his death.
White House
US National Security Council.
White House official
speaking to the Washington Post.
Bob Woodward
has authored or coauthored eight No. 1 national nonfiction bestsellers, including four books on the presidency.
Jonathan Zittrain
cofounded the Berkman Center for Internet and Society at Harvard Law School and holds the Chair in Internet Governance and Regulation at the University of Oxford.