Quotes

Read some inspirational quotes!

9/11 Commission

an independent, bipartisan commission created by congressional legislation and the signature of President George W. Bush in late 2002. It is chartered to prepare a full and complete account of the circumstances surrounding the September 11, 2001 terrorist attacks, including preparedness for and the immediate response to the attacks.

“The greatest danger of another catastrophic attack in the United States will materialize if the world’s most dangerous terrorists acquire the world’s most dangerous weapons.”

Frank W. Abagnale

was the master criminal whose autobiography Catch Me If You Can was turned into a film by Steven Spielberg starring Leonardo DiCaprio and Tom Hanks.

“It is important to remember that technology breeds crime; it always has, it always will.”

Jamal Ahmidan

was a principal figure behind the 3/11 attacks in Spain, which left about 2,000 dead or injured and brought in a new government that withdrew from Iraq.

“We change states, we destroy others with Allah’s help and even decide the future of the world’s economy. We won’t accept being mere passive agents in this world.”

Abdullah Ahmad Badawi

Prime Minister of Malaysia and Chairman of the 57-nation Organization of the Islamic Conference.

“The whole world is getting very disturbed. The frequency (of terrorist attacks) seems to be mounting.”

Nick Bostrom

winner of a Templeton Foundation grant, cofounder of the World Transhumanist Association, and director of the Future of Humanity Institute at the University of Oxford.

“At the present rate of scientific and technological progress, there is a real chance that we will have molecular manufacturing or superhuman artificial intelligence well within the first half of this century. Now, this creates some considerable promises and dangers. In a worst-case scenario, intelligent life could go extinct.”

“For example, if someone thought that a century-long ban on new technology were the only way to avoid a nanotechnological doomsday, she could still classify as a transhumanist, provided her opinion did not stem from a general technophobia … but was the result of a rational deliberation of the likely consequences of the possible policies.”

“The technology to produce a destructive nanobot seems considerably easier to develop than the technology to create an effective defense against such an attack (a global nanotech immune system, an ‘active shield’).”

“Our approach to existential risks cannot be one of trial-and-error. There is no opportunity to learn from errors. The reactive approach — see what happens, limit damages, and learn from experience — is unworkable. Rather, we must take a proactive approach. This requires foresight to anticipate new types of threats and a willingness to take decisive preventive action and to bear the costs (moral and economic) of such actions.”

“The Fermi Paradox refers to the question mark that hovers over the data point that we have seen no signs of extraterrestrial life. This tells us that it is not the case that life evolves on a significant fraction of Earth-like planets and proceeds to develop advanced technology, using it to colonize the universe in ways that would have been detected with our current instrumentation. There must be (at least) one Great Filter — an evolutionary step that is extremely improbable — somewhere on the line between Earth-like planet and colonizing-in-detectable-ways civilization. If the Great Filter isn’t in our past, we must fear it in our (near) future. Maybe nearly every civilization that develops a certain level of technology causes its own extinction.”

Arthur C. Clarke

prophetic science fiction author who in 1945 predicted a world linked by geostationary satellites.

“This terrorism is a frightful danger and it is hard to see how we can get complete protection from it.”

DEBKAfile’s

War Diary is included in the U.S. Library of Congress’s historic Internet collection on the 2003 Iraq War. This online news source offers in-depth coverage of terrorism, security, political analysis, and espionage, and is available in English and Hebrew.

“While the Americans focus on their war against insurgents in Iraq and the Israelis are caught up in fighting Palestinian terrorists, Al Qaeda is drawing a ring of fire around both.”

Eric Drexler

founder of the Foresight Institute and of the nanotechnology movement.

“Foresight’s concern for the long-term potential abuse of nanotechnology has been confirmed and strengthened. Those who abuse technology — from airliners to anthrax — for destructive ends do exist and are unlikely to stop before full nanotech arrives, with all its power for both good and ill.”

“It would be easy to say, ‘let government or industry figure out how to prevent nanotech misuse,’ but the events of Sept. 11 and afterwards show this to be naive. (The current attempt to make airliners safer by keeping all sharp objects off the plane is laughable — a pair of glass eyeglasses is easily broken and used instead. The authorities dealing with the anthrax attacks expressed surprise that anthrax could leak from “sealed” envelopes — when anyone who’s ever licked one can see that the adhesive doesn’t extend to the flap’s edges.) Outside perhaps the military, government doesn’t do too well at anticipating emergencies and planning policies for them — their incentives are too political, and their time horizons are too short…”

“If extraterrestrial civilizations exist, and if even a small fraction were to behave as all life on Earth does, then they should by now have spread across space.”

“By now, after hundreds of millions of years, even widely scattered civilizations would have spread far enough to meet each other, dividing all of space among them.”

“An advanced civilization pushing its ecological limits would, almost by definition, not waste both matter and energy. Yet we see such waste in all directions, as far as we can see spiral galaxies: their spiral arms hold dust clouds made of wasted matter, backlit by wasted starlight… The idea that humanity is alone in the visible universe is consistent with what we see in the sky… Thus for now, and perhaps forever, we can make plans for our future without concern for limits imposed by other civilizations.”

Bill Frist

U.S. Senate Majority Leader.

“Like everyone else, politicians tend to look away from danger, to hope for the best, and pray that disaster will not arrive on their watch even as they sleep through it. This is so much a part of human nature that it often goes unchallenged.

But we will not be able to sleep through what is likely coming soon — a front of unchecked and virulent epidemics, the potential of which should rise above your every other concern. For what the world now faces, it has not seen even in the most harrowing episodes of the Middle Ages or the great wars of the last century…

No intelligence agency, no matter how astute, and no military, no matter how powerful and dedicated, can assure that a few technicians of middling skill using a few thousand dollars’ worth of readily available equipment in a small and apparently innocuous setting cannot mount a first-order biological attack.

It’s possible today to synthesize virulent pathogens from scratch, or to engineer and manufacture prions that, introduced undetectably over time into a nation’s food supply, would after a long delay afflict millions with a terrible and often fatal disease. It’s a new world…

So what must we do?

I propose an unprecedented effort — a “Manhattan Project for the 21st Century” — not with the goal of creating a destructive new weapon, but to defend against destruction wreaked by infectious disease and biological weapons…

This is a bold vision. But it is the kind of thing that, once accomplished, is done. And it is the kind of thing that calls out to be done — and that, if not done, will indict us forever in the eyes of history.

In diverting a portion of our vast resources to protect nothing less than our lives, the lives of our children, and the life of our civilization, many benefits other than survival would follow in train — not least the satisfaction of having done right.”

Rudolph Giuliani

mayor of New York City when it was attacked on 9/11.

“The most dangerous situation is where you’re facing peril but you’re not aware of it.”

Julian Haight

president of SpamCop.net, the premier spam reporting service.

“I’m getting bombed off the face of the Earth and no one cares.”

Robert A. Heinlein

was an influential and controversial science fiction author. The English language absorbed several words from his fiction, including “grok”, meaning “to understand so thoroughly that the observer becomes part of the observed.”

“The Earth is just too small and fragile a basket for the human race to keep all its eggs in.”

Ray Kurzweil

prophetic author of the 1990 book The Age of Intelligent Machines, in which he correctly predicted advances in AI. He was also the principal developer of the first omni-font optical character recognition system, the first print-to-speech reading machine for the blind, the first CCD flat-bed scanner, and the first commercially marketed large-vocabulary speech recognition software. He is a member of the U.S. Army Science Advisory Group, our 2005 Guardian Award winner, and a member of our Scientific Advisory Board.

“…the means and knowledge will soon exist in a routine college bioengineering lab (and already exist in more sophisticated labs) to create unfriendly pathogens more dangerous than nuclear weapons.”

“I advocate a one hundred billion dollar program to accelerate the development of technologies to combat biological viruses.”

“We have an existential threat now in the form of the possibility of a bioengineered malevolent biological virus. With all the talk of bioterrorism, the possibility of a bioengineered bioterrorism agent gets little and inadequate attention. The tools and knowledge to create a bioengineered pathogen are more widespread than the tools and knowledge to create an atomic weapon, yet it could be far more destructive. I’m on the Army Science Advisory Group (a board of five people who advise the Army on science and technology), and the Army is the institution responsible for the nation’s bioterrorism protection. Without revealing anything confidential, I can say that there is acute awareness of these dangers, but there is neither the funding nor national priority to address them in an adequate way.”

“The decision by the U.S. Department of Health & Human Services to publish the full genome of the 1918 influenza virus on the Internet in the GenBank database is extremely dangerous and immediate steps should be taken to remove this data.”

“Grey goo certainly represents power — destructive power — and if such an existential threat were to prevail, it would represent a catastrophic loss… Although the existential nanotechnology danger is not yet at hand, denial is not the appropriate strategy.”

“A self-replicating pathogen, whether biological or nanotechnology based, could destroy our civilization in a matter of days or weeks.”

“We can envision a more insidious possibility. In a two-phased attack, the nanobots take several weeks to spread throughout the biomass but use up an insignificant portion of the carbon atoms, say one out of every thousand trillion (10^15). At this extremely low level of concentration, the nanobots would be as stealthy as possible. Then, at an ‘optimal’ point, the second phase would begin with the seed nanobots expanding rapidly in place to destroy the biomass. For each seed nanobot to multiply itself a thousand trillionfold would require only about 50 binary replications, or about 90 minutes.”
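
The replication arithmetic in this quote checks out: 2^50 is roughly 1.1 × 10^15, about a thousand trillion, and 50 doublings at roughly 108 seconds each take about 90 minutes. A minimal sketch of that check follows (illustrative only; the per-doubling time is an assumption inferred from the quote, not a figure Kurzweil states):

    # Sanity check of the replication arithmetic quoted above (illustrative only).
    replications = 50
    copies = 2 ** replications            # 1,125,899,906,842,624, about a thousand trillion
    seconds_per_replication = 108         # assumption: 90 minutes spread evenly over 50 doublings
    total_minutes = replications * seconds_per_replication / 60
    print(f"{copies:.3e} copies after {replications} doublings, ~{total_minutes:.0f} minutes")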

“Recall that biological evolution is measured in millions and billions of years. So if there are other civilizations out there, they would be spread out in terms of development by huge spans of time. The SETI assumption implies that there should be billions of ETIs (among all the galaxies), so there should be billions that lie far ahead of us in their technological progress. Yet it takes only a few centuries at most from the advent of computation for such civilizations to expand outward at at least light speed. Given this, how can it be that we have not noticed them? The conclusion I reach is that it is likely (although not certain) that there are no such other civilizations.”

“To this day, I remain convinced of this basic philosophy: no matter what quandaries we face — business problems, health issues, relationship difficulties, as well as the great scientific, social, and cultural challenges of our time — there is an idea that can enable us to prevail. Furthermore, we can find that idea. And when we find it, we need to implement it. My life has been shaped by this imperative. The power of an idea — this is itself an idea.”

Ken Livingstone

mayor of London, said the following after Al Qaeda attacked Spain (and before they attacked London itself in the worst attack on the city since World War II).

“It would be miraculous if, with all the terrorist resources arranged against us, terrorists did not get through, and given that some are prepared to give their own lives, it would be inconceivable that someone does not get through to London.”

Kelvin G. Lynn

Director of the Center for Materials Research at Washington State University. Dr. Lynn has developed an “antimatter trap” that the U.S. Air Force is considering as the basis of an antimatter bomb that would be over 1,000 times as powerful as an H-bomb.

“I think we need to get off this planet, because I’m afraid we’re going to destroy it.”

Baron Martin Rees

Royal Society Professor at Cambridge University, a Fellow of King’s College, and the U.K.’s Astronomer Royal. The winner of the 2001 Cosmology Prize of the Peter Gruber Foundation and our 2004 Guardian Award, he has published numerous academic papers and books, including Our Final Hour: A Scientist’s Warning: How Terror, Error, and Environmental Disaster Threaten Humankind’s Future In This Century — On Earth and Beyond. He is a member of our Scientific Advisory Board.

“Science is advancing faster than ever, and on a broader front… But there is a dark side: new science can have unintended consequences; it empowers individuals to perpetrate acts of megaterror; even innocent errors could be catastrophic. The ‘downside’ from twenty-first century technology could be graver and more intractable than the threat of nuclear devastation that we have faced for decades.”

“If there were millions of independent fingers on the button of a Doomsday machine, then one person’s act of irrationality, or even one person’s error, could do us all in.”

“Biotechnology is advancing rapidly, and by 2020 there will be thousands — even millions — of people with the capability to cause a catastrophic biological disaster. My concern is not only organized terrorist groups, but individual weirdos with the mindset of the people who now design computer viruses. Even if all nations impose effective regulations on potentially dangerous technologies, the chance of an active enforcement seems to me as small as in the case of the drug laws.”

“We can ask of any innovation whether its potential is so scary that we should be inhibited in pressing on with it, or at least impose some constraints. Nanotechnology, for instance, is likely to transform medicine, computers, surveillance, and other practical areas, but it might advance to a stage at which a replicator, with its associated dangers, became technically feasible. There would then be the risk, as there now is with biotechnology, of a catastrophic ‘release’ (or that the technique could be used as a ‘suicide weapon’).”

“To put effective brakes on a field of research would require international consensus. If one country alone imposed regulations, the most dynamic researchers and enterprising companies would simply move to another country, something that is happening already in stem cell research. And even if all governments agreed to halt research in a particular field, the chances of effective enforcement are slim.”

“Even if all the world’s scientific academics agreed that some specific lines of inquiry had a disquieting ‘downside’ and all countries, in unison, imposed a formal prohibition, then how effectively could it be enforced? An international moratorium could certainly slow down particular lines of research, even if they couldn’t be stopped completely. When experiments are disallowed for ethical reasons, enforcement with ninety-nine percent effectiveness, or even just ninety percent, is far better than having no prohibition at all; but when experiments are exceedingly risky, enforcement would need to be close to one hundred percent effective to be reassuring: even one release of a lethal virus could be catastrophic, as could a nanotechnology disaster.

Despite all the efforts of law enforcers, millions of people use illicit drugs; thousands peddle them. In view of the failure to control drug smuggling or homicides, it is unrealistic to expect that when the genie is out of the bottle, we can ever be fully secure against bioerror and bioterror: risk would still remain that could not be eliminated except by measures that are themselves unpalatable, such as intrusive universal surveillance.”

“It is not inconceivable that physics could be dangerous too. Some experiments are designed to generate conditions more extreme than ever occur naturally. Nobody then knows exactly what will happen. Indeed, there would be no point in doing any experiments if their outcomes could be fully predicted in advance. Some theorists have conjectured that certain types of experiment could conceivably unleash a runaway process that destroyed not just us but Earth itself.”

“More ominously, there could be a crucial hurdle at our own present evolutionary stage, the stage when intelligent life starts to develop technology. If so, the future development of life depends on whether humans survive this phase.”

“Suppose that we had a fateful decision that would determine whether the species might soon be extinguished, or else whether it would survive almost indefinitely. For instance, this might be the choice of whether to foster the first community away from Earth, which, once established, would spawn so many others that one would be guaranteed to survive.”

“Even a few pioneering groups, living independently of Earth, would offer a safeguard against the worst possible disaster — the foreclosure of intelligent life’s future through the extinction of all humankind.

The ever-present slight risk of a global catastrophe with a ‘natural’ cause will be greatly augmented by the risks stemming from twenty-first-century technology. Humankind will remain vulnerable so long as it stays confined here on Earth. Is it worth insuring against not just natural disasters but the probably much larger (and certainly growing) risk of human-induced catastrophes? Once self-sustaining communities exist away from Earth — on the Moon, on Mars, or freely floating in space — our species would be invulnerable to even the worst global disasters.”

“Once the threshold is crossed when there is a self-sustaining level of life in space, then life’s long-range future will be secure irrespective of any of the risks on Earth. Will this happen before our technical civilization disintegrates, leaving this as a might-have-been? Will the self-sustaining space communities be established before a catastrophe sets back the prospect of any such enterprise, perhaps foreclosing it forever? We live at what could be a defining moment for the cosmos, not just for our Earth.”

“What happens here on Earth, in this century, could conceivably make the difference between a near eternity filled with ever more complex and subtle forms of life and one filled with nothing but base matter.”

Carl Sagan

American astronomer, planetologist, biologist, and popularizer of science and space research.

“All civilizations become either spacefaring or extinct.”

Marshall T. Savage

author of The Millennial Project: Colonizing The Galaxy In Eight Easy Steps.

“Perhaps advanced civilizations don’t use radio, or radar, or microwaves. Advanced technology can be invoked as an explanation for the absence of extraterrestrial radio signals. But it seems unlikely that their technology would leave no imprint anywhere in the electromagnetic spectrum. We have been compared to the aborigine who remains blissfully unaware of the storm of radio and TV saturating the airwaves around him. Presumably, the aliens use advanced means of communications which we cannot detect. What these means might be is, by definition, unknown, but they must be extremely exotic. We don’t detect K2 signals in the form of laser pulses, gamma rays, cosmic rays, or even neutrinos. Therefore the aliens must use some system we haven’t even imagined.

This argument, appealing though it is, cannot survive contact with Occam’s razor — in this case Occam’s machete. The evidence in hand is simply nothing — no signals. To explain the absence of signals in the presence of aliens, demands recourse to what is essentially magic. Unfortunately, the iron laws of logic demand that we reject such wishful thinking in favor of the simplest explanation which fits the data: No signals; no aliens.

The skies are thunderous in their silence; the Moon eloquent in its blankness; the aliens are conclusive in their absence. The extraterrestrials aren’t here. They’ve never been here. They’re never coming here. They aren’t coming because they don’t exist. We are alone.”

“Now is the watershed of Cosmic history. We stand at the threshold of the New Millennium. Behind us yawn the chasms of the primordial past, when this universe was a dead and silent place; before us rise the broad sunlit uplands of a living cosmos. In the next few galactic seconds, the fate of the universe will be decided. Life — the ultimate experiment — will either explode into space and engulf the star-clouds in a fire storm of children, trees, and butterfly wings; or Life will fail, fizzle, and gutter out, leaving the universe shrouded forever in impenetrable blankness, devoid of hope.

Teetering here on the fulcrum of destiny stands our own bemused species. The future of the universe hinges on what we do next. If we take up the sacred fire, and stride forth into space as the torchbearers of Life, this universe will be aborning. If we carry the green fire-brand from star to star, and ignite around each a conflagration of vitality, we can trigger a Universal metamorphosis. Because of us, the barren dusts of a million billion worlds will coil up into the pulsing magic forms of animate matter. Because of us, landscapes of radiation blasted waste, will be miraculously transmuted: Slag will become soil, grass will sprout, flowers will bloom, and forests will spring up in once sterile places. Ice, hard as iron, will melt and trickle into pools where starfish, anemones, and seashells dwell — a whole frozen universe will thaw and transmogrify, from howling desolation to blossoming paradise. Dust into Life; the very alchemy of God.

If we deny our awesome challenge; turn our backs on the living universe, and forsake our cosmic destiny, we will commit a crime of unutterable magnitude. Mankind alone has the power to carry out this fundamental change in the universe. Our failure would lead to consequences unthinkable. This is perhaps the first and only chance the universe will ever have to awaken from its long night and live. We are the caretakers of this delicate spark of Life. To let it flicker and die through ignorance, neglect, or lack of imagination is a horror too great to contemplate.”

Brad Sherman

U.S. House of Representatives (Democrat – California).

“This technology [nanotechnology] is every bit as explosive as nuclear weapons.”

Ray Solomonoff

founder of the branch of Artificial Intelligence based on machine learning, prediction, and probability. He was on our Scientific Advisory Board until his death.

“The Lifeboat problem becomes more and more critical as our technology ‘progresses’.”

Jill Tarter

An American astronomer best known for her work on the search for extraterrestrial intelligence (SETI). Jill is the former director of the Center for SETI Research, holding the Bernard M. Oliver Chair for SETI at the SETI Institute. In 2002, Discover magazine recognized her as one of the 50 most important women in science. She is now a member of our Advisory Board.

“Your organization is inspiring and essential for all life on this planet.”