Monday, August 25, 2008

Threats to our Future

This post contains a list of what I consider to be the most serious threats to the future of humanity, as of today. It should be considered an incomplete and open-ended list; I'm certain that each of us can think of additional threats.

Note that I'm not considering threats to our civilization and/or way of life. There are many more of those. Rather, I want to limit this discussion to threats that can end all human life.

There are four main categories along two dimensions: the first dimension is natural disasters versus man-made ones; the second is things we can control versus things we can't. There may not be any entries in the list for "man-made disasters we can't control," so perhaps we should qualify that dimension as things we can't fix.

1) Natural Disasters we can't control/fix:

  • Nearby supernova, gamma-ray burst, or passing black hole: Nothing we can do about any of these, so don't worry about them.
  • Super Flare from the Sun: Our sun won't go nova or expand into a red giant for billions of years, but there is a possibility that it could have a major hiccup and blast the Earth with searing heat or sterilizing levels of radiation. Sea life would survive, as would anyone lucky enough to be in a submarine. It's too bad submarine crews tend to be all male; we also need women to save the species. Solution: co-ed submarine crews.
  • Large Igneous Event: These have caused mass extinctions in the past and might again in the future. Not much we can do here either, as long as we all live on the surface of the Earth.

2) Natural Disasters we can control/fix:

  • Snowball Earth: At least twice in the Earth's history, global cooling has been so extreme that the oceans froze over completely, wiping out nearly all surface life and likely collapsing atmospheric oxygen. The solution is simple: Global Warming.
  • Comet or Asteroid Impact: Millions of comets and asteroids are large enough to destroy all human life, and possibly all life, period, on the face of the Earth. Some of these will eventually strike the Earth; that is inevitable unless we take active steps to prevent such a catastrophe. The big one may be years or millennia away, but we may not get enough advance notice to act unless we start building the infrastructure now. We need a very well-funded Space Watch program to find these objects years before they might obliterate us; given enough warning, current technologies are likely sufficient to avert disaster. A back-of-the-envelope sense of the energies involved is sketched below.
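
For a rough sense of scale, here is the back-of-the-envelope sketch referenced above, in Python. The 10 km size, stony density, and 20 km/s impact velocity are my own illustrative assumptions (roughly dinosaur-killer scale), not figures from any survey:

```python
import math

# Back-of-the-envelope kinetic energy of a large impactor.
# All figures below are illustrative assumptions, not measured values.
diameter_m = 10_000        # ~10 km rocky asteroid, roughly dinosaur-killer scale
density_kg_m3 = 3_000      # typical density for a stony asteroid
velocity_m_s = 20_000      # ~20 km/s, a typical Earth-impact velocity

radius_m = diameter_m / 2
mass_kg = density_kg_m3 * (4.0 / 3.0) * math.pi * radius_m ** 3
energy_j = 0.5 * mass_kg * velocity_m_s ** 2
megatons = energy_j / 4.184e15   # 1 megaton of TNT = 4.184e15 joules

print(f"mass   ~ {mass_kg:.1e} kg")
print(f"energy ~ {energy_j:.1e} J  (~{megatons:.1e} megatons of TNT)")
```

Under those assumptions the impact delivers on the order of 75 million megatons of TNT, far beyond anything humanity can produce.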

3) Man-made disasters we can't control/fix: (things that we create but have no effective control over, and no natural defenses against)

  • Experiments gone wrong: While I'm a firm believer that nothing will go wrong when the LHC begins operation, I can't guarantee that every scientific experiment will turn out so well. If we knew the results in advance, we wouldn't need to run the experiment, now, would we? For example, if someone managed to create a nanometer-diameter black hole and drop it into the Earth, it would eat away at the planet from the inside, growing until it had consumed everything -- and there's not a damn thing we could do about it, even if we had hundreds or thousands of years before the disastrous end.
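
For scale, the Schwarzschild relation tells us how much mass a nanometer-wide black hole implies. A minimal sketch in Python; the 1 nm figure comes from the bullet above, and the constants are standard:

```python
# Mass of a black hole whose Schwarzschild diameter is 1 nanometer.
# From r_s = 2*G*M / c^2, so M = r_s * c^2 / (2*G).
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s

diameter_m = 1e-9              # 1 nm, the size used above
r_s = diameter_m / 2           # Schwarzschild radius
mass_kg = r_s * c ** 2 / (2 * G)

print(f"mass ~ {mass_kg:.1e} kg")   # ~3e17 kg, comparable to a sizable asteroid
```

A black hole that massive would take vastly longer than the age of the universe to evaporate via Hawking radiation (if the standard estimates hold), which is why this scenario, however implausible its creation, really would be unfixable.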

4) Man-made disasters we can control/fix.

  • Nanotech gone bad: While it may be remotely possible to create a self-replicating nanite that reproduces until every available resource is consumed, burying humanity in three feet of gray goo, this is so difficult that I'm not worried about it. At present we can't build a self-powered machine that lives off its environment, nor a complex machine of any size, small or large, that can self-replicate. In any case, this problem is well described, and guidelines exist to ensure that any replicating machine has limits built in (such as a dependence on a critical and rare raw material).
  • Strong, malevolent AI (see The (likely coming) Technological Singularity) poses a very real threat, but one that may be difficult to realize, and one we could choose to avoid by limiting computers to sub-human intelligence. Even if such an AI existed, negotiation and co-existence remain possible so long as its intelligence and capabilities stay within the grasp of human understanding. But once we build a machine significantly more intelligent than any human, we risk losing control. We become the pets, to be neutered and/or put down at the AI's convenience.
  • Biotech Terrorism: To me, this is the thing to worry about. It is entirely within the realm of possibility that a small group, or even an individual, could tailor a virus or bacterium into an airborne disease of unparalleled lethality, one immune to our natural defenses, one that could wipe us all out. My friend Jeff Carlson has written an excellent techno-thriller (Plague Year) about an engineered viral organism that kills nearly all warm-blooded life on Earth, and the most unbelievable part is that it was designed with an exploitable weakness that lets some of us survive. What if the designers had made a mistake and the self-destruct mechanism failed, or a minor mutation disabled it?

Did you notice the traditional really big threats that I don't think endanger humanity?

  • Nuclear War (and the threatened Nuclear Winter): Contrary to the hype we've all heard, we do not have enough nuclear weapons to destroy humanity, or even to create a nuclear winter; many natural disasters release far more energy and far more pollution (see the rough comparison after this list). Yes, we have the capability to destroy civilization as we know it, and even to kill more than 90% of humanity. But some of us would survive, live on, and rebuild civilization. An all-out nuclear war would merely set us back a few thousand years.
  • Global Warming: Warming the Earth by 5 or 10 degrees would eventually melt the ice caps, raise the oceans by 200+ feet, and drown coastal cities, states, even entire nations. It would radically disrupt the ecology, and hundreds or thousands of species would face extinction. The cost of coping would run well into the trillions of dollars. But in the big picture, this is an inconvenience, a forced change. Human casualties would be in the noise, likely fewer than the toll from malaria.
  • Overpopulation: Another serious problem, but overpopulation has well-known natural controls: starvation and disease. Once half of everyone is dead, we no longer have a problem. Works for lemmings, too. The species survives. Note that the opposite problem, underpopulation, would be far more serious if it happened, because the loss of genetic diversity is difficult to recover from. We'll lose some big cats (such as cheetahs) because they lack the genetic diversity to survive a nasty disease.
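
To back the nuclear-war point above with rough numbers, here is a sketch comparing the energy of the world's entire arsenal to the single large impact computed earlier. The warhead count and average yield are ballpark assumptions on my part, not official figures:

```python
# Rough comparison: total nuclear arsenal vs. one dinosaur-killer impact.
# Warhead count and average yield are ballpark assumptions, not official figures.
warheads = 10_000            # order-of-magnitude global stockpile
avg_yield_mt = 0.5           # assumed average warhead yield, in megatons

arsenal_mt = warheads * avg_yield_mt   # ~5,000 megatons total
impact_mt = 7.5e7                      # ~10 km impactor, from the earlier sketch

print(f"arsenal ~ {arsenal_mt:,.0f} Mt")
print(f"impact  ~ {impact_mt:,.0f} Mt  ({impact_mt / arsenal_mt:,.0f}x the arsenal)")
```

Under those assumptions, a single large impact releases on the order of ten thousand times the energy of every warhead on Earth combined; terrible as it would be, an all-out exchange simply isn't in the extinction class.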

Please propose your own threats to the future of humanity.
