Title: Global catastrophic risk  
Author: World Heritage Encyclopedia
Language: English
Subject: Pole shift hypothesis, Earth Changes, Human extinction, Future of Humanity Institute, Societal impact of nanotechnology
Collection: Doomsday Scenarios, Eschatology, Futurology, Hazards, Human Extinction, Risk Analysis
Publisher: World Heritage Encyclopedia

Global catastrophic risk

Artist's impression of a major asteroid impact. An asteroid impact with the energy of a billion atomic bombs may have caused the extinction of the dinosaurs.[1]

A global catastrophic risk is a hypothetical future event with the potential to inflict serious damage to human well-being on a global scale.[2] Some such events could destroy or cripple modern civilization. Other, even more severe, scenarios threaten permanent human extinction.[3] These are referred to as existential risks.

Natural disasters, such as supervolcanic eruptions and asteroid impacts, pose such risks if sufficiently powerful. Human-caused, or anthropogenic, events could also threaten the survival of intelligent life on Earth, including catastrophic global warming,[4] nuclear war, and bioterrorism. The Future of Humanity Institute believes that human extinction is more likely to result from anthropogenic than from natural causes.[5][6]

Researchers find it difficult to study human extinction directly, since humanity has never been destroyed before.[7] While this does not mean it never will be, it does make modelling existential risks difficult, due in part to survivorship bias.


Contents

  • 1 Classifications of risk
  • 2 Probability of an existential catastrophe
    • 2.1 Fermi paradox
  • 3 Moral importance of existential risk
  • 4 Potential sources of risk
    • 4.1 Anthropogenic
      • 4.1.1 Artificial intelligence
      • 4.1.2 Nanotechnology
      • 4.1.3 Biotechnology
      • 4.1.4 Warfare and mass destruction
      • 4.1.5 Global warming
      • 4.1.6 Ecological disaster
      • 4.1.7 World population and agricultural crisis
      • 4.1.8 Experimental technology accident
    • 4.2 Non-anthropogenic
      • 4.2.1 Global pandemic
      • 4.2.2 Climate change
        • Ice age
      • 4.2.3 Volcanism
      • 4.2.4 Megatsunami
      • 4.2.5 Geomagnetic reversal
      • 4.2.6 Asteroid impact
      • 4.2.7 Extraterrestrial invasion
      • 4.2.8 Cosmic threats
  • 5 Discredited scenarios
  • 6 Precautions and prevention
    • 6.1 Existential risk reduction organizations
    • 6.2 Global catastrophic risk reduction organizations
  • 7 See also
  • 8 Notes
  • 9 References
  • 10 Further reading
  • 11 External links

Classifications of risk

Scope/intensity grid from Bostrom's paper "Existential Risk Prevention as Global Priority"[8]

The philosopher Nick Bostrom classifies risks according to their scope and intensity.[6] He considers risks that are at least global in scope and "endurable" in intensity to be global catastrophic risks. Those that are at least pan-generational (affecting all future generations) in scope and "crushing" in intensity are classified as existential risks. While a global catastrophic risk may kill the vast majority of life on earth, humanity could still potentially recover. An existential risk, on the other hand, is one that either destroys humanity entirely or prevents any chance of civilization recovering. Bostrom considers existential risks to be far more significant.[9]

Bostrom identifies four types of existential risk. "Bangs" are sudden catastrophes, which may be accidental or deliberate; he considers the most likely sources of bangs to be malicious use of nanotechnology, nuclear war, and the possibility that the universe is a simulation that will end. "Crunches" are scenarios in which humanity survives but civilization is irreversibly destroyed; the most likely causes, he believes, are exhaustion of natural resources, a stable global government that prevents technological progress, or dysgenic pressures that lower average intelligence. "Shrieks" are undesirable futures: for example, a single mind that enhances its powers by merging with a computer might come to dominate human civilization, locking in a bad outcome. Bostrom considers this the most likely shriek, followed by a flawed superintelligence and a repressive totalitarian regime. "Whimpers" are the gradual decline of human civilization or of current values; the most likely cause, he thinks, would be evolution changing moral preferences, followed by extraterrestrial invasion.[3]

Similarly, in Catastrophe: Risk and Response, Richard Posner singles out and groups together events that bring about "utter overthrow or ruin" on a global, rather than a "local or regional" scale. Posner singles out such events as worthy of special attention on cost-benefit grounds because they could directly or indirectly jeopardize the survival of the human race as a whole.[10] Posner's events include meteor impacts, runaway global warming, grey goo, bioterrorism, and particle accelerator accidents.

Probability of an existential catastrophe

The following are examples of individuals and institutions that have made probability predictions about existential events. Some risks, such as that from asteroid impact, with a one-in-a-million chance of causing humanity's extinction in the next century,[11] have had their probabilities estimated with considerable precision (though some scholars claim the actual rate of large impacts could be much higher than originally calculated).[12] Similarly, the frequency of volcanic eruptions large enough to cause catastrophic climate change, like the Toba eruption, which may have nearly caused the extinction of the human race,[13] has been estimated at about one per 50,000 years.[14] The relative danger posed by other threats is much more difficult to calculate. In 2008, a group of "experts on different global catastrophic risks" at the Global Catastrophic Risk Conference at the University of Oxford suggested a 19% chance of human extinction over the next century. However, the conference report cautions that the method used to average responses to the informal survey is suspect, owing to its treatment of non-responses. The probabilities estimated for various causes are summarized below.

Risk                               Probability of human extinction before 2100
Molecular nanotechnology weapons   5%
Superintelligent AI                5%
Wars                               4%
Engineered pandemic                2%
Nuclear war                        1%
Nanotechnology accident            0.5%
Natural pandemic                   0.05%
Nuclear terrorism                  0.03%

Table source: Future of Humanity Institute, 2008.[15]
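If, purely for illustration, the surveyed risks are treated as independent (an assumption the survey itself does not make), the table's entries can be combined into a single figure. The sketch below yields roughly 16%, below the survey's headline 19%, which also covered risks not itemized in the table:

```python
# Illustrative only: combine the FHI survey's per-risk extinction
# probabilities under an (unrealistic) independence assumption.
risks = {
    "Molecular nanotechnology weapons": 0.05,
    "Superintelligent AI": 0.05,
    "Wars": 0.04,
    "Engineered pandemic": 0.02,
    "Nuclear war": 0.01,
    "Nanotechnology accident": 0.005,
    "Natural pandemic": 0.0005,
    "Nuclear terrorism": 0.0003,
}

# P(at least one catastrophe) = 1 - product of per-risk survival probabilities
p_survive = 1.0
for p in risks.values():
    p_survive *= 1.0 - p
p_any = 1.0 - p_survive
print(f"Combined extinction probability by 2100: {p_any:.1%}")  # 16.4%
```

The gap between this naive combination and the survey's 19% total is one illustration of why the report warns against reading the averaged responses too literally.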

There are significant methodological challenges in estimating these risks with precision. Most attention has been given to risks to human civilization over the next 100 years, but forecasting for this length of time is difficult. The types of threats posed by nature may prove relatively constant, though new risks could be discovered. Anthropogenic threats, however, are likely to change dramatically with the development of new technology; while volcanoes have been a threat throughout history, nuclear weapons have only been an issue since the 20th century. Historically, the ability of experts to predict the future over these timescales has proved very limited. Man-made threats such as nuclear war or nanotechnology are harder to predict than natural threats, due to the inherent methodological difficulties in the social sciences. In general, it is hard to estimate the magnitude of the risk from this or other dangers, especially as both international relations and technology can change rapidly.

Existential risks pose unique challenges to prediction, even more than other long-term events, because of observation selection effects. Unlike with most events, the failure of a complete extinction event to occur in the past is not evidence against their likelihood in the future, because every world that has experienced such an extinction event has no observers, so regardless of their frequency, no civilization observes existential risks in its history.[7] These anthropic issues can be avoided by looking at evidence that does not have such selection effects, such as asteroid impact craters on the Moon, or directly evaluating the likely impact of new technology.[8]
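The crater-based approach can be made concrete. Treating large impacts as a Poisson process, with the roughly once-per-500,000-years rate for 1 km asteroids quoted later in this article (the rate itself is the only input assumption), gives near-term probabilities:

```python
import math

# Assume >=1 km asteroid impacts follow a Poisson process with a mean
# interval of ~500,000 years (a figure quoted in this article).
rate_per_year = 1.0 / 500_000

def impact_probability(years: float) -> float:
    """P(at least one impact within the given time window)."""
    return 1.0 - math.exp(-rate_per_year * years)

print(f"Next century: {impact_probability(100):.4%}")       # 0.0200%
print(f"Next 10,000 years: {impact_probability(10_000):.2%}")
```

Because the rate comes from physical evidence (craters) rather than from humanity's survival record, it sidesteps the observation selection effect described above.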

Fermi paradox

In 1950, the Italian physicist Enrico Fermi asked why humans had not yet encountered extraterrestrial civilizations: "Where is everybody?"[16] Given the age of the universe and its vast number of stars, unless the Earth is very atypical, extraterrestrial life should be common, yet there is no evidence of extraterrestrial civilizations. This is known as the Fermi paradox.

One proposed resolution, though not a widely accepted one (aside from the possibility that extraterrestrial life simply does not exist), is existential catastrophe: other potentially intelligent civilizations may have been wiped out before humans could find them, or before they could find Earth.[7][17][18]
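The paradox can be made quantitative with the Drake equation. In the sketch below every parameter value is an illustrative assumption, not a measured quantity; the point is how strongly the expected number of detectable civilizations depends on L, the average civilization lifetime, which is exactly where existential risk enters:

```python
# Drake-equation sketch; all default parameter values are illustrative
# assumptions, chosen only to show the sensitivity to L.
def drake(R_star=1.5,   # star formation rate (stars/year)
          f_p=0.9,      # fraction of stars with planets
          n_e=0.5,      # habitable planets per planetary system
          f_l=0.1,      # fraction of those developing life
          f_i=0.1,      # fraction of those developing intelligence
          f_c=0.1,      # fraction releasing detectable signals
          L=10_000):    # years a civilization remains detectable
    """Expected number of detectable civilizations in the galaxy."""
    return R_star * f_p * n_e * f_l * f_i * f_c * L

print(drake(L=1_000_000))  # long-lived civilizations: 675
print(drake(L=200))        # short-lived civilizations: 0.135
```

A short L, the "civilizations destroy themselves" resolution, reconciles a life-friendly galaxy with the observed silence.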

Moral importance of existential risk

Some scholars have strongly favored reducing existential risk on the grounds that it greatly benefits future generations. Derek Parfit argues that extinction would be a great loss because our descendants could potentially survive for a billion years before the expansion of the Sun makes the Earth uninhabitable.[19][20] Bostrom argues that there is even greater potential in colonizing space. If future humans colonize space, they may be able to support a very large number of people on other planets, potentially lasting for trillions of years.[9] Therefore, reducing existential risk by even a small amount would have a very significant impact on the expected number of people that will exist in the future.
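The underlying reasoning is an expected-value calculation. A toy version, with a deliberately modest population assumption (far below Bostrom's space-colonization estimates):

```python
# Toy expected-value calculation behind the "reduce existential risk"
# argument. The population figure is an illustrative assumption.
future_people = 1e16      # assumed potential future lives
risk_reduction = 1e-6     # shave one millionth off extinction probability

expected_lives_saved = future_people * risk_reduction
print(f"Expected future lives saved: {expected_lives_saved:.0e}")  # 1e+10
```

Even under these conservative inputs, a one-in-a-million reduction in extinction risk is worth ten billion expected lives, which is the force of Bostrom's argument.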

Little has been written arguing against these positions, though some scholars would disagree; in particular, exponential discounting would make these future benefits much less significant. Jason Gaverick Matheny has argued that such discounting is inappropriate when assessing the value of existential risk reduction.[11]
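The discounting point can be made concrete: under continuous exponential discounting, even a tiny positive annual rate values benefits a million years out at essentially zero, which is why Matheny argues pure time discounting is inappropriate here. A sketch:

```python
import math

def discounted_value(value: float, annual_rate: float, years: float) -> float:
    """Present value under continuous exponential discounting."""
    return value * math.exp(-annual_rate * years)

# A 0.1% annual rate barely touches century-scale benefits...
print(discounted_value(100, 0.001, 100))                 # ~90.48
# ...but annihilates million-year benefits entirely:
print(discounted_value(1e16, 0.001, 1_000_000))          # 0.0 (underflow)
```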

Some economists have discussed the importance of global catastrophic risks, though not existential risks. Martin Weitzman argues that most of the expected economic damage from climate change may come from the small chance that warming greatly exceeds the mid-range expectations, resulting in catastrophic damage.[4] Richard Posner has argued that we are doing far too little, in general, about small, hard-to-estimate risks of large scale catastrophes.[21]

Scope insensitivity influences how bad people consider the extinction of the human race to be. For example, when people are motivated to donate money to altruistic causes, the quantity they’re willing to give does not increase linearly with the magnitude of the issue: people are as concerned about 200,000 birds getting stuck in oil as they are about 2,000.[22] Similarly, people are often more concerned about threats to individuals than to larger groups.[23]

Potential sources of risk

Existential risks, and other risks to civilization, may come from natural or man-made sources. It has been argued that many existential risks are currently unknown.[24]

Anthropogenic


Some potential existential risks are consequences of man-made technologies.

In 2012, Cambridge University created The Cambridge Project for Existential Risk, which examines threats to humankind posed by developing technologies.[25] Its stated aim is to establish within the university a multidisciplinary research centre, the Centre for the Study of Existential Risk, dedicated to the scientific study and mitigation of existential risks of this kind.[25]

The Cambridge Project states that the "greatest threats" to the human species are man-made: artificial intelligence, global warming, nuclear war, and rogue biotechnology.[26]

Artificial intelligence

It has been suggested that learning computers which rapidly become superintelligent may take unforeseen actions, or that robots would out-compete humanity.[27][28]

Further reading

External links

  • Last Days On Earth (TV documentary): ABC News two-hour Special Edition of 20/20 on seven real end-of-the-world scenarios (August 30, 2006)
  • "What a way to go", The Guardian: ten scientists name the biggest dangers to Earth and assess the chances of each happening (April 14, 2005)
  • "Confronting the New Misanthropy", by Frank Furedi, Spiked, April 18, 2006
  • (video) Stephen Petranek: 10 ways the world could end
  • Armageddon Online, a collection of doomsday scenarios and daily news
  • Doomsday Guide, a directory devoted to end-times theories
  • Top 10 Ways to Destroy Earth
  • Several potential world-ending scenarios
  • "Countdown to Doomsday", with Today Show host Matt Lauer
  • [1], a website about existential risk by Nick Bostrom
  • "Cognitive biases potentially affecting judgment of global risks", a paper by Eliezer Yudkowsky discussing how various observed cognitive biases hamper our judgement of existential risk
  • "Why the future doesn't need us", April 2000, Bill Joy's influential call to relinquish dangerous technologies
  • "Being present in the face of existential threat: The role of trait mindfulness in reducing defensive responses to mortality salience"

Notes

  1. ^ Schulte, P. et al. (5 March 2010). "The Chicxulub Asteroid Impact and Mass Extinction at the Cretaceous-Paleogene Boundary".  
  2. ^  
  3. ^ a b  
  4. ^ a b Weitzman, Martin (2009). "On modeling and interpreting the economics of catastrophic climate change". The Review of Economics and Statistics 91 (1): 1–19.  
  5. ^ "Frequently Asked Questions". Existential Risk. Future of Humanity Institute. Retrieved 26 July 2013. 
  6. ^ a b Bostrom, Nick. "Existential Risk Prevention as a Global Priority". Existential Risk. Future of Humanity Institute. Retrieved 23 July 2013. 
  7. ^ a b c Observation Selection Effects and Global Catastrophic Risks, Milan Cirkovic, 2008
  8. ^ a b Bostrom, N (2013). "Existential Risk Prevention as Global Priority". Global Policy 4.  
  9. ^ a b Bostrom, Nick. "Astronomical Waste: The opportunity cost of delayed technological development". Utilitas 15 (3): 308–314.  
  10. ^ Posner, Richard A. (2006). Catastrophe : risk and response. Oxford: Oxford University Press.  , Introduction, "What is Catastrophe?"
  11. ^ a b Matheny, Jason Gaverick (2007). "Reducing the Risk of Human Extinction". Risk Analysis 27 (5). 
  12. ^ Asher, D.J., Bailey, M.E., Emel’yanenko, V., and Napier, W.M. (2005). Earth in the cosmic shooting gallery. *The Observatory*, 125, 319-322.
  13. ^ Ambrose 1998; Rampino & Ambrose 2000, pp. 71, 80.
  14. ^ Rampino, M.R. and Ambrose, S.H. (2002). Super eruptions as a threat to civilizations on Earth-like planets. *Icarus*, 156, 562-569
  15. ^ Global Catastrophic Risks Survey, Technical Report, 2008, Future of Humanity Institute
  16. ^ Jones, E. M. (March 1, 1985). ""Where is everybody?" An account of Fermi's question".  
  17. ^ Ventrudo, Brian (5 June 2009). "So Where Is ET, Anyway?".  
  18. ^ Vinn, O (2014). "Potential incompatibility of inherited behavior patterns with civilization". PublishResearch: 1–3. Retrieved 2014-03-05. 
  19. ^ Parfit, Derek (1984). Reasons and Persons. Oxford University Press. pp. 453–454. 
  20. ^
  21. ^ Posner, Richard (2004). Catastrophe: risk and response. Oxford University Press. 
  22. ^ Desvousges, W.H., Johnson, F.R., Dunford, R.W., Boyle, K.J., Hudson, S.P., and Wilson, N. 1993, Measuring natural resource damages with contingent valuation: tests of validity and reliability. In Hausman, J.A. (ed), *Contingent Valuation:A Critical Assessment,* pp. 91−159 (Amsterdam: North Holland).
  23. ^ Eliezer Yudkowsky, 2008, Cognitive Biases potentially affecting judgments of global risks
  24. ^ Karnofsky, Holden. "Possible Global Catastrophic Risks". GiveWell Blog. GiveWell. Retrieved 24 July 2013. 
  25. ^ a b "The Cambridge Project for Existential Risk". Cambridge University. 
  26. ^ "'Terminator center' to open at Cambridge University". Fox News. 2012-11-26. 
  27. ^ Bill Joy, Why the future doesn't need us. In:Wired magazine. See also technological singularity.Nick Bostrom 2002 Ethical Issues in Advanced Artificial Intelligence
  28. ^ a b Nick Bostrom 2002 Ethical Issues in Advanced Artificial Intelligence
  29. ^ a b Scientists Worry Machines May Outsmart Man By JOHN MARKOFF, NY Times, July 26, 2009.
  30. ^ The Coming Technological Singularity: How to Survive in the Post-Human Era, by Vernor Vinge, Department of Mathematical Sciences, San Diego State University, (c) 1993 by Vernor Vinge.
  31. ^ Gaming the Robot Revolution: A military technology expert weighs in on Terminator: Salvation., By P. W. Singer, Thursday, May 21, 2009.
  32. ^ Robot takeover,
  33. ^ robot page,
  34. ^ a b Yudkowsky, Eliezer. "Artificial Intelligence as a Positive and Negative Factor in Global Risk". Retrieved 26 July 2013. 
  35. ^ Call for debate on killer robots, By Jason Palmer, Science and technology reporter, BBC News, 8/3/09.
  36. ^ Robot Three-Way Portends Autonomous Future, By David Axe, August 13, 2009.
  37. ^ New Navy-funded Report Warns of War Robots Going "Terminator", by Jason Mick (Blog), February 17, 2009.
  38. ^ Navy report warns of robot uprising, suggests a strong moral compass, by Joseph L. Flatley, Feb 18th 2009.
  39. ^ New role for robot warriors; Drones are just part of a bid to automate combat. Can virtual ethics make machines decisionmakers?, by Gregory M. Lamb / Staff writer, Christian Science Monitor, February 17, 2010.
  40. ^ "The Rise of Artificial Intelligence".  
  41. ^ a b c d e f g h i j k l m n Chris Phoenix; Mike Treder (2008). "Chapter 21: Nanotechnology as global catastrophic risk". In Bostrom, Nick; Cirkovic, Milan M. Global catastrophic risks. Oxford: Oxford University Press.  
  42. ^ a b "Frequently Asked Questions - Molecular Manufacturing". Retrieved 19 July 2014. 
  43. ^ Drexler, Eric. "A Dialog on Dangers". Retrieved 19 July 2014. 
  44. ^ a b Sandberg, Anders. "The five biggest threats to human existence". Retrieved 13 July 2014. 
  45. ^ Drexler, Eric. "ENGINES OF DESTRUCTION (Chapter 11)". Retrieved 19 July 2014. 
  46. ^ Tomasik, Brian. "Possible Ways to Promote Compromise". Retrieved 19 July 2014. 
  47. ^ "Dangers of Molecular Manufacturing". Retrieved 19 July 2014. 
  48. ^ a b "The Need for International Control". Retrieved 19 July 2014. 
  49. ^ Tomasik, Brian. "International Cooperation vs. AI Arms Race". Retrieved 19 July 2014. 
  50. ^ "Technical Restrictions May Make Nanotechnology Safer". Retrieved 19 July 2014. 
  51. ^ Tomasik, Brian. "Possible Ways to Promote Compromise". Retrieved 22 July 2014. 
  52. ^ Joseph, Lawrence E. (2007). Apocalypse 2012. New York: Broadway. p. 6.  
  53. ^ Rincon, Paul (2004-06-09). "Nanotech guru turns back on 'goo'". BBC News. Retrieved 2012-03-30. 
  54. ^ Hapgood, Fred (November 1986). "Nanotechnology: Molecular Machines that Mimic Life". Omni. Retrieved 19 July 2014. 
  55. ^ "Leading nanotech experts put 'grey goo' in perspective". Retrieved 19 July 2014. 
  56. ^ a b c d e f g h i j Ali Noun; Christopher F. Chyba (2008). "Chapter 20: Biotechnology and biosecurity". In Bostrom, Nick; Cirkovic, Milan M. Global Catastrophic Risks. Oxford University Press. 
  57. ^ a b c Frank SA (March 1996). "Models of parasite virulence". Q Rev Biol 71 (1): 37–78.  
  58. ^ Jackson, Ronald J.; Ramsay, Alistair J.; Christensen, Carina D.; Beaton, Sandra; Hall, Diana F.; Ramshaw, Ian A. (2001). "Expression of Mouse Interleukin-4 by a Recombinant Ectromelia Virus Suppresses Cytolytic Lymphocyte Responses and Overcomes Genetic Resistance to Mousepox". Journal of virology 75 (3): 1205–1210.  
  59. ^
  60. ^ Nuclear Weapons and the Future of Humanity: The Fundamental Questions by Avner Cohen, Steven Lee - page 237
  61. ^ "List of states with nuclear weapons".
  62. ^ "List of states with nuclear weapons", statistics and force configuration.
  63. ^ a b Martin, Brian (1982). "Critique of nuclear extinction". Journal of Peace Research 19 (4): 287–300.  
  64. ^ Shulman, Carl (5 Nov 2012). "Nuclear winter and human extinction: Q&A with Luke Oman". Overcoming Bias. Retrieved 25 October 2014. 
  65. ^
  66. ^ Bostrom 2002, section 4.2.
  67. ^ Isaac M. Held, Brian J. Soden, Water Vapor Feedback and Global Warming, In: Annu. Rev. Energy Environ 2000. available online. Page 449.
  68. ^ World Lines: Pathways, Pivots, and the Global Future. Paul Raskin. 2006. Boston:Tellus Institute
  69. ^ Dawn of the Cosmopolitan: The Hope of a Global Citizens Movement Orion Kriegman. 2006. Boston:Tellus Institute
  70. ^ Chiarelli, B. (1998). "Overpopulation and the Threat of Ecological Disaster: the Need for Global Bioethics". Mankind Quarterly 39 (2): 225–230. 
  71. ^ Evans-Pritchard, Ambrose (6 February 2011). "Einstein was right - honey bee collapse threatens global food security".  
  72. ^ Lovgren, Stefan. "Mystery Bee Disappearances Sweeping U.S." National Geographic News. URL accessed March 10, 2007.
  73. ^ "The end of India's green revolution?". BBC News. 2006-05-29. Retrieved 2012-01-31. 
  74. ^ Posted April 8th, 2000 by admin (2000-04-08). "Food First/Institute for Food and Development Policy". Retrieved 2012-01-31. 
  75. ^ "How peak oil could lead to starvation". 2009-05-27. Retrieved 2012-01-31. 
  76. ^ "Eating Fossil Fuels". 2003-10-02. Retrieved 2012-01-31. 
  77. ^ The Oil Drum: Europe. "Agriculture Meets Peak Oil". Retrieved 2012-01-31. 
  78. ^
  79. ^ "Cereal Disease Laboratory : Ug99 an emerging virulent stem rust race". Retrieved 2012-01-31. 
  80. ^ "Durable Rust Resistance in Wheat". Retrieved 2012-01-31. 
  81. ^ Bostrom 2002, section 4.8
  82. ^ Richard Hamming. "Mathematics on a Distant Planet". 
  83. ^ "Report LA-602, Ignition of the Atmosphere With Nuclear Bombs" (PDF). Retrieved 2011-10-19. 
  84. ^ New Scientist, 28 August 1999: "A Black Hole Ate My Planet"
  85. ^  
  86. ^
  87. ^ "Safety at the LHC". 
  88. ^ J. Blaizot et al., "Study of Potentially Dangerous Events During Heavy-Ion Collisions at the LHC", CERN library record CERN Yellow Reports Server (PDF)
  89. ^ Eric Drexler, Engines of Creation, ISBN 0-385-19973-2, available online
  90. ^ Brown NF, Wickham ME, Coombes BK, Finlay BB (May 2006). "Crossing the Line: Selection and Evolution of Virulence Traits".  
  91. ^ Ebert D, Bull JJ (January 2003). "Challenging the trade-off model for the evolution of virulence: is virulence management feasible?". Trends Microbiol. 11 (1): 15–20.  
  92. ^ André JB, Hochberg ME (July 2005). "Virulence evolution in emerging infectious diseases". Evolution 59 (7): 1406–12.  
  93. ^ Gandon S (March 2004). "Evolution of multihost parasites". Evolution 58 (3): 455–69.  
  94. ^ "Near Apocalypse Causing Diseases, a Historical Look:". Retrieved 2012-05-05. 
  95. ^ Kate Ravilious (2005-04-14). "What a way to go". The Guardian. 
  96. ^ 2012 Admin (2008-02-04). "Toba Supervolcano". 2012 Final Fantasy. 
  97. ^ Science Reference. "Toba Catastrophe Theory". Science Daily. 
  98. ^ Breining, Greg (2007). Super Volcano: The Ticking Time Bomb Beneath Yellowstone National Park. Voyageur Press. p. 256.  
  99. ^ Breining, Greg (2007). "Distant Death". Super Volcano: The Ticking Time Bomb Beneath Yellowstone National Park. St. Paul, MN.: Voyageur Press. p. 256 pg.  
  100. ^ a b Breining, Greg (2007). "The Next Big Blast". Super Volcano: The Ticking Time Bomb Beneath Yellowstone National Park. St. Paul, MN.: Voyageur Press. p. 256 pg.  
  101. ^ US West Antarctic Ice Sheet initiative
  102. ^ Plotnick, Roy E. (1 January 1980). "Relationship between biological extinctions and geomagnetic reversals". Geology 8 (12): 578.  
  103. ^ Glassmeier, Karl-Heinz; Vogt, Joachim (29 May 2010). "Magnetic Polarity Transitions and Biospheric Effects". Space Science Reviews 155 (1-4): 387–410.  
  104. ^ U.S.Congress (19 March 2013 and 10 April 2013). "Threats From Space: a Review of U.S. Government Efforts to Track and mitigate Asteroids and Meteors (Part I and Part II) - Hearing Before the Committee on Science, Space, and Technology House of Representatives One Hundred Thirteenth Congress First Session".  
  105. ^ a b Bostrom 2002, section 4.10
  106. ^ García-Sánchez, Joan et al. (February 1999). "Stellar Encounters with the Oort Cloud Based on HIPPARCOS Data". The Astronomical Journal 117 (2): 1042–1055.  
  107. ^ Twenty ways the world could end suddenly, Discover Magazine
  108. ^ Urban Legends Reference Pages: Legal Affairs (E.T. Make Bail)
  109. ^ Bostrom 2002, section 7.2
  110. ^ Ken Croswell, Will Mercury Hit Earth Someday?, April 24, 2008, accessed April 26, 2008
  111. ^ Explosions in Space May Have Initiated Ancient Extinction on Earth, NASA.
  112. ^ Bostrom 2002, section 4.7
  113. ^ Wanjek, Christopher (2005-04-06). "Explosions in Space May Have Initiated Ancient Extinction on Earth".  
  114. ^ Melott, A.L. and Thomas, B.C. (2011). "Astrophysical Ionizing Radiation and the Earth: A Brief Review and Census of Intermittent Intense Sources". Astrobiology 11: 343–361.  
  115. ^ Fraser Cain (2003-08-04). "Local Galactic Dust is on the Rise". Universe Today. 
  116. ^ "Apocalypse 2012 - Tall tales that the End of Days is coming in 2012." by Brian Dunning
  117. ^ "Mankind must abandon earth or face extinction: Hawking", August 9, 2010, retrieved 2012-01-23 
  118. ^ Lewis Smith (2008-02-27). "Doomsday vault for world’s seeds is opened under Arctic mountain". London: The Times Online. 
  119. ^ "About the Lifeboat Foundation". The Lifeboat Foundation. Retrieved 26 April 2013. 
  120. ^ "The Future of Life Institute". Retrieved May 5, 2014. 


See also

  • GCR Institute - A think tank for all things catastrophic risk
  • Millennium Alliance for Humanity & The Biosphere - Growing a global community to advocate for sustainable practices
  • X Center - Researching how to reason about unexpected events, both small and large scale
  • WHO Global Alert and Response - Monitoring for epidemics
  • Connecting Organizations for Regional Disease Surveillance - NTI is focused on reducing the risk from Weapons of Mass Destruction, and containment of damage after the fact
  • USAID Emerging Pandemic Threats Program - A US government program seeking to prevent and contain naturally generated pandemics at their source
  • Lawrence Livermore National Laboratory – Global Security - Researching GCR for the US defense department
  • Center for International Security and Cooperation - Focusing on political cooperation to reduce CGR
  • World Institute for Nuclear Security - Focused on security and safety training for those who are involved in the nuclear industry

Global catastrophic risk reduction organizations

Existential risk reduction organizations

Some precautions that people are already taking for a cataclysmic event include:

[117] Solutions of this scope may require megascale engineering.

Precautions and prevention

The cataclysmic pole shift hypothesis was formulated in 1872. Revisited repeatedly in the second half of the 20th century, it proposes that the axis of the Earth with respect to the crust could change extremely rapidly, causing massive earthquakes, tsunamis, and damaging local climate changes. The hypothesis is contradicted by the mainstream scientific interpretation of geological data, which indicates that true polar wander does occur, but very slowly over millions of years. Sometimes this hypothesis is confused with the accepted theory of geomagnetic reversal in which the magnetic poles reverse, but which has no influence on the axial poles or the rotation of the solid earth.

The belief that the Mayan civilization's Long Count calendar ended abruptly on December 21, 2012 was a misconception due to the Mayan practice of using only five places in Long Count Calendar inscriptions. On some monuments the Mayan calculated dates far into the past and future but there is no end of the world date. There was a Piktun ending (a cycle of 13,144,000 day Bak'tuns) on December 21, 2012. A Piktun marks the end of a 1,872,000 day or approximately 5125 year period and is a significant event in the Mayan calendar. However, there is no historical or scientific evidence that the Mayans believed it would be a doomsday. Some believe it was just the beginning of another Piktun.[116]

Discredited scenarios

A solar superstorm, which is a drastic and unusual decrease or increase in the Sun's power output, could have severe consequences for life on earth. (See solar flare)

If the solar system were to pass through a dark nebula, a cloud of cosmic dust, severe global climate change would occur.[115]

A similar threat is a hypernova, produced when a hypergiant star explodes and then collapses, sending vast amounts of radiation sweeping across hundreds of lightyears. Hypernovas have never been observed; however, a hypernova may have been the cause of the Ordovician–Silurian extinction events. The nearest hypergiant is Eta Carinae, approximately 8,000 light-years distant.[113] The hazards from various astrophysical radiation sources were reviewed in 2011.[114]

Both threats are very unlikely in the foreseeable future.[112]

Another threat might come from gamma ray bursts.[111]

A number of astronomical threats have been identified. Massive objects, e.g., a star, large planet or black hole, could be catastrophic if a close encounter occurred in the solar system. In April 2008, it was announced that two simulations of long-term planetary movement, one at Paris Observatory and the other at University of California, Santa Cruz indicate a 1% chance that Mercury's orbit could be made unstable by Jupiter's gravitational pull sometime during the lifespan of the sun. Were this to happen, the simulations suggest a collision with Earth could be one of four possible outcomes (the others being Mercury colliding with the Sun, colliding with Venus, or being ejected from the solar system altogether). If Mercury were to collide with the Earth, all life on earth could be obliterated: an asteroid 15 km wide is believed to have caused the extinction of the non-avian dinosaurs, while Mercury is 5,000 km in diameter.[110]

Cosmic threats

Although evidence of alien life has never been documented, scientists such as Carl Sagan have postulated that the existence of extraterrestrial life is very likely. In 1969, the "Extra-Terrestrial Exposure Law" was added to the United States Code of Federal Regulations (Title 14, Section 1211) in response to the possibility of biological contamination resulting from the U.S. Apollo Space Program. It was removed in 1991.[108] Scientists consider such a scenario technically possible, but unlikely.[109]

Extraterrestrial life could invade Earth[107] either to exterminate and supplant human life, enslave it under a colonial system, steal the planet's resources, or destroy the planet altogether.

Extraterrestrial invasion

Asteroid impact

Several asteroids have collided with Earth in recent geological history. The Chicxulub asteroid, for example, is theorized to have caused the extinction of the non-avian dinosaurs 66 million years ago, at the end of the Cretaceous. If such an object struck Earth today, it could severely damage civilization, and a sufficiently large impact could even destroy humanity entirely; for that, the asteroid would need to be at least 1 km (0.62 mi) in diameter, and probably between 3 and 10 km (2–6 mi).[105] Asteroids 1 km in diameter have impacted the Earth on average once every 500,000 years; larger asteroids are less common.[105] Small near-Earth asteroids are regularly observed.

In 1.4 million years, the star Gliese 710 is expected to pass within 1.1 light-years of the Sun, perturbing the Oort cloud and increasing the number of meteoroids in the vicinity of Earth. Dynamic models by García-Sánchez predict a 5% increase in the rate of impacts.[106] Objects perturbed from the Oort cloud take millions of years to reach the inner Solar System.
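The quoted average rate of one 1 km impact per 500,000 years can be converted into a rough probability for a given time window, if one assumes impacts arrive as a Poisson process. That independence assumption, and the sketch below, are illustrative only and not drawn from the cited sources.

```python
import math

# Illustrative sketch: treat >=1 km asteroid impacts as a Poisson process
# with the article's quoted mean interval of one impact per 500,000 years.
MEAN_INTERVAL_YEARS = 500_000

def impact_probability(window_years: float) -> float:
    """Probability of at least one >=1 km impact within the window."""
    rate = 1.0 / MEAN_INTERVAL_YEARS          # expected impacts per year
    return 1.0 - math.exp(-rate * window_years)

print(f"Next 100 years:    {impact_probability(100):.5f}")
print(f"Next 10,000 years: {impact_probability(10_000):.3f}")
```

For short windows the probability is close to window/interval (about 0.02% per century), which is why such impacts are a long-horizon rather than a near-term concern.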

Geomagnetic reversal

The magnetic poles of the Earth have shifted many times in geologic history. The duration of such a reversal is still debated. Some theories hold that during a reversal the magnetic field around the Earth would be weakened or absent, threatening electrical civilization, or even several species, by allowing radiation from the Sun, especially solar flares and cosmic rays, to reach the surface. These theories have, however, been somewhat discredited, as statistical analysis shows no evidence of a correlation between past reversals and past extinctions.[102][103]

Megatsunami

Another possibility is a megatsunami. A megatsunami could, for example, destroy the entire East Coast of the United States. The coastal areas of the entire world could also be flooded by the collapse of the West Antarctic Ice Sheet.[101] While none of these scenarios would be likely to destroy humanity completely, they could regionally threaten civilization. There have been two recent high-fatality tsunamis, after the 2011 Tōhoku earthquake and the 2004 Indian Ocean earthquake, though neither was large enough to be considered a megatsunami. A megatsunami could also have an astronomical origin, such as an asteroid impact in an ocean.


Volcanism

A geological event such as massive flood basalt volcanism or the eruption of a supervolcano[95] could lead to a so-called volcanic winter, similar to a nuclear winter. One such event, the Toba eruption,[96] occurred in Indonesia about 71,500 years ago. According to the Toba catastrophe theory,[97] the event may have reduced human populations to only a few tens of thousands of individuals. Yellowstone Caldera is another such supervolcano, having undergone 142 or more caldera-forming eruptions in the past 17 million years.[98] A massive volcanic eruption would inject extraordinary quantities of volcanic dust, toxic gases, and greenhouse gases into the atmosphere, with serious effects on the global climate: extreme global cooling (volcanic winter in the short term, ice age in the long term) if reflective dust dominates, or global warming if greenhouse gases prevail.

When the supervolcano at Yellowstone last erupted 640,000 years ago, the magma and ash ejected from the caldera covered most of the United States west of the Mississippi River and part of northeastern Mexico.[99] Another such eruption could threaten civilization.

Such an eruption could also release large amounts of gases that could alter the balance of the planet's carbon dioxide and cause a runaway greenhouse effect, or throw enough pyroclastic debris and other material into the atmosphere to partially block out the Sun and cause a volcanic winter, as happened on a smaller scale in 1816 following the eruption of Mount Tambora, the so-called Year Without a Summer. Such an eruption might cause the immediate deaths of millions of people several hundred miles from the eruption, and perhaps billions worldwide,[100] due to failure of the monsoon resulting in major crop failures and starvation on a massive scale.[100]

A much more speculative concept is the Verneshot: a hypothetical volcanic eruption caused by the buildup of gas deep underneath a craton. Such an event may be forceful enough to launch an extreme amount of material from the crust and mantle into a sub-orbital trajectory.


Ice age

In the history of the Earth, twelve ice ages are known to have occurred. Further ice ages are possible at intervals of 40,000–100,000 years. An ice age would have a serious impact on civilization, because vast areas of land (mainly in North America, Europe, and Asia) could become uninhabitable. It would still be possible to live in the tropical regions, though with a possible loss of humidity and water. Currently, the world is in an interglacial period within a much older glacial event. The last glacial expansion ended about 10,000 years ago, and all civilizations evolved after that point. Scientists do not predict that a natural ice age will occur anytime soon.

Climate change

Climate change here refers to the Earth's natural variations in climate over time. The climate has changed slowly in the past, through ice ages as well as warmer periods during which palm trees grew in Antarctica. It has been hypothesized that there was also a "snowball Earth" period, when all the oceans were covered in a layer of ice. These climatic changes occurred slowly, before the rise of human civilization about ten thousand years ago, near the end of the last major ice age, when the climate became more stable. Since civilization originated during a period of stable climate, a natural shift into a new climate regime (colder or hotter) could pose a threat to it.

Global pandemic

The death toll of a pandemic is the virulence (deadliness) of the pathogen or pathogens, multiplied by the number of people eventually infected. It has been hypothesised that there is an upper limit to the virulence of naturally evolved pathogens.[57] This is because a pathogen that quickly kills its hosts might not have enough time to spread to new ones, while one that kills its hosts more slowly, or not at all, allows carriers more time to spread the infection and is thus likely to out-compete a more lethal species or strain.[90] This simple model predicts that if virulence and transmission are not linked, pathogens will evolve towards low virulence and rapid transmission. However, this assumption is not always valid, and in more complex models, where the level of virulence and the rate of transmission are related, high levels of virulence can evolve.[91] The level of virulence that is possible is instead limited by the existence of complex host populations with different susceptibilities to infection, or by some hosts being geographically isolated.[57] The size of the host population and competition between different strains of pathogens can also alter virulence.[92] Notably, a pathogen that infects humans only as a secondary host and usually infects another species (a zoonosis) may face little constraint on its virulence in people, since infection of humans is then an accidental event and the pathogen's evolution is driven by events in the other species.[93] There are numerous historical examples of pandemics[94] that have had a devastating effect on large numbers of people, which makes a global pandemic a realistic threat to human civilization.
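The trade-off argument above can be sketched with a standard SIR-style expression for the basic reproduction number R0, in which virulence (the host death rate) shortens the infectious period. The parameter values here are illustrative assumptions, not epidemiological data.

```python
# Toy model of the virulence trade-off: if the transmission rate is
# independent of virulence, a pathogen that kills its host quickly has
# less time to spread, so it achieves fewer secondary infections (R0).

def basic_reproduction_number(transmission_rate: float,
                              recovery_rate: float,
                              virulence: float) -> float:
    """R0 = transmissions per unit time, divided by the total rate of
    leaving the infectious state (recovery plus host death)."""
    return transmission_rate / (recovery_rate + virulence)

beta, gamma = 0.5, 0.1   # infections per day per case; recovery rate per day
for virulence in (0.0, 0.1, 0.5):
    r0 = basic_reproduction_number(beta, gamma, virulence)
    print(f"virulence={virulence:.1f}  R0={r0:.2f}")
# Lower virulence yields higher R0, so milder strains out-compete lethal
# ones -- unless virulence and transmission are linked, as the text notes.
```

When transmission rises with virulence (e.g. beta proportional to virulence), the same formula can instead be maximized at an intermediate or high virulence, which is the "more complex models" case mentioned above.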


Experimental technology accident

Nick Bostrom suggested that in the pursuit of knowledge, humanity might inadvertently create a device that could destroy Earth and the solar system.[81] Investigations in nuclear and high-energy physics could create unusual conditions with catastrophic consequences. For example, scientists worried that the first nuclear test might ignite the atmosphere.[82][83] More recently, others worried that the RHIC[84] or the Large Hadron Collider might start a chain-reaction global disaster involving black holes or false-vacuum states. These particular concerns have been refuted,[85][86][87][88] but the general concern remains.

Biotechnology could lead to the creation of a pandemic; chemical warfare could be taken to an extreme; and nanotechnology could lead to grey goo, in which out-of-control self-replicating robots consume all living matter on Earth while building more of themselves. In each case the catastrophe could occur either deliberately or by accident.[89]

World population and agricultural crisis

The 20th century saw a rapid increase in human population due to medical advances and a massive increase in agricultural productivity[73] brought about by the Green Revolution.[74] Between 1950 and 1984, as the Green Revolution transformed agriculture around the globe, world grain production increased by 250%. The Green Revolution helped food production keep pace with worldwide population growth, and arguably enabled that growth. The energy for the Green Revolution was provided by fossil fuels, in the form of fertilizers (natural gas), pesticides (oil), and hydrocarbon-fueled irrigation.[75] David Pimentel, professor of ecology and agriculture at Cornell University, and Mario Giampietro, senior researcher at the National Research Institute on Food and Nutrition (INRAN), place the maximum U.S. population for a sustainable economy at 200 million in their study Food, Land, Population and the U.S. Economy. To achieve a sustainable economy and avert disaster, the study argues, the United States must reduce its population by at least one-third, and world population will have to be reduced by two-thirds.[76]

The authors of this study believe the agricultural crisis in question will only begin to affect us after 2020, and will not become critical until 2050. Geologist Dale Allen Pfeiffer claims that coming decades could see spiraling food prices without relief, and massive starvation on a global level such as has never been experienced before.[77][78]

Wheat is humanity's third most-produced cereal. Extant fungal infections such as Ug99[79] (a kind of stem rust) can cause 100% crop losses in most modern varieties. Little or no treatment is possible, and the infection spreads on the wind. Should the world's large grain-producing areas become infected, the resulting crisis in wheat availability would lead to price spikes and shortages in other food products.[80]

Ecological disaster

An ecological disaster, such as world crop failure or a collapse of ecosystem services, could be induced by present trends of overpopulation, economic development,[70] and non-sustainable agriculture. Most such scenarios involve one or more of the following: the Holocene extinction event, water scarcity that could leave roughly half of the Earth's population without safe drinking water, pollinator decline, overfishing, massive deforestation, desertification, climate change, or massive episodes of water pollution. A recent threat in this direction is colony collapse disorder,[71] a phenomenon that might foreshadow the imminent extinction[72] of the Western honeybee. As the bee plays a vital role in pollination, its extinction would severely disrupt the food chain.

Global warming

Global warming refers to the warming caused by human technology since at least the 19th century. It manifests as abnormal variations from the expected climate within the Earth's atmosphere, with subsequent effects on other parts of the Earth. Projections of future climate change suggest further warming, sea-level rise, and an increase in the frequency and severity of some extreme weather events and weather-related disasters. Effects of global warming include loss of biodiversity, stresses on existing food-producing systems, and increased spread of infectious diseases such as malaria.

It has been suggested that runaway global warming (runaway climate change) might cause Earth to become searingly hot, like Venus. In less extreme scenarios, it could cause the end of civilization as we know it.[67]

Using scenario analysis, the Global Scenario Group (GSG), a coalition of international scientists convened by Paul Raskin, developed a series of possible futures for the world as it enters a Planetary Phase of Civilization. One scenario involves the complete breakdown of civilization as the effects of global warming become more pronounced, competition for scarce resources increases, and the rift between the poor and the wealthy widens. The GSG's other scenarios, such as Policy Reform, Eco-Communalism, and Great Transition, avoid this societal collapse and eventually result in environmental and social sustainability. The group claims the outcome depends on human choice[68] and on the possible formation of a global citizens movement that could influence the trajectory of global development.[69]

Warfare and mass destruction

The scenarios that have been explored most frequently are nuclear warfare and doomsday devices. Although the probability of a nuclear war in any given year is slim, Professor Martin Hellman has described it as inevitable in the long run: unless the probability approaches zero, there will inevitably come a day when civilization's luck runs out.[59] During the Cuban Missile Crisis, President Kennedy estimated the odds of nuclear war as "somewhere between one out of three and even".[60] The United States and Russia have a combined arsenal of 15,315 nuclear weapons,[61] and there are an estimated 16,400 nuclear weapons in existence worldwide.[62]

While popular perception sometimes takes nuclear war to be "the end of the world", experts assign a low probability to human extinction from nuclear war.[63][64] In 1982, Brian Martin estimated that a US–Soviet exchange might kill 400–450 million people directly, and perhaps several hundred million more through follow-up consequences.[63]

Nuclear war could yield unprecedented human death tolls and habitat destruction. Detonating such a large number of nuclear weapons would have long-term effects on the climate, causing cold weather and reduced sunlight[65] that could generate significant upheaval in advanced civilizations.[66]
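Hellman's "inevitable in the long run" point can be made concrete: for any fixed annual probability p > 0, the chance of at least one war within n years is 1 − (1 − p)^n, which approaches certainty as n grows. The 1% annual figure below is an illustrative assumption for the sketch, not an estimate from the cited sources.

```python
# Cumulative probability of at least one nuclear war over a horizon of
# `years`, assuming a constant annual probability and independent years.
# The annual probability used here is illustrative only.

def cumulative_probability(annual_p: float, years: int) -> float:
    """Chance of at least one event in `years` independent trials."""
    return 1.0 - (1.0 - annual_p) ** years

for horizon in (10, 50, 100, 500):
    print(f"{horizon:>3} years: {cumulative_probability(0.01, horizon):.0%}")
```

Even a 1%-per-year risk, negligible in any single year, accumulates to better-than-even odds within a century, which is the structure of Hellman's argument.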

Biotechnology

Biotechnology can pose a global catastrophic risk in the form of natural pathogens or novel, engineered ones. Such a catastrophe may be brought about by use in warfare, by terrorist attacks, or by accident.[56] Terrorist applications of biotechnology have historically been infrequent;[56] to what extent this is due to a lack of capabilities or of motivation is unresolved.[56]

Exponential growth has been observed in the biotechnology sector, and Noun and Chyba predict that this will lead to major increases in biotechnological capabilities in the coming decades.[56] They argue that risks from biological warfare and bioterrorism are distinct from nuclear and chemical threats because biological pathogens are easier to mass-produce and their production is hard to control, especially as the technological capabilities become available even to individual users.[56]

Given current developments, more risk from novel, engineered pathogens is to be expected in the future.[56] It has been hypothesized that there is an upper bound on the virulence (deadliness) of naturally occurring pathogens,[57] but pathogens may be intentionally or unintentionally genetically modified to change their virulence and other characteristics.[56] A group of Australian researchers, for example, unintentionally changed the characteristics of the mousepox virus while trying to develop a virus to sterilize rodents:[56] the modified virus became highly lethal even in vaccinated and naturally resistant mice.[44][58] The technological means to genetically modify virus characteristics are likely to become more widely available in the future if not properly regulated.[56]

Noun and Chyba propose three categories of measures to reduce risks from biotechnology and natural pandemics: regulation or prevention of potentially dangerous research; improved recognition of outbreaks; and the development of facilities to mitigate disease outbreaks (e.g. better and/or more widely distributed vaccines).[56]


Nanotechnology

Many nanoscale technologies are in development or currently in use.[41] The only one that appears to pose a significant global catastrophic risk is molecular manufacturing, a technique that would make it possible to build complex structures at atomic precision.[42] Molecular manufacturing requires significant advances in nanotechnology, but once achieved it could produce highly advanced products at low cost and in large quantities, in nanofactories weighing a kilogram or more.[41][42] Once nanofactories gain the ability to produce other nanofactories, production may be limited only by relatively abundant factors such as input materials, energy, and software.[41]

Molecular manufacturing could be used to cheaply produce, among many other products, highly advanced, durable weapons.[41] Equipped with compact computers and motors, these could be increasingly autonomous and have a large range of capabilities.[41]

Phoenix and Treder classify the catastrophic risks posed by nanotechnology into three categories: (1) augmenting the development of other technologies, such as AI and biotechnology; (2) enabling the mass production of potentially dangerous products that cause risk dynamics (such as arms races) depending on how they are used; and (3) uncontrolled self-perpetuating processes with destructive effects. At the same time, nanotechnology may be used to alleviate several other global catastrophic risks.[41]

Several researchers state that the bulk of the risk from nanotechnology comes from its potential to lead to war, arms races, and destructive global government.[41][43][44] Several reasons have been suggested why the availability of nanotech weaponry may, with significant likelihood, lead to unstable arms races (compared with, e.g., nuclear arms races): (1) a large number of players may be tempted to enter the race, since the threshold for doing so is low;[41] (2) the ability to make weapons with molecular manufacturing will be cheap and easy to hide;[41] (3) lack of insight into other parties' capabilities may therefore tempt players to arm out of caution or to launch preemptive strikes;[41][45] (4) molecular manufacturing may reduce dependency on international trade,[41] a potential peace-promoting factor;[46] and (5) wars of aggression may pose a smaller economic threat to the aggressor, since manufacturing is cheap and humans may not be needed on the battlefield.[41]

Since self-regulation by all state and non-state actors seems hard to achieve,[47] measures to mitigate war-related risks have mainly been proposed in the area of international cooperation.[41][48] International infrastructure could be expanded, giving more sovereignty to the international level; this could help coordinate efforts at arms control.[49] International institutions dedicated specifically to nanotechnology (perhaps analogous to the International Atomic Energy Agency, IAEA) or to general arms control may also be designed.[48] Parties may also jointly make differential technological progress on defensive technologies, a policy they should usually favour.[41] The Center for Responsible Nanotechnology also suggests some technical restrictions.[50] Improved transparency regarding technological capabilities may be another important facilitator of arms control.[51]

Grey goo is another catastrophic scenario, proposed by Eric Drexler in his 1986 book Engines of Creation[52] and a recurring theme in mainstream media and fiction.[53][54] This scenario involves tiny self-replicating robots that consume the entire biosphere, using it as a source of energy and building blocks. Nanotech experts, including Drexler, now discredit the scenario. According to Chris Phoenix, "so-called grey goo could only be the product of a deliberate and difficult engineering process, not an accident".[55]


Artificial intelligence

A superintelligent machine could eliminate, wiping them out if it chose, any challenging rival intellects; alternatively it might manipulate or persuade them to change their behavior towards its own interests, or merely obstruct their attempts at interference.[28]

Vernor Vinge has suggested that a moment may come when computers and robots are smarter than humans. He calls this "the Singularity"[29] and suggests that it may be somewhat, or possibly very, dangerous for humans.[30] This is discussed by the philosophy called Singularitarianism.

In 2009, experts attended a conference hosted by the Association for the Advancement of Artificial Intelligence (AAAI) to discuss whether computers and robots might be able to acquire any sort of autonomy, and how much these abilities might pose a threat or hazard. They noted that some robots have acquired various forms of semi-autonomy, including being able to find power sources on their own and being able to independently choose targets to attack with weapons. They also noted that some computer viruses can evade elimination and have achieved "cockroach intelligence". They concluded that self-awareness as depicted in science fiction is probably unlikely, but that there were other potential hazards and pitfalls.[29] Various media sources and scientific groups have noted separate trends in differing areas which might together result in greater robotic functionality and autonomy, and which pose some inherent concerns.[31][32][33]

Eliezer Yudkowsky believes that risks from artificial intelligence are harder to predict than any other known risks. He also argues that research into artificial intelligence is biased by anthropomorphism: since people base their judgments of artificial intelligence on their own experience, he claims that they underestimate its potential power. He distinguishes between risks due to technical failure of AI, where flawed algorithms prevent the AI from carrying out its intended goals, and philosophical failure, where the AI is programmed to realize a flawed ideology.[34]

Some experts and academics have questioned the use of robots for military combat, especially when such robots are given some degree of autonomous function.[35] There are also concerns about technology that might allow some armed robots to be controlled mainly by other robots.[36] The US Navy has funded a report which indicates that, as military robots become more complex, greater attention should be paid to the implications of their ability to make autonomous decisions.[37][38] One researcher states that autonomous robots might be more humane, as they could make decisions more effectively; other experts question this.[39]

On the other hand, a "friendly" AI could help reduce existential risk by developing technological solutions to threats.[34]

In PBS's Off Book, Gary Marcus asks "what happens if [AIs] decide we are not useful anymore?" Marcus argues that AI cannot, and should not, be banned, and that "the sensible thing to do" is to "start thinking now" about AI ethics.[40]

This article was sourced from Creative Commons Attribution-ShareAlike License; additional terms may apply.