Human extinction
In futures studies, human extinction is the hypothetical end of the human species, whether from natural causes or as a result of human action. The likelihood of human extinction by wholly natural scenarios, such as a meteorite impact or large-scale volcanism, is generally considered extremely low.[1] For anthropogenic extinction, many possible scenarios have been proposed: global nuclear annihilation, biological warfare or the release of a pandemic-causing agent, overpopulation,[2] ecological collapse, and climate change; in addition, emerging technologies could bring about new extinction scenarios, such as advanced artificial intelligence, biotechnology, or self-replicating nanobots. The probability of anthropogenic human extinction within the next hundred years is the subject of active debate. Human extinction should be distinguished from the extinction of all life on Earth (see also future of Earth) and from the extinction of major components of human culture (for example, through a global catastrophe that leaves only small, scattered human populations, which might evolve in isolation).

Moral importance of existential risk

"Existential risks" are risks that threaten the entire future of humanity, whether by causing human extinction or by otherwise permanently crippling human progress.[1] Many scholars argue, on the basis of the size of the "cosmic endowment", that the number of potential future lives at stake is so inconceivably large that even small reductions of existential risk have great value. Some of the arguments run as follows:
Parfit argues that the size of the "cosmic endowment" can be calculated as follows: if Earth remains habitable for a billion more years and can sustainably support a population of more than a billion humans, then there is a potential for 10^16 (10,000,000,000,000,000) human lives of normal duration.[4] (pp. 453–454) Bostrom goes further, stating that if the universe is empty, the accessible universe could support at least 10^34 biological human life-years and, if some humans were uploaded onto computers, the equivalent of as many as 10^54 cybernetic human life-years.[1]
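Parfit's figure is simple arithmetic. Taking a normal lifespan to be on the order of 100 years (an illustrative assumption made here; Parfit's text speaks only of "lives of normal duration"), the potential number of future lives is

\[
\frac{10^{9}\ \text{years} \times 10^{9}\ \text{concurrent lives}}{10^{2}\ \text{years per life}} = 10^{16}\ \text{lives}.
\]

On such figures, even a tiny reduction in extinction risk carries enormous expected value: lowering the probability of extinction by one in a million would, in expectation, preserve \(10^{16} \times 10^{-6} = 10^{10}\) future lives, and vastly more under Bostrom's estimates of \(10^{34}\) to \(10^{54}\) life-years.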
Possible scenarios

See also: Global catastrophic risk

Severe forms of known or recorded disasters

Habitat threats
Scientific accidents

Without regulation, scientific advancement has the potential to risk human extinction through the effects or use of entirely new technologies; several such scenarios have been proposed.
Further scenarios of extraterrestrial origin

(Major impact events[28] and gamma-ray bursts in our part of the Milky Way[29][30] were already mentioned above.)
Evolution of a posthuman species

Main articles: Posthumanism and Transhumanism

Normal biological evolution of humanity will continue and change humans over geological time scales. Although this could, in a non-phylogenetic taxonomy, be considered to give rise to a new species, such ongoing evolution would not biologically be considered a species extinction. Given the likelihood of continuing significant genetic exchange between human populations, it is highly unlikely that humans will split into multiple species through natural evolution. Some scenarios envision that humans could use genetic engineering or technological modification to split into normal humans and a new species of posthumans.[31][32][33][34][35][36][37] Such a species could be fundamentally different from any previous life form on Earth, e.g. by merging humans with technological systems.[38][39] Such scenarios do harbor a risk of the extinction of the "old" human species by means of the new, posthuman entity.

Extinction through devolution

Some have argued that modern humans are so successful, and so shielded from selective pressures, that "survival of the fittest" no longer operates on the species.[40] This concern was widely debated in the 19th century under the labels of devolution and degeneration.

Perception of and reactions to human extinction risk

Probability estimates

Because human extinction is unprecedented, speculation about the probability of different scenarios is highly subjective. Nick Bostrom argues that it would be "misguided" to assume that the probability of near-term extinction is less than 25%, and that it will be "a tall order" for the human race to "get our precautions sufficiently right the first time", given that an existential risk provides no opportunity to learn from failure.[1][41] Somewhat more optimistically, philosopher John Leslie assigns a 70% chance of humanity surviving the next five centuries, based partly on the controversial philosophical doomsday argument that Leslie champions. The 2006 Stern Review for the UK Treasury assumes a 100-year probability of human extinction of 10% in its economic calculations.[41]

Some scholars believe that certain scenarios, such as global thermonuclear war, would have difficulty eradicating every last settlement on Earth. Physicist Willard Wells points out that any credible extinction scenario would have to reach a diverse set of areas, including the underground subways of major cities, the mountains of Tibet, the remotest islands of the South Pacific, and even McMurdo Station in Antarctica, which has contingency plans and supplies for a long isolation.[42] In addition, elaborate bunkers exist for government leaders to occupy during a nuclear war.[41] Any number of events could lead to a massive loss of human life, but if the last few, most resilient humans are unlikely to also die off (though see minimum viable population), then that particular human extinction scenario is not credible.[43]
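The headline estimates above use different time horizons. A back-of-the-envelope way to compare them is to assume a constant annual extinction hazard \(h\) (a modelling assumption of this restatement, not of the cited authors), so that the probability of surviving \(T\) years is \((1-h)^{T}\):

\[
\text{Stern Review: } (1-h)^{100} = 0.90 \;\Rightarrow\; h = 1 - 0.90^{1/100} \approx 0.11\%\ \text{per year};
\]
\[
\text{Leslie: } (1-h)^{500} = 0.70 \;\Rightarrow\; h = 1 - 0.70^{1/500} \approx 0.07\%\ \text{per year},
\]

the latter corresponding to roughly a 6.9% chance of extinction per century, in the same range as, though somewhat below, the Stern Review's 10%.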
Psychology

See also: Existentialism § Angst and dread

Eliezer Yudkowsky theorizes that scope neglect plays a role in public perception of existential risks:[1][44] "Substantially larger numbers, such as 500 million deaths, and especially qualitatively different scenarios such as the extinction of the entire human species, seem to trigger a different mode of thinking... People who would never dream of hurting a child hear of an existential risk, and say, 'Well, maybe the human species doesn't really deserve to survive'."

All past predictions of human extinction have proven false, which to some makes future warnings seem less credible. Nick Bostrom argues that the absence of human extinction in the past is only weak evidence against human extinction in the future, because of survivorship bias and other anthropic effects.[45]

Some behavioural finance scholars claim that recent evidence is given undue weight in risk analysis. Roughly speaking, "100-year storms" tend to occur every twenty years in the stock market as traders become convinced that the current good times will last forever. Doomsayers who hypothesize rare crisis scenarios are dismissed even when they have statistical evidence behind them. An extreme form of this bias can diminish the subjective probability of the unprecedented.[note 1]

Research and initiatives

Even though the importance and potential impact of research on existential risks is often highlighted, relatively few research efforts are made in this field. In 2001 Bostrom stated:[46] "There is more scholarly work on the life-habits of the dung fly than on existential risks [to humanity]."

Although existential risks are less manageable by individuals than, for example, health risks, according to Ken Olum, Joshua Knobe, and Alexander Vilenkin the possibility of human extinction does have practical implications: if the "universal" doomsday argument is accepted, it changes the most likely source of disasters, and hence the most efficient means of preventing them. They write: "... you should be more concerned that a large number of asteroids have not yet been detected than about the particular orbit of each one. You should not worry especially about the chance that some specific nearby star will become a supernova, but more about the chance that supernovas are more deadly to nearby life than we believe."[47]

Multiple organizations exist with the goal of helping to prevent human extinction, among them the Future of Humanity Institute, the Centre for the Study of Existential Risk, the Future of Life Institute, the Machine Intelligence Research Institute, and the Global Catastrophic Risk Institute (est. 2011).

Omnicide

Omnicide is human extinction as a result of human action. Most commonly it refers to extinction through nuclear or biological warfare,[48][49][50] but it can also apply to extinction through means such as a global anthropogenic ecological catastrophe.[51] Omnicide can be considered a subcategory of genocide.[52] Using the concept in this way, one can argue, for example, that "the arms race is genocidal in intent given the fact that the United States and the Soviet Union are knowingly preparing to destroy each other as viable national and political groups."[53] However, the Cold War arms race differed from genocide in that the weapons involved were not used, instead serving as a nuclear deterrent.
Proposed countermeasures

Further information: Global catastrophic risk § Precautions and prevention

Stephen Hawking advocated colonizing other planets within the Solar System once technology progresses sufficiently, to improve the chances of human survival in the event of planet-wide disasters such as global thermonuclear war.[54][55] More economically, some scholars propose establishing on Earth one or more self-sufficient, remote, permanently occupied settlements created specifically to survive a global disaster.[41][42] Economist Robin Hanson argues that a refuge permanently housing as few as 100 people would significantly improve the chances of human survival during a range of global catastrophes.[41][56]

In popular culture

Main article: Apocalyptic and post-apocalyptic fiction

Some 21st-century pop-science works, including The World Without Us by Alan Weisman, pose an artistic thought experiment: what would happen to the rest of the planet if humans suddenly disappeared?[57][58] A threat of human extinction, such as through a technological singularity (also called an intelligence explosion), drives the plot of innumerable science fiction stories; an influential early example is the 1951 film adaptation of When Worlds Collide.[59] Usually the extinction threat is narrowly avoided, but some exceptions exist, such as R.U.R. and Steven Spielberg's A.I.[60]
Notes

1. For research on this, see "Decisions From Experience and the Effect of Rare Events in Risky Choice", Psychological Science 15 (2004). The under-perception of rare events mentioned above is the opposite of the phenomenon originally described by Kahneman in prospect theory, in whose original experiments the likelihood of rare events was overestimated. Further analysis has shown that both forms of the bias occur. When judging from description, people tend to overestimate a stated probability, so this effect taken alone would suggest that reading the extinction scenarios described here should lead the reader to overestimate any probabilities given. The effect more relevant to everyday consideration of human extinction, however, arises in estimates from experience, and it runs in the opposite direction: judging from personal experience, people who have never heard of or lived through the extinction of their species would be expected to dramatically underestimate its likelihood. Sociobiologist E. O. Wilson argued: "The reason for this myopic fog, evolutionary biologists contend, is that it was actually advantageous during all but the last few millennia of the two million years of existence of the genus Homo... A premium was placed on close attention to the near future and early reproduction, and little else. Disasters of a magnitude that occur only once every few centuries were forgotten or transmuted into myth." ("Is Humanity Suicidal?", The New York Times Magazine, 30 May 1993.)

2. ReligiousTolerance.org says that Aum Supreme Truth is the only religion known to have planned Armageddon for non-believers. Their intention to unleash deadly viruses is covered in Our Final Hour, and by Aum watcher Akihiko Misawa. The Gaia Liberation Front advocates (but is not known to have active plans for) total human genocide; see GLF, A Modest Proposal. Leslie (1996) notes that Aum's collection of nuclear physicists presented a doomsday threat from nuclear destruction as well, especially as the cult included a rocket scientist.

3. Leslie (1996) discusses survivorship bias, which he calls an "observational selection" effect (p. 139): the a priori certainty of observing an "undisastrous past" could make it difficult to argue that we must be safe because nothing terrible has yet occurred. He quotes Holger Bech Nielsen's formulation: "We do not even know if there should exist some extremely dangerous decay of say the proton which caused eradication of the earth, because if it happens we would no longer be there to observe it and if it does not happen there is nothing to observe." (From "Random dynamics and relations between the number of fermion generations and the fine structure constants", Acta Physica Polonica B, May 1989.)

4. For the "West Germany" extrapolation see Leslie (1996), The End of the World, in the chapter "War, Pollution, and Disease" (p. 74). In the same section the author mentions the success, in lowering the birth rate, of programs such as the sterilization-for-rupees programs in India, and surveys other infertility or falling-birth-rate extinction scenarios. He says that voluntary small-family behaviour may be counter-evolutionary, but that the meme for small, rich families appears to be spreading rapidly throughout the world.
The world population is expected to start falling around 2150.

5. Former NASA consultant David Brin's lengthy rebuttal to SETI enthusiasts' optimism about alien intentions concludes: "The worst mistake of first contact, made throughout history by individuals on both sides of every new encounter, has been the unfortunate habit of making assumptions. It often proved fatal." (Full text at SETIleague.org.)

References

1. Bostrom, Nick. "Existential risk prevention as global priority". Global Policy 4.1 (2013): 15–31.
2. Firth, Niall. "Human race 'will be extinct within 100 years', claims leading scientist". Daily Mail, 18 June 2010. http://www.dailymail.co.uk/sciencetech/article-1287643/Human-race-extinct-100-years-population-explosion.html
3. Sagan, Carl. "Nuclear war and climatic catastrophe: Some policy implications". Foreign Affairs 62.2 (1983): 257–292.
4. Parfit, Derek. Reasons and Persons. Oxford: Clarendon Press, 1984.
5. Adams, Robert Merrihew. "Should Ethics be More Impersonal? A Critical Notice of Derek Parfit, Reasons and Persons". The Philosophical Review 98.4 (October 1989): 439–484.
6. Bostrom, Nick. "Existential Risks". Journal of Evolution and Technology 9 (March 2002). http://www.jetpress.org/volume9/risks.html
7. Sandberg, Anders; Ćirković, Milan M. "How can we reduce the risk of human extinction?". Bulletin of the Atomic Scientists, 9 September 2008. http://thebulletin.org/how-can-we-reduce-risk-human-extinction
8. Fiorill, Joe. "Top U.S. Disease Fighters Warn of New Engineered Pathogens but Call Bioweapons Doomsday Unlikely". Global Security Newswire, 29 July 2005.
9. http://www.desdemonadespair.net/2015/06/graph-of-day-carbon-emissions-and-human.html
10. Barker, P. A. "Quaternary climatic instability in south-east Australia from a multi-proxy speleothem record". Journal of Quaternary Science 29.6 (2014): 589–596.
11. Perna, D.; Barucci, M. A.; Fulchignoni, M. "The Near-Earth Objects and Their Potential Threat to Our Planet". Astronomy and Astrophysics Review (2013). https://www.researchgate.net/publication/260829589
12. Alvarez, Luis W. "Experimental evidence that an asteroid impact led to the extinction of many species 65 million years ago". Proceedings of the National Academy of Sciences 80.2 (January 1983): 627–642.
13. Bierwirth, P. N. "Carbon dioxide toxicity and climate change: a serious unapprehended risk for human health" (2016). http://grapevine.com.au/~pbierwirth/co2toxicity.pdf
14. Vitousek, P. M.; Mooney, H. A.; Lubchenco, J.; Melillo, J. M. "Human Domination of Earth's Ecosystems". Science 277.5325 (1997): 494–499; Pimm, S. L. The World According to Pimm: A Scientist Audits the Earth. McGraw-Hill, 2001; "Earth is All Out of New Farmland". The Guardian, 7 December 2005.
15. "Human Population Growth and Extinction". www.biologicaldiversity.org. http://www.biologicaldiversity.org/programs/population_and_sustainability/extinction/
16. "Can we be sure the world's population will stop rising?". BBC News, 13 October 2012. https://www.bbc.co.uk/news/technology-19923200
17. Rosling, Hans. "The best stats you've ever seen". TED Talk, filmed February 2006.
18. Balzani, Vincenzo; Armaroli, Nicola. Energy for a Sustainable World: From the Oil Age to a Sun-Powered Future. John Wiley & Sons, 2010, p. 181. ISBN 978-3-527-63361-6.
19. Carrington, Damian. "Date set for desert Earth". BBC News, 21 February 2000.
20. Moskowitz, Clara. "Earth's Final Sunset Predicted". space.com, 26 February 2008. http://www.space.com/5016-earth-final-sunset-predicted.html
21. Schröder, K.-P.; Connon Smith, R. "Distant future of the Sun and Earth revisited". Monthly Notices of the Royal Astronomical Society 386.1 (2008): 155–163.
22. Chalmers, David. "The singularity: A philosophical analysis". Journal of Consciousness Studies 17 (2010): 9–10. http://consc.net/papers/singularity.pdf
23. Rees, Martin. Our Final Hour: A Scientist's Warning: How Terror, Error, and Environmental Disaster Threaten Humankind's Future in This Century — On Earth and Beyond. 2004. ISBN 0-465-06863-4.
24. Bostrom 2002, section 4.8.
25. Matthews, Robert. "A black hole ate my planet". New Scientist, 28 August 1999.
26. "Statement by the Executive Committee of the DPF on the Safety of Collisions at the Large Hadron Collider". Archived 24 October 2009: https://web.archive.org/web/20091024184048/http://www.aps.org/units/dpf/governance/reports/upload/lhc_saftey_statement.pdf
27. Konopinski, E. J.; Marvin, C.; Teller, Edward. "Ignition of the Atmosphere with Nuclear Bombs". Report LA-602, Los Alamos National Laboratory, 1946 (declassified February 1973). https://fas.org/sgp/othergov/doe/lanl/docs1/00329010.pdf
28. Bostrom 2002, section 4.10.
29. Kluger, Jeffrey. "The Super-Duper, Planet-Frying, Exploding Star That's Not Going to Hurt Us, So Please Stop Worrying About It". Time, 21 December 2012.
30. Tuthill, Peter. "WR 104: Technical Questions". http://www.physics.usyd.edu.au/~gekko/pinwheel/tech_faq.html
31. "EmTech: Get Ready for a New Human Species". MIT Technology Review. https://www.technologyreview.com/s/425804/emtech-get-ready-for-a-new-human-species/
32. Hittinger, John. Thomas Aquinas: Teacher of Humanity. 2015. ISBN 978-1443875547.
33. Gruskin, Sofia; Annas, George J.; Grodin, Michael A. Perspectives on Health and Human Rights. 2005. ISBN 978-0415948067.
34. Miccoli, Anthony. Posthuman Suffering and the Technological Embrace. 2010. ISBN 978-0739126332.
35. "The Transhuman Future: Be More Than You Can Be". NPR. https://www.npr.org/sections/13.7/2014/06/11/320961912/the-transhuman-future-be-more-than-you-can-be
36. "Will You Join the Transhuman Evolution?". DLD Conference. http://www.dld-conference.com/articles/will-you-join-the-transhuman-evolution
37. "How humans are turning into a 'totally different species'". Tech Insider. http://www.techinsider.io/rob-nail-says-humans-are-turning-into-a-new-species-2016-6
38. Warwick, Kevin. I, Cyborg. University of Illinois Press, 2004. ISBN 978-0-252-07215-4.
39. Griffiths, Sarah. "Is technology causing us to 'evolve' into a new SPECIES? Expert believes super humans called Homo optimus will talk to machines and be 'digitally immortal' by 2050". Daily Mail Online.
40. Lynch, Michael. "Mutation and Human Exceptionalism: Our Future Genetic Load". Genetics 202.3 (March 2016): 869–875.
41. Matheny, Jason G. "Reducing the risk of human extinction". Risk Analysis 27.5 (2007): 1335–1344.
42. Wells, Willard. Apocalypse When?. Praxis, 2009. ISBN 978-0387098364.
43. Tonn, Bruce; MacGregor, Donald. "A singular chain of events". Futures 41.10 (2009): 706–714.
44. Yudkowsky, Eliezer. "Cognitive biases potentially affecting judgment of global risks". Global Catastrophic Risks 1 (2008): 86, at p. 114. https://intelligence.org/files/CognitiveBiases.pdf
45. "We're Underestimating the Risk of Human Extinction". The Atlantic, March 2012. https://www.theatlantic.com/technology/archive/2012/03/were-underestimating-the-risk-of-human-extinction/253821/
46. Bostrom, Nick. "Existential Risks: Analyzing Human Extinction Scenarios and Related Hazards" (2001). http://www.nickbostrom.com/existential/risks.pdf
47. "Practical application", p. 39 of the Princeton University paper "Philosophical Implications of Inflationary Cosmology". Archived 12 May 2005: https://web.archive.org/web/20050512134626/http://www.princeton.edu/~jknobe/physics.pdf
48. Somerville, Rose; Somerville, John (introduction). Soviet Marxism and Nuclear War: An International Debate. Greenwood Press, 1981, p. 151. ISBN 978-0-313-22531-4.
49. Goodman, Lisl Marburg; Hoff, Lee Ann. Omnicide: The Nuclear Dilemma. New York: Praeger, 1990. ISBN 978-0-275-93298-5.
50. Landes, Daniel (ed.). Confronting Omnicide: Jewish Reflections on Weapons of Mass Destruction. Jason Aronson, 1991. ISBN 978-0-87668-851-9.
51. Wilcox, Richard Brian. The Ecology of Hope: Environmental Grassroots Activism in Japan. Ph.D. dissertation, Union Institute & University, 2004, p. 55. https://www.researchgate.net/publication/265576395_The_Ecology_of_Hope_Environmental_Grassroots_Activism_in_Japan
52. Jones, Adam. "A Seminal Work on Genocide". Security Dialogue 37.1 (2006): 143–144.
53. Santoni, Ronald E. "Genocide, Nuclear Omnicide, and Individual Responsibility". Social Science Record 24.2 (1987): 38–41.
54. Malik, Tariq. "Stephen Hawking: Humanity Must Colonize Space to Survive". space.com. http://www.space.com/20657-stephen-hawking-humanity-survival-space.html
55. Shukman, David. "Hawking: Humans at risk of lethal 'own goal'". BBC News. https://www.bbc.com/news/science-environment-35344664
56. Hanson, Robin. "Catastrophe, social collapse, and human extinction". Global Catastrophic Risks 1 (2008): 357. https://mason.gmu.edu/~rhanson/collapse.pdf
57. "He imagines a world without people. But why?". The Boston Globe, 18 August 2007.
58. Tucker, Neely. "Depopulation Boom". The Washington Post, 8 March 2008.
59. Barcella, Laura. The End: 50 Apocalyptic Visions from Pop Culture That You Should Know About -- Before It's Too Late. San Francisco: Zest Books, 2012. ISBN 978-0982732250.
60. Dinello, Daniel. Technophobia!: Science Fiction Visions of Posthuman Technology. Austin: University of Texas Press, 2005. ISBN 978-0-292-70986-7.