Existential Risks: Analyzing Human Extinction Scenarios and Related Hazards
From: Nick Bostrom (nick@nickbostrom.com)
Date: Sat May 12 2001 - 01:50:18 MDT
I now have a presentable version of the paper I presented at the most
recent Foresight gathering. It's available in two formats:
http://www.nickbostrom.com/existential/risks.html
http://www.nickbostrom.com/existential/risks.doc
Footnotes and formatting are nicer in the M$-version.
(Spoiler warning for those planning to attend the TransVision conference
in Berlin, where I'll be giving this paper again.)
ABSTRACT
Because of accelerating technological progress, humankind may be
rapidly approaching a critical phase in its career. The prospects of
nanotech systems and machine intelligence present us with unprecedented
opportunities and risks. Our future, and whether we will have a future at
all, may well be determined by how these radically transforming
technologies are managed. A better understanding of the transition
dynamics from a human to a “posthuman” society is needed in order to plot
approaches that maximize the probability of a favorable outcome. Of
particular importance is to know where the pitfalls are: the ways in
which things could go disastrously wrong. While we have had long exposure
to various personal, local, or endurable global hazards, this paper
analyzes a rather recently emerging category: that of existential
risks. These are threats that endanger the survival of intelligent
life or that could ruin the potential of human civilization for all time
to come. Some of these threats are relatively well known while others
(including some of the gravest) have gone almost unrecognized.
Existential risks have a cluster of features that make ordinary risk
management ineffective. A clearer understanding of the threat picture
will enable us to formulate better strategies. Several implications
for policy and ethics are also discussed.
Nick Bostrom
Department of Philosophy
Yale University
Homepage:
http://www.nickbostrom.com