1
Existential Risks and Artificial Intelligence
Nick Bostrom, Director, Future of Humanity
Institute, Oxford University
2
Risk
  • Scope
  • Intensity
  • Probability

3
[Chart: two dimensions of risk]
Scope (axis): Personal, Local, Global, Trans-generational, (Cosmic?)
Intensity (axis): Imperceptible, Endurable, Terminal, (Hellish?)
4
[Scope/intensity chart (axes as above), first examples placed in cells]
Trans-generational: loss of one species of beetle (imperceptible)
Global: global warming by 0.01 °C (imperceptible)
Local: congestion from one extra vehicle (imperceptible); recession in a country (endurable); genocide (terminal)
Personal: loss of one hair (imperceptible); car is stolen (endurable); fatal car crash (terminal)
5
[Same chart, adding: drastic loss of biodiversity (trans-generational, endurable); thinning of ozone layer (global, endurable); one cell still marked "?"]
6
[Same chart, with aging (global, terminal) filling the cell previously marked "?"]
7
[Same chart, labelling the global-scope, terminal-intensity region "global catastrophic risks"]
8
[Complete chart, now labelling the trans-generational, terminal-intensity cell "existential risks"]
Trans-generational: loss of one species of beetle (imperceptible); drastic loss of biodiversity (endurable); existential risks (terminal); ?
Global: global warming by 0.01 °C (imperceptible); thinning of ozone layer (endurable); aging, global catastrophic risks (terminal)
Local: congestion from one extra vehicle (imperceptible); recession in a country (endurable); genocide (terminal)
Personal: loss of one hair (imperceptible); car is stolen (endurable); fatal car crash (terminal)
(Scope beyond trans-generational marked "(Cosmic?)"; intensity beyond terminal marked "(Hellish?)")
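The grid above, together with the definition on the next slide, can be captured in a few lines. This is a minimal sketch of one reading of the chart; the cell placements and threshold logic are my interpretation, not part of the slides:

```python
SCOPES = ["personal", "local", "global", "trans-generational"]   # ascending scope
INTENSITIES = ["imperceptible", "endurable", "terminal"]          # ascending intensity

def classify(scope: str, intensity: str) -> str:
    """Place a risk in the scope/intensity grid and name the region it falls in."""
    s = SCOPES.index(scope)
    i = INTENSITIES.index(intensity)
    if i == INTENSITIES.index("terminal"):
        if s >= SCOPES.index("trans-generational"):
            return "existential risk"
        if s >= SCOPES.index("global"):
            return "global catastrophic risk"
    return "ordinary risk"

# Examples taken from the chart:
assert classify("personal", "terminal") == "ordinary risk"            # fatal car crash
assert classify("global", "terminal") == "global catastrophic risk"   # aging
assert classify("trans-generational", "terminal") == "existential risk"
```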
9
Existential risk
  • One where an adverse outcome would either
    annihilate Earth-originating intelligent life or
    permanently and drastically curtail its
    potential.

(2002) "Existential Risks: Analyzing Human
Extinction Scenarios," J. Evol. Tech., Vol. 9.
10
Our experience
  • Dangerous animals, hostile tribes and
    individuals, poisonous foods, automobile
    accidents, Chernobyl, Bhopal, volcano eruptions,
    earthquakes, droughts, tsunamis, wars, epidemics
    of influenza, smallpox, black plague, and AIDS.

11
Our experience
  • Dangerous animals, hostile tribes and
    individuals, poisonous foods, automobile
    accidents, Chernobyl, Bhopal, volcano eruptions,
    earthquakes, droughts, tsunamis, wars, epidemics
    of influenza, smallpox, black plague, and AIDS.
  • These types of disaster have occurred many times
    throughout history.

12
Our experience
  • Dangerous animals, hostile tribes and
    individuals, poisonous foods, automobile
    accidents, Chernobyl, Bhopal, volcano eruptions,
    earthquakes, droughts, tsunamis, wars, epidemics
    of influenza, smallpox, black plague, and AIDS.
  • These types of disaster have occurred many times
    throughout history.
  • Our attitudes towards risk have been shaped by
    trial and error as we have been trying to cope
    with such risks.

13
Our experience
  • Dangerous animals, hostile tribes and
    individuals, poisonous foods, automobile
    accidents, Chernobyl, Bhopal, volcano eruptions,
    earthquakes, droughts, tsunamis, wars, epidemics
    of influenza, smallpox, black plague, and AIDS.
  • These types of disaster have occurred many times
    throughout history.
  • Our attitudes towards risk have been shaped by
    trial and error as we have been trying to cope
    with such risks.
  • Even the worst of those catastrophes were mere
    ripples on the surface of the great sea of life.

14
Some recent opinions
  • 50%. Professor Sir Martin Rees, President of
    the Royal Society
  • 30% (for the next five centuries). Professor
    John Leslie
  • "Significant." Judge Richard Posner
  • Not less than 25%. Dr. Nick Bostrom
  • Some others who are concerned:
  • Bill Joy, Eric Drexler, Eliezer Yudkowsky, ...

15
Anthropogenic vs. Non-anthropogenic risks
  • Anthropogenic: originates from human activity
  • Non-anthropogenic: all the rest

16
Anthropogenic vs. Non-anthropogenic risks
  • Anthropogenic: originates from human activity
  • Non-anthropogenic: all the rest
  • The real issue is anthropogenic existential risk

17
Types of existential risk
  • Bangs: Earth-originating intelligent life goes
    extinct in a relatively sudden disaster.
  • Crunches: Humanity's potential to develop into
    posthumanity is permanently lost, although human
    life continues in some form.
  • Shrieks: A limited form of posthumanity is
    durably attained, but it is an extremely narrow
    band of what is possible and desirable.
  • Whimpers: A posthuman civilization is temporarily
    attained, but it evolves in a direction that leads
    gradually to either the complete disappearance of
    things we value or to a state where those things
    are realized to only a minuscule degree of what
    could have been achieved.

18
Bangs
  • Nanotechnological weapons system
  • Badly programmed superintelligence
  • We are living in a simulation and it gets shut
    down
  • Nuclear holocaust
  • Biological weapon
  • Nanotechnology non-weapons accident
  • Natural pandemic
  • Runaway global warming
  • Supervolcano eruptions
  • Physics disasters
  • Impact hazards (asteroids and comets)
  • Space radiation (solar flares, supernovae, black
    hole explosions or mergers, gamma-ray bursts,
    galactic center outbursts, etc.)

(2003) "Are You Living in a Computer
Simulation?" Phil. Quart., Vol. 53, No. 211, pp.
243-255.
N. Bostrom & M. Tegmark (2005) "How Unlikely is
a Doomsday Catastrophe?" Nature, Vol. 438, No.
7069.
19
Crunches
  • Resource depletion or ecological destruction
  • Misguided world government or another static
    social equilibrium stops technological progress
  • Dysgenic pressures
  • Technological arrest
  • Social collapse

(2004) "The Future of Human Evolution," in Death
and Anti-Death, ed. Charles Tandy (Ria University
Press: Palo Alto, California, 2004), pp. 339-371.
20
Shrieks
  • Flawed superintelligence
  • Repressive totalitarian global regime
  • Take-over by a transcending upload

21
Whimpers
  • Our potential or even our core values are eroded
    by evolutionary development and/or
    self-modification
  • Killed by an extraterrestrial civilization
  • Loss of human fertility/escapism

22
Biases galore?
Is it more likely that a randomly chosen English word
starts with an R ("rope"), or that R is its third letter ("park")?
  • Good story bias? (availability heuristic)
  • Scope neglect
  • Calibration and overconfidence problems
  • Bystander apathy

(2,000 / 20,000 / 200,000) migrating birds die
each year by drowning in uncovered oil ponds,
which the birds mistake for bodies of water.
These deaths could be prevented by covering the
oil ponds with nets. How much money would you be
willing to pay to provide the needed
nets? Result: $80 for the 2,000-bird group, $78
for 20,000 birds, and $88 for 200,000 birds.
(Desvousges et al. 1993.)
Alpert and Raiffa (1982) asked subjects a
collective total of 1000 general-knowledge
questions; 426 of the true values lay outside the
subjects' 98% confidence intervals. Events to which
subjects assigned a probability of 2% happened
42.6% of the time.
23
Which difference is largest?
  • A: Disaster avoided
  • B: Disaster that kills 99% of humanity
  • C: Disaster that kills 100% of humanity
  • Is the difference in badness between A and B
    greater than the difference between B and C?

24
Which difference is largest?
  • A: Disaster avoided
  • B: Disaster that kills 99% of humanity
  • C: Disaster that kills 100% of humanity
  • Is the difference in badness between A and B
    greater than the difference between B and C?
  • If yes, then a 1 percentage point reduction of
    existential risk is worth circa 60 million lives.

25
Which difference is largest?
  • A: Disaster avoided
  • B: Disaster that kills 99% of humanity
  • C: Disaster that kills 100% of humanity
  • Is the difference in badness between A and B
    greater than the difference between B and C?
  • If yes, then a 1 percentage point reduction of
    existential risk is worth circa 60 million lives
    (worked check below).
  • If no, ...

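A worked check of the "circa 60 million lives" figure. This is a sketch under an assumption the slide does not state: a world population of roughly 6 billion, about right at the time of the talk.

```python
# If the B-to-C step adds little badness beyond the lives lost in B, then the
# badness of the disaster is roughly the ~6e9 lives it would take. Cutting its
# probability by one percentage point then saves about 1% of that in expectation.
world_population = 6e9                     # assumption, not stated on the slide
expected_lives_saved = 0.01 * world_population
print(f"{expected_lives_saved:,.0f} expected lives saved")   # 60,000,000
```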
26
Opportunity costs in technological delays
  • Virgo Supercluster: ~10^13 stars.
  • Computing power extractable from a star, with an
    associated planet-sized computational structure
    using advanced molecular nanotechnology: ~10^42
    ops/sec.
  • A typical estimate of the human brain's
    processing power: ~10^17 ops/sec.
  • Not much more seems to be needed to simulate the
    relevant parts of the environment in sufficient
    detail to enable the simulated minds to have
    experiences indistinguishable from typical
    current human experiences.
  • Ergo, the potential for approximately 10^38 human
    lives is lost every century that colonization of
    our local supercluster is delayed.
  • Equivalently, about 10^29 potential human lives
    per second (see the arithmetic sketch below).

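A re-derivation of these figures, using only the slide's own estimates; the seconds-per-century conversion and the convention of counting one mind-century as roughly one life are mine.

```python
stars = 1e13                   # stars in the Virgo Supercluster
ops_per_star = 1e42            # ops/sec from one star-powered structure
ops_per_mind = 1e17            # ops/sec for one human-level mind

minds = stars * ops_per_star / ops_per_mind       # ~1e38 simultaneous minds
seconds_per_century = 100 * 365.25 * 24 * 3600    # ~3.2e9 seconds

# Counting one century of one mind as roughly one human life, a century of
# delay forgoes ~1e38 lives; per second that is a few times 1e28, i.e. ~1e29.
lives_per_century = minds
lives_per_second = lives_per_century / seconds_per_century
print(f"{lives_per_century:.0e} lives/century, {lives_per_second:.0e} lives/second")
```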
27
A more conservative estimate
  • Don't assume non-biological instantiation of the
    potential persons.
  • Suppose that about 10^10 biological humans could
    be sustained around an average star.
  • Then the Virgo Supercluster could contain 10^23
    biological humans.
  • This corresponds to a loss of potential equal to
    about 10^14 potential human lives per second of
    delayed colonization.
  • Even with this conservative estimate, the potential
    for one hundred trillion human beings is lost for
    every second of postponement of colonization of
    our supercluster (see the sketch below).

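The same check for the conservative, purely biological estimate:

```python
stars = 1e13                                      # stars in the Virgo Supercluster
humans_per_star = 1e10                            # biological humans per star
seconds_per_century = 100 * 365.25 * 24 * 3600    # ~3.2e9 seconds

humans = stars * humans_per_star                  # ~1e23 biological humans
# Counting one century of one person as roughly one life:
lives_per_second = humans / seconds_per_century   # ~3e13, which the slide rounds to ~1e14
print(f"{humans:.0e} humans, {lives_per_second:.0e} lives per second of delay")
```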
28
Safety beats speed!
  • Lifespan of galaxies is measured in billions of
    years.
  • Time-scale of any realistic delays is measured in
    years or decades.
  • Therefore, consideration of risk trumps
    consideration of opportunity cost.
  • A single percentage point of reduction of
    existential risks would be worth (from a
    utilitarian expected-utility point of view) a
    delay of over 10 million years (see the sketch
    below).

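An order-of-magnitude version of the trade-off; the billion-year figure is my assumption for the low end of "billions of years", and everything else follows from it.

```python
# If the surviving future lasts at least T years, delaying it by D years costs
# roughly D/T of its value, while cutting existential risk by one percentage
# point gains 1% of its expected value. The risk cut therefore beats any delay
# shorter than about 0.01 * T.
T_years = 1e9                                  # assumed minimum duration of the future
break_even_delay_years = 0.01 * T_years
print(f"{break_even_delay_years:.0e} years")   # 1e+07: over 10 million years
```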
29
A rule of thumb for utilitarians
  • The Maxipok rule:
  • Maximize the probability of an "okay outcome,"
    where an "okay outcome" is any outcome that
    avoids existential disaster.

(2003) "Astronomical Waste: The Opportunity Cost
of Delayed Technological Development," Utilitas,
Vol. 15, No. 3, pp. 308-314.
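A toy illustration (my construction, not from the talk) of why Maxipok tracks expected utility for a utilitarian: when the value of any okay outcome dwarfs the differences between okay outcomes, ranking actions by the probability of an okay outcome agrees with ranking them by expected utility.

```python
V_OKAY = 1e38          # rough value of any okay (non-existential-disaster) outcome
V_DISASTER = 0.0       # value if existential disaster occurs

def expected_utility(p_okay: float, okay_bonus: float) -> float:
    """okay_bonus: how much better this action's okay outcome is than baseline."""
    return p_okay * (V_OKAY + okay_bonus) + (1 - p_okay) * V_DISASTER

safe_but_slow = expected_utility(p_okay=0.99, okay_bonus=0.0)
fast_but_risky = expected_utility(p_okay=0.98, okay_bonus=1e30)  # large intra-okay gain

# The one-point difference in P(okay) dominates even a huge intra-okay bonus.
assert safe_but_slow > fast_but_risky
```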
30
Policy implications?
  • More research into methodology
  • More research into specific risk categories
  • Build institutions and scientific/policy
    communities concerned with existential risks
  • Specific efforts to reduce specific threats
  • Pandemic surveillance
  • NEO surveillance
  • Nanotech safety and defense design-ahead?
  • Friendly AI research
  • Foster peace, cooperation, and frameworks for
    international coordination
  • Differential technological development?

31
- END -