Transcript and Presenter's Notes

Title: The Human Importance of the Intelligence Explosion


1
The Human Importance of the Intelligence Explosion
Eliezer Yudkowsky, Singularity Institute for
Artificial Intelligence (singinst.org)
2
"Intelligence explosion"
  • Concept invented by I. J. Good (a famous name in
    Bayesian statistics) in 1965.
  • Hypothesis: The smarter you are, the more
    creativity you can apply to the task of making
    yourself even smarter.
  • Prediction: A positive feedback cycle rapidly
    leading to superintelligence (sketched below).

(Good, I. J. 1965. Speculations Concerning the
First Ultraintelligent Machine. Pp. 31-88 in
Advances in Computers, 6, F. L. Alt and M.
Rubinoff, eds. New York: Academic Press.)
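Good's hypothesis is a feedback loop, so a tiny numerical sketch may help fix the idea. Everything in it (the function, the gain, the superlinear exponent) is an illustrative assumption rather than a model from the talk; it only shows how "smarter makes you better at getting smarter" produces runaway growth.

```python
# Toy model of Good's positive-feedback hypothesis.  Assumption for
# illustration only: each round of self-improvement yields gains that grow
# faster than linearly with current capability (exponent > 1).
def recursive_improvement(capability=1.0, gain=0.05, exponent=1.5, rounds=40):
    trajectory = [capability]
    for _ in range(rounds):
        capability += gain * capability ** exponent  # smarter -> faster gains
        trajectory.append(capability)
    return trajectory

trajectory = recursive_improvement()
print(f"capability after {len(trajectory) - 1} rounds: {trajectory[-1]:,.1f}")
```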
3
The intelligence explosion hypothesis does not imply,
nor require:
  • That more change occurred from 1970 to 2000 than
    from 1940 to 1970.
  • That technological progress follows a predictable
    curve.
  • That "Real AI" is even possible! (An intelligence
    explosion could happen with augmented humans.)

4
"Book smarts" vs. cognition
  • "Book smarts" evokes images of
  • Calculus
  • Chess
  • Good recall of facts
  • Other stuff that happens in the brain
  • Social persuasion
  • Enthusiasm
  • Reading faces
  • Rationality
  • Strategic ability

5
The scale of intelligent minds: a parochial view.
(Diagram: a scale running from "village idiot" to "Einstein".)
6
The scale of intelligent minds: a parochial view.
(Diagram: the same "village idiot"-to-"Einstein" scale, now contrasted with "a more cosmopolitan view".)
7
The power of intelligence
  • Fire
  • Language
  • Nuclear weapons
  • Skyscrapers
  • Spaceships
  • Money
  • Science

8
One of these things is not like the other...
  • Space travel
  • Extended lifespans
  • Artificial Intelligence
  • Nanofactories

9
Intelligence
  • The most powerful force in the known universe -
    we see its effects every day.
  • The most confusing question in today's science -
    ask ten scientists, get ten answers.
  • Not a complete mystery: there is a huge library of
    knowledge about mind / brain / cognition, but it is
    scattered across dozens of different fields!

10
  • If I am ignorant about a phenomenon,
  • this is a fact about my state of mind,
  • not a fact about the phenomenon.
  • Confusion exists in the mind, not in reality.
  • There are mysterious questions.
  • Never mysterious answers.
  • (Inspired by Jaynes, E. T. 2003. Probability
    Theory: The Logic of Science. Cambridge:
    Cambridge University Press.)

11
For more about intelligence
  • Go to http://singinst.org/
  • (Or google "Singularity Institute")
  • Click on "Summit Notes"
  • Lecture video, book chapters

12
The brain's biological bottleneck
  • Neurons run at ~100 Hz
  • No read access
  • No write access
  • No new neurons
  • Existing code not human-readable

13
Relative difficulty
  • Build a Boeing 747 from scratch.
  • Versus: starting with a bird,
    • modify the design to create a 747-sized bird,
    • that actually flies,
    • as fast as a 747,
    • then migrate the actual living bird to the new design,
    • without killing the bird or making it very unhappy.

14
The AI Advantage (for self-improvement)
  • Total read/write access to own state
  • Absorb more hardware (possibly orders of
    magnitude more!)
  • Understandable code
  • Modular design
  • Clean internal environment

15
Biological bottleneck (for serial speed)
  • Light speed is >10^6 times faster than signals in
    axons and dendrites.
  • A synaptic spike dissipates >10^6 times the
    theoretical minimum heat (though today's
    transistors do worse).
  • Transistor clock speeds are >>10^6 times faster than
    neuron spiking frequencies (rough ratios sketched below).
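A back-of-the-envelope check of those ratios, using order-of-magnitude figures that are my illustrative assumptions (the axon speed, firing rate, and clock rate below are not numbers taken from the slide):

```python
# Rough arithmetic behind the ">10^6" claims; all input figures are
# order-of-magnitude assumptions chosen for illustration.
SPEED_OF_LIGHT_M_PER_S = 3.0e8   # signal speed available in principle
AXON_SIGNAL_M_PER_S    = 1.0e2   # fast myelinated axon, roughly 100 m/s
TRANSISTOR_CLOCK_HZ    = 3.0e9   # a commodity ~3 GHz processor
NEURON_SPIKE_HZ        = 1.0e2   # roughly 100 Hz peak firing rate

print(f"light speed / axon speed: {SPEED_OF_LIGHT_M_PER_S / AXON_SIGNAL_M_PER_S:.0e}")  # ~3e+06
print(f"clock rate / spike rate:  {TRANSISTOR_CLOCK_HZ / NEURON_SPIKE_HZ:.0e}")         # ~3e+07
```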

16
  • It is physically possible to build a brain at least
    1,000,000 times as fast as the human brain,
  • even without shrinking the brain, lowering its
    temperature, quantum computing, etc.
  • Drexler's Nanosystems argues a sensorimotor speedup
    of >>10^6 is also possible.
  • At that speed, 1 subjective year passes in about
    31 seconds (worked out below).
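The "1 year to 31 seconds" line is plain unit arithmetic at the slide's assumed 10^6-fold speedup; a one-line check:

```python
# At a 10^6-fold speedup (the slide's figure), one subjective year elapses
# in roughly 31.5 wall-clock seconds.
SECONDS_PER_YEAR = 365 * 24 * 3600   # 31,536,000 s
SPEEDUP = 1e6
print(f"1 subjective year = {SECONDS_PER_YEAR / SPEEDUP:.1f} real seconds")  # 31.5
```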

17
10,000 years to nanotech? (For a superintelligence:)
  • Solve a chosen special case of protein folding.
  • Order custom proteins from online labs with
    72-hour turnaround time.
  • The proteins self-assemble into a primitive device
    that takes acoustic instructions.
  • Use it to build 2nd-stage nanotech, 3rd-stage
    nanotech, etc.
  • Total time: 10,000 years → about 4 days (a rough
    tally follows below).
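A hedged tally of that scenario. Only the 72-hour lab turnaround comes from the slide; the other durations are assumptions invented here so the arithmetic lands near the quoted figure of roughly 4 days:

```python
# Illustrative timeline for the fast-bootstrap scenario; only the 72-hour
# turnaround is the slide's own number, the rest are assumed for illustration.
steps_hours = {
    "solve the chosen special case of protein folding (done in advance)": 0,
    "order custom proteins (online lab turnaround)": 72,
    "proteins self-assemble into an acoustic-controlled device": 12,
    "use the device to build 2nd- and 3rd-stage nanotech": 12,
}
print(f"total: about {sum(steps_hours.values()) / 24:.0f} days")  # about 4 days
```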

18
Respect the power of creativity, and be careful
what you call "impossible".
19
(Timeline diagram: hunter-gatherers, agriculture, Ancient Greeks,
Renaissance, Industrial Revolution, Electrical Revolution, the Nuclear,
Space, Computer, Biotech, and Internet Revolutions, molecular
nanotechnology.)
20
(Diagram: the same technology timeline, contrasted ("vs.") with a scale
of kinds of minds: bees, chimps, hunter-gatherers, the Internet.)
21
Can an intelligence explosion be avoided?
  • Self-amplifying once it starts to tip over.
  • Very difficult to avoid in the long run.
  • But many possible short-term delays.
  • Argument: A human-level civilization occupies an
    unstable state; it will eventually wander into
    either a superintelligent region or an extinct
    region (a toy random-walk sketch follows below).
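That bullet is an absorbing-states argument, which a toy random walk can make concrete. The walk, its bounds, and the seed below are illustrative assumptions, not a model from the talk; the only point is that a bounded, undirected walk is eventually absorbed at one boundary or the other.

```python
import random

# Toy random walk: capability drifts undirectedly between an "extinct" floor
# and a "superintelligent" ceiling, and is eventually absorbed at one of them.
# All numbers are illustrative assumptions.
def wander(start=50, floor=0, ceiling=100, seed=None):
    rng = random.Random(seed)
    level, steps = start, 0
    while floor < level < ceiling:
        level += rng.choice([-1, 1])  # no arrow of progress assumed
        steps += 1
    return ("extinct" if level <= floor else "superintelligent"), steps

outcome, steps = wander(seed=2007)
print(f"absorbed into the {outcome} region after {steps} steps")
```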

22
Fallacy of the Giant Cheesecake
  • Major premise: A superintelligence could create
    a mile-high cheesecake.
  • Minor premise: Someone will create a recursively
    self-improving AI.
  • Conclusion: The future will be full of giant
    cheesecakes.
  • Power does not imply motive.

24
Spot the missing premise
  • A sufficiently powerful AI could wipe out
    humanity.
  • Therefore we should not build AI.
  • A sufficiently powerful AI could develop new
    medical technologies and save millions of lives.
  • Therefore, build AI.

25
Spot the missing premise
  • A sufficiently powerful AI could wipe out
    humanity.
  • And the AI would decide to do so.
  • Therefore we should not build AI.
  • A sufficiently powerful AI could develop new
    medical technologies and save millions of lives.
  • And the AI would decide to do so.
  • Therefore, build AI.

26
Design space of minds-in-general
(Diagram: a space of possible minds with regions labeled "Bipping AIs",
"Freepy AIs", "Gloopy AIs", and "all human minds".)
27
AI isn't a prediction problem, it's an
engineering problem. We have to reach into mind
design space, and pull out a mind such that we're
glad we created it...
28
AI isn't a prediction problem, it's an
engineering problem. We have to reach into mind
design space, and pull out a mind such that we're
glad we created it... The challenge is difficult
and technical!
29
"Do not propose solutions until the problem has
been discussed as thoroughly as possible without
suggesting any." -- Norman R. F. Maier"I have
often used this edict with groups I have led -
particularly when they face a very tough problem,
which is when group members are most apt to
propose solutions immediately." -- Robyn
Dawes(Dawes, R.M. 1988. Rational Choice in an
Uncertain World. San Diego, CA Harcourt, Brace,
Jovanovich.)
30
What kind of AI do we want to see? Much easier
to describe AIs we don't want to see...
31
"Friendly AI"...(the challenge of creating an
AIthat, e.g., cures cancer, ratherthan wiping
out humanity)...looks possible but very
difficult.
32
The intelligence explosion: enough power
to... make the world a better place?
33
Someday, the human species has to grow up. Why
not sooner rather than later?
34
In a hundred million years, no one's going to
care who won the World Series, but they'll
remember the first AI.
35
For more information, please visit the
Singularity Institute at http://singinst.org/