Singularity, Breaker of Dreams

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Mon Sep 07 1998 - 00:06:05 MDT


From Vernor Vinge, foreword to "True Names And Other Dangers":

      "I wanted interstellar empires (interplanetary ones at the
      least).  I wanted supercomputers and artificial
      intelligence and effective immortality.  All seemed
      possible, yet there were inescapable consequences of
      unbridled optimism..."

A good philosophy is one that contradicts you, rather than conforming
to your wishes. When this happens, it means that your philosophy runs
on its own rails, that it has logic and integrity, rather than being a
means of rationalization. Vinge, growing up in the 1950s, wanted
interstellar empires, and I, growing up in the 80s and 90s with a copy
of "Great Mambo Chicken and the Transhuman Condition", wanted to walk,
uploaded, through my favorite worlds of SF and fantasy; to build my
own planet, make my mind faster and free from pain, explore the galaxy
with effective immortality, and so on. My dreams had a shiny
cyberspace finish, but they were no more imaginative. (Although
one rather wonders what will happen to my little brother, growing up
with a copy of "The Spike".)

Then I happened to take a copy of "True Names" out of the library, and
my world was altered. In paragraph two of page 47, I read the
sentence: "When this happens, human history will have reached a kind
of singularity - a place where extrapolation breaks down and new
models must be applied - and the world will pass beyond our
understanding."
The first thing I thought was "Yes, he's right," and the next thing
was, "I now know how I will be spending the rest of my life." A few
months later, I published "Staring Into The Singularity" and got large
chunks of it wrong.

I knew about the Singularity, I had a new dream, and I'd even lost the
old dreams, but this new dream was suspiciously malleable to my own
desires. I was sure that "the Powers will be ethical", and that this
meant I would survive and be upgraded to a Power. I did spend a lot
of time telling people to worry about getting to the Singularity,
instead of detailing the Utopia that would come afterwards. And I was
particularly scathing about certain failures of imagination. I'd
made progress; I'd increased the imaginativeness and power of my
philosophy, but it was still obeying me instead of vice versa.

Now I am still sure that the Powers will be ethical, but I am no
longer sure that this precludes taking us apart for spare atoms. I no
longer think that our continued survival has to threaten the Powers
for us to be erased; I am now willing to accept that simple efficiency
may require it. I am willing to accept that life may be meaningless.
I am willing to accept that the only reward for all my service will be
a painful death, for myself, for those I love, and for the entire
human race. Only when one can accept all possibilities is one ready
to choose between them.

The power of the Singularitarian philosophy is that it draws on
concepts with more force than our own desires. Over time, over years,
it corrodes away our rationalizations. And above all, it presents an
emotionally and rationally acceptable course of action, even after all
the darkest alternatives are accepted. It really doesn't matter what
the relative probabilities are. Either life has meaning, or it
doesn't; either human life has meaning, or it doesn't; either we'll be
upgraded, or we'll die. So we'll die? All the other generations have
died. It doesn't become some major tragedy just because it happens
to me instead of somebody else. So humanity will die? In a billion
years, it is
certain that humanity will die. Is it so horrible if humanity dies
giving birth to something greater, giving meaning to all our dead
ancestors? Sooner or later, some generation will face the choice
between Singularity and extinction. Why push it off, even if we
could? And besides, we might not die at all.

Because my best-guess dedication to the Singularity is fairly
unaffected by the above probabilities varying between 0% and 90%, I
can evaluate arguments for and against, calmly and without worry. I
can accept that every possibility might be real. The Singularity,
through it all, is the only sane way to go. In accepting every
possibility, I can also accept the dictates of my own philosophy; I
don't need to distort it to avoid "unacceptable" outcomes.

We must each lose our dreams in order to grow, but not in despair.  We
must abandon the small dreams of childhood, but without abandoning the
ability to dream.  For there are two ways in which a dream may be
broken; by the death of hope, or by a greater dream.  To acknowledge
that we do not command the future is not to say that we do not make
it; and when we tear our eyes away from our yearnings, we may look
upward to the sun, forward to tomorrow.

Forward to the day when humanity awakens from its dream, to the
ultimate shattering of Maya.  To the day when the greatest hacker of
them all compiles the very last line of code, and looks out at an
early morning sky, knowing that he looks on the Final Dawn.  To the
day when it is said, in the tradition of Oppenheimer who looked upon
another sun:  "I am become Singularity, Breaker of Dreams."

-- 
         sentience@pobox.com        Eliezer S. Yudkowsky
          http://pobox.com/~sentience/singul_arity.html
           http://pobox.com/~sentience/alger_non.html
Disclaimer:  Unless otherwise specified, I'm not telling you
everything I think I know.

