Re: >H: The next 100 months

From: Xiaoguang Li (xli03@emory.edu)
Date: Thu Oct 21 1999 - 13:08:16 MDT


        here's my understanding of Eliezer's argument for actively
pursuing the creation of a power:

        destruction is easier than construction: given humanity's
lackluster record with past technological advances, nanowar (read:
armageddon) is virtually certain.
        evolution is messy: even if humanity survives the advent of
nanotech and gives rise to transhuman powers, these powers would have no
better chance of being benevolent than powers designed de novo. the
dichotomy -- if transhuman powers retained evolutionary baggage such as
emotional attachments or moral inhibitions, they would be prone to
inconsistencies and self-contradictions and therefore untrustworthy; if
they did not, then initial conditions would be insignificant, and there
would be no functional difference between transhuman powers and powers
that arise otherwise. the caveat -- AI is easier than IA, so this
scenario would require active suppression of technology, the ills of
which are well-known.
        power before nanowar: the creation of a power designed de novo
before nanowar would result in a singularity -- that is, all bets are off
(read: no more house advantage against the survival of humanity).

        now for my doubts. does the creation of a power really increase
the chances of our survival? it seems that the odds of humanity
surviving a singularity are significantly worse than a coin flip. given
the enormous gulf between the creation and its creator, it seems most
likely (> 99%) that humanity would not be significant to a power --
perhaps less significant than virions are to a human being.
        it is true, however, that a fledgling power would probably protect
itself and incidentally prevent a nanowar on its host planet. but what
comes after that? if the power needs to rearrange the planet to provide
substrate for its intelligence, would it pause to consider the parasites
on the planet's surface? what does the term "benign power" really mean?
is this "benign power" supposed to favor the survival of humanity,
against profoundly implausible odds? or do we simply thank our stars if
the power makes its exodus quickly and leaves us untouched?
        indeed, next to dismantling the solar system, the exodus seems a
distinct possibility. however, if the power leaves humanity as it is,
is humanity not then free to pursue the old road toward nanowar and
self-annihilation all over again?
        if the above is true, we arrive at a trichotomy -- 1. the power
may intervene on humanity's behalf; 2. the power may functionally act
against humanity's survival; 3. the power may leave humanity more or less
untouched. the chances of the first are slim to none, the second renders
the question moot, and the third would not improve humanity's odds beyond
current estimates.

        thus, the advantage of creating a power asap does not seem very
obvious. moreover, no better approach appears to be in sight. unless one
or more of the above estimates is significantly revised, humanity's
future looks grim indeed.

xgl


