From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Mon Jan 18 1999 - 00:50:28 MST
Chris Wolcomb wrote:
>
> 3) Those who believe the singularity is not inevitable but want it to come as soon as possible.
That's me.
> As an extropian, I want to shape my future around my desires. Any force which attempts to curb my desires will be met with the strongest resistance possible.
As a mortal, I'd like to see humanity survive in any form at all.
Getting the future to conform perfectly to my desires is ludicrous and
would probably result in eternal living hell, no matter how careful I was.
As a Singularitarian, I want a Singularity to happen, but I don't really
care when it happens as long as it's within the next ten thousand years
or so (a mere moment on a galactic time scale). Of course, I could be
totally wrong about the lack of urgency; it could be that every single
second of delay is more terrible than all the pains of human history.
But the moral scales (and my feeble understanding of How Causality
Actually Works) seem to me to tilt far more towards "maximize
probability" than "minimize time".
The question doesn't arise.
My two allegiances don't come into conflict.
It's Singularity or bust.
Okay, maybe nuclear war would just delay things by a century. So then
we have the same option again - Abort, Retry, Singularity. And again.
And again. Until either intelligent life wipes itself out or creates
its successor.
Okay, so maybe the Singularity will wipe out humanity for spare atoms.
Maybe it won't. I accept both possibilities. If I knew "for certain
sure" it was option one, I'd still be a Singularitarian; but as has been
stated, I'm not sure, so the question of divided allegiance doesn't arise.
I race towards the Singularity because every delay in the Singularity
increases the chance that there will be no human Singularity, ever. And
while I might weep over that for my own peculiar logical reasons, your
reason for weeping is that it means that you and everyone you know and
the entire human race will be extinct.
Screw dynamic optimism. The future isn't a playground, it's a
minefield. Okay, so it's a unique minefield in that blind panic will
get you killed even faster than blind enthusiasm, but blind is blind.
The fact is that shiny new technologies always get used for military
purposes, and that technological advance decreases the amount of cash
needed to wipe out human life. Forgive me, but I don't see any reason
for raving optimism.
> According to several Singularitarians, quite literally nothing - it considers 'me' irrelevant.
I don't know that, and if anyone else says they do, they're mistaken. I
simply offer no guarantees and point out that no guarantees are necessary.
> So why should I accept it or want it? If the Singularitarians are right, then the Singularity will sweep away everything we know in its rapid path to increasing complexity.
Again, this is only a possibility.
> I would much rather spend a few centuries in a place like Iain Banks' Culture than become part of some blind singularity as soon as possible.
Yes, I'd love to spend a few centuries in the Culture, especially the
new revised-history "You can Sublime off any time you get bored"
version. But I don't see any GCUs in Earth's atmosphere, do you?
Evidently this simply isn't an option.
Besides which, the Archive (my own invention; see _Singularity
Analysis_) beats the Culture any day of the week.
> Now, what if I were to start a movement to stop the singularity from occurring? I think I'll call this new political movement - the Anti-Singularity Party. Some might say I'm going down the path to curbing people's freedom - like the freedom to build seed AIs.
I happen to think that nanotechnology could get us all killed. Do you
see me proposing any laws to restrict Zyvex? That trick never works.
Not for me, not for you. Restricting technology is always more
destructive than the technology itself.
> Yet, how is this any different from *them* creating a singularity and forcing me to either be assimilated or weeded out through cybernetic natural selection?
Very simple. We are not forcing you to do a damn thing. We don't have
the right. We are not superintelligent. You are not superintelligent
either and have no more right than I to impose your views on your fellows.
A Culture Mind might be able to search through enough futures to safely
suppress a technology. I can't. The same applies to coercion. I do
not refrain from coercion and suppression for any fundamental moral
reason, but because they ALWAYS end in disaster.
If you're really right about coercion being morally wrong, I'm sure that
our Future Friends will be able to figure it out as easily as you did.
> So what if most living transhumanists do not want to be absorbed into the 'sublime plenum' that is this singularity? I've heard Singularitarians say "too bad, you will be assimilated by the singularity anyway".
If that's how the cards are set up, yes. If not, not. It's not
something I think we can influence via initial conditions.
Look, the human brain is finite. It's got a limited number of states.
Run it long enough and it has to revisit one of them, and from there it
repeats. So you have to die, go into an eternal loop, or Transcend. In
the long run... the really long run... mortality isn't an option.
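(A toy sketch of the pigeonhole point, if you like code better than
words - the update rule and state count here are arbitrary stand-ins I
made up, not a model of anything:

    # Any deterministic system with N states must revisit a state
    # within N + 1 steps, and once it does, it loops forever.
    N = 1000                      # finite number of possible states

    def step(s):                  # arbitrary deterministic update rule
        return (31 * s + 7) % N

    seen = {}                     # state -> first time we saw it
    state, t = 0, 0
    while state not in seen:
        seen[state] = t
        state, t = step(state), t + 1
    print("state", state, "recurs at t =", t,
          "- loop length", t - seen[state])

Swap in any update rule you like; the repeat always shows up within N
steps. A finite brain gets no third option.)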
So why not just go ahead and do it, leave the womb, for good or for
evil, while my grandparents are still alive and before the Earth gets
eaten by grey goo?
Wouldn't it be ironic if you delayed the seed AI, got smothered in goo,
and all along it turned out that the Singularity would have obligingly
provided an environment hedonistic beyond the most fevered imaginings of
the AhForgetIt Tendency?
> Chris Wolcomb.
> GSV Its Only a Matter of Time.
eliezer yudkowsky
the excession
> P.S. Eliezer, we now find ourselves on the opposite sides of a new political spectrum - those of you for the Singularity, and those of us against it. The future is shaping up to be very interesting. :-)
Considering how hard I've been pushing against the opposition, I can't
say I'm displeased to find it coalescing, particularly on my terms.
--
sentience@pobox.com          Eliezer S. Yudkowsky
http://pobox.com/~sentience/AI_design.temp.html
http://pobox.com/~sentience/sing_analysis.html
--
Who on this list is Culture, who is Contact... and who is SC?