From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Thu Mar 25 1999 - 10:59:08 MST
den Otter wrote:
>
> Not necessarily. Not all of us anyway.
The chance that some humans will Transcend, and have their selves
preserved in that Transcendence, while others die in the Singularity, is
effectively zero. (If your self is preserved, you wouldn't kill off
your fellow humans, would you?) We're all in this together. There are
no differential choices between humans.
> Conclusion: we need a (space) vehicle that can move us out of harm's
> way when the trouble starts. Of course it must also be able to
> sustain you for at least 10 years or so. A basic colonization of
> Mars immediately comes to mind. Perhaps a scaled-up version
> of Zubrin's Mars Direct plan. Research aimed at uploading must
> continue at full speed of course while going to, and living on, Mars
> (or another extra-terrestrial location).
Impractical. Probability effectively zero. At absolute most you might
hope for an O'Neill colony capable of supporting itself given nanotech.
Besides, since only *you* are going to Transcend, why should *I* help
you build a Mars colony?
> Btw, you tend to overestimate the dangers of nanotech and
> conventional warfare (fairly dumb tech in the hands of fairly dumb
> people), while underestimating the threat of Powers (intelligence
> beyond our wildest dreams). God vs monkeys with fancy toys.
Let us say that I do not underestimate the chance of a world in which
neither exists, to wit, as close to zero as makes no difference. Given
a choice between ravenous goo and a Power, I'll take my chances on the
benevolence of the Power. "Unacceptable" my foot; the probability
exists and is significant, *unlike* the probability of the goo deciding
not to eat you.
> Any kind of Power which isn't you is an unacceptable threat,
> because it is completely unpredictable from the human pov.
> You are 100% at its mercy, as you would be if God existed.
> So, both versions are undesirable.
So only one human can ever become a Power. By golly, let's all start
sabotaging each other's efforts!
Sheesh. There's a reason why humans have evolved an instinct for altruism.
> We _must_ be choosy. IMHO, a rational person will delay the Singularity
> at (almost?) any cost until he can transcend himself.
If AI-based Powers are hostile, it is almost certain, from what I know
of the matter, that human-based Powers will be hostile as well. So only
the first human to Transcend winds up as a Power. So your a priori
chance of Transcending under these assumptions is one in six billion,
and no more than one person can get the big prize. So you'll try to sabotage
all the Singularity efforts, and they'll try to sabotage you. A snake-pit.
If only one human can ever become a Power, your chance of being that
human cannot possibly exceed one in a hundred. Combined with the fact
that AI transcendence will be possible far earlier, technologically
speaking, and that delaying the Singularity greatly increases the
probability of a killing war, and that a Power version of you might be
utterly unidentifiable as being human in origin, I think that the other
branches of probability - in which AI Powers are our *good friends* and
upgrade us *gently* - outweigh the probability of making it alone.
In other words, almost regardless of the relative probability of AI
hostility and AI benevolence, you have a better absolute chance of
getting whatever you want if you create an AI Power as fast as possible.
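To make that arithmetic explicit, here is a minimal sketch of the comparison. The only number taken from the argument above is the one-in-a-hundred ceiling on winning the race; the other probabilities are placeholder assumptions I'm inventing purely for illustration.

    # Illustrative expected-value comparison for the argument above.
    # Only the 1% ceiling comes from the post; the rest are assumed
    # placeholders for the sake of the sketch.

    p_win_race       = 0.01   # upper bound on being the one human who Transcends
    p_ai_benevolent  = 0.30   # assumed chance an AI Power turns out friendly
    p_war_if_delayed = 0.50   # assumed chance a delayed Singularity ends in a killing war

    # Strategy A: delay the Singularity and try to Transcend alone.
    p_survive_alone = p_win_race * (1 - p_war_if_delayed)

    # Strategy B: build an AI Power as fast as possible and hope for benevolence.
    p_survive_ai = p_ai_benevolent

    print(f"go it alone:  {p_survive_alone:.3f}")
    print(f"build the AI: {p_survive_ai:.3f}")
    # As long as the chance of AI benevolence exceeds the hard 1% ceiling
    # on winning the race, Strategy B dominates regardless of the exact
    # numbers plugged in, which is the point of the paragraph above.

Plug in whatever probabilities you like; the conclusion only flips if you think the chance of AI benevolence is below the ceiling on your chance of being the sole Power.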
--
sentience@pobox.com    Eliezer S. Yudkowsky
http://pobox.com/~sentience/AI_design.temp.html
http://pobox.com/~sentience/singul_arity.html
Disclaimer: Unless otherwise specified, I'm not telling you everything I think I know.