Re: >H Re: transhuman-digest V1 #562

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sun Jan 09 2000 - 10:07:09 MST


"D.den Otter" wrote:
>
> How can you be so sure that this won't be the case? In the
> absence of hard data either way, we must assume a 50% chance
> of AIs causing our extinction.

That's good enough for me! What changed your mind?

> Oh golly, we've just been
> reduced to a second-rate life form. We no longer control
> our planet. We're at the mercy of hyperintelligent machines.
> Yeah, that's something to be excited about...

Sooner or later, we're gonna either toast the planet or come up against
something smarter than we are. You know that. We've agreed on that.
Your sole point of disagreement is that you believe that you'll be
better off if *you're* the first Power. But Otter, that's silly. If
transforming me into a Power might obliterate my tendency to care about
the welfare of others, it has an equal chance of obliterating my
tendency to care about myself. If someone else, on becoming a Power,
might destroy you, then you yourself, on becoming a Power, might
overwrite yourself with some type of optimized being or mechanism. You
probably wouldn't care enough to preserve any kind of informational or
even computational continuity. Both of these theories - unaltruism and
unselfishness - are equally plausible, and learning that either one was
the case would greatly increase the probability of the other.

So, given that there's also a 50% chance that the Powers are nice guys,
or that no objective morality exists and Powers are freely programmable;
given that if the Powers *aren't* nice guys, then being the Power-seed
probably doesn't help; given that your chance of winning a competition
to personally become the Power-seed is far more tenuous than the chance
of cooperatively writing an AI; given that if we *don't* create Powers,
we're gonna get wiped out by a nanowar; given that uploading is advanced
drextech that comes after the creation of nanoweapons, while AI can be
run on IBM's Blue Gene; and given your admitted 50% chance that the
Other Side of Dawn is a really nice place to live and that everyone can
become Powers -

In what sense is AI *not* something to be excited about?

-- 
               sentience@pobox.com      Eliezer S. Yudkowsky
                  http://pobox.com/~sentience/beyond.html
Typing in Dvorak         Programming with Patterns  Writing in Gender-neutral
Voting for Libertarians  Heading for Singularity    There Is A Better Way

