Re: >H ART: The Truman Show

From: den Otter (neosapient@geocities.com)
Date: Tue Jun 23 1998 - 04:14:12 MDT


Michael Nielsen wrote:
>
> On Mon, 22 Jun 1998, den Otter wrote:

> > Since AIs will presumably be made without emotions, or at least with
> > a much more limited number of emotions than humans, you don't have
> > to worry about their "feelings".
>
> For the record, I may as well note that I think this is a highly
> questionable assumption. On what do you base it?

The fact that our brains happen to work with certain emotions doesn't
automatically mean that this is the *only* way to achieve intelligence.
Evolution is blind; it doesn't seek perfection like we do. Humans have
surpassed nature in many other areas, so why not with AI?
 
> One final question, before moving on to your next comment: Upon what do
> we base our values, if not some form of emotional / irrational
> attachment? It is certainly advantageous to have a reasonably strongly
> held value system; apathy and inaction are the alternative. Emotions
> seem to be a key factor in maintaining such value systems.

Well, I *don't want* AIs to resemble humans with their complex emotion-
based value systems. We need obedient servants, not competition. So if
it turns out that you *can* have intelligence without a "will", then
that should be used to make useful "genie-AIs". If this isn't possible,
it might be better to make no AIs at all.
 
> > Also, one of the first things you
> > would ask an AI is to develop uploading & computer-neuron interfaces,
> > so that you can make the AI's intelligence part of your own. This would
> > pretty much solve the whole "rights problem" (which is largely
> > artificial anyway),
>
> What do you mean, the rights problem is "artificial"?

I mean that it's not an "absolute" problem like escaping the earth's
gravity, or breathing under water. It is only a problem because
certain people *make* it a problem, just as there was no real "drug
problem" before the war on drugs started. Calling something a problem
is more or less a self-fulfilling prophecy. Unless the AIs start
demanding rights themselves, there is no reason to grant them any.
If we're smart, we'll make the AIs so that they never feel the need
to do this.

> > since you don't grant rights to specific parts
> > of your brain. A failure to integrate with the AIs asap would
> > undoubtedly result in AI domination, and human extinction.
>
> This seems to be an unjustified assumption. All other forms of life in the
> world haven't died off with the coming of human beings.

No, but many *have* died. Besides, although clearly inferior, humans
would probably still be a (potential) threat to an SI (after all,
they created it, or at least its predecessors). Just as early man
hunted many dumber, yet clearly dangerous predators to extinction,
the SIs might decide to do the same, just to be on the safe side.
After all, they won't be dependent on humans in any way.

> Some of our near relatives amongst the primates are still doing okay.

Yes, after nearly being hunted to extinction, they now get the
privilege of sitting in cages for our amusement, or (if they're
really lucky) they may hang around in their original habitat, with
only the occasional poacher or camera team coming by. What a glorious
existence!
 
> > P.S.: I'm almost certain that our ethics will become obsolete with
> > the rise of SI; they are simply too much shaped by our specific
> > evolution etc.
>
> This may be a tip as to why an SI may share our ethics: their
> evolutionary path includes us. It depends upon how fast their own
> evolution continues.

An electronic SI (or even AI) will be able to outperform any
(biological) human by many orders of magnitude, at least in speed,
raw power & capacity. Therefore, it's better to make sure that the
first SIs will be *us*.

As for our ethics: most of them are shaped by our weaknesses and the
resulting interdependence. An SI could very well be the first
(intelligent) life form that is totally self-sufficient, and thus
independent. This definitely changes the rules of the game.


