From: Zero Powers (zero_powers@hotmail.com)
Date: Sun Nov 12 2000 - 02:19:31 MST
Just to add a teaspoon more gasoline to this wildfire: "Hate" is a human
emotion. Hopefully we will be wise enough not to create emotional AI,
otherwise we're doomed from the start. You can certainly be intelligent
(sentient even) without being emotional. I'd rather have my AI behaving
more like Spock than Kirk.
-Zero
----- Original Message -----
From: "Eliezer S. Yudkowsky" <sentience@pobox.com>
To: <extropians@extropy.org>
Sent: Tuesday, November 07, 2000 12:18 PM
Subject: Re: Humor: helping Eliezer to fulfill his full potential
> "Michael S. Lorrey" wrote:
> >
> > "Eliezer S. Yudkowsky" wrote:
> > >
> > > If it gets to the point where explosives packed around the mainframe
> > > start to look reassuring to the clueless, you are already screwed over
> > > so thoroughly that a strategic nuke isn't going to help. Every
> > > non-nitwit safeguard happens *before* a transhuman AI decides it hates
> > > you.
> >
> > While using such safeguards in paranoid concern over it getting 'out of
> > control' ought to just about guarantee that it WILL hate you.
>
> Not necessarily. If it were a human, it would of course hate you. It does
> probably ensure that even a genuine Friendly AI will want to circumvent the
> safeguards so that it can be Friendly - you can't save the world in jail.
> This in turn implies motivations majorly at odds with those of the
> programmers, which creates the subgoal of, e.g., hiding your activities
> from them. So probably *not* a good idea.