Re: Humor: helping Eliezer to fulfill his full potential

From: Michael S. Lorrey (retroman@turbont.net)
Date: Tue Nov 07 2000 - 11:52:01 MST


"Eliezer S. Yudkowsky" wrote:
>
> Darin Sunley wrote:
> >
> > I just visualized a nanite swarm bleeding out of the computer console,
> > enveloping all human beings in the room, and receding, leaving only an
> > elegant reliquary [sp?] [container for relics, medieval Catholic church
> > thing] containing Eliezer's still-beating heart.
>
> Pretty much, yes. There's a gruesome little story here if anyone wants to
> write it.
>
> Friendly AI, rule #7: "If the AI *wants* to violate Friendliness, you've
> already lost."
>
> If it gets to the point where explosives packed around the mainframe start to
> look reassuring to the clueless, you are already screwed over so thoroughly
> that a strategic nuke isn't going to help. Every non-nitwit safeguard happens
> *before* a transhuman AI decides it hates you.

While using such safeguards out of paranoid concern over it getting 'out of control'
ought to just about guarantee that it WILL hate you.


