From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Mon Nov 06 2000 - 23:21:47 MST
Darin Sunley wrote:
>
> Spike Jones wrote:
> >
> >Honest to god, life is fun, even those decades
> >that have 3s and 4s in the tens column. Tell us that the AI guys
> >are planning *something* as an escape mechanism, and I mean
> >something more convincing than Clarke's automatic cable cutter
> >on HAL's power cord. Let's see: packing explosives around
> >the SIAI mainframe rigged to explode should HWoMBeP's heart
> >stop...
No, no, no, *no*, NO!
> I just visualized a nanite swarm bleeding out of the computer console,
> enveloping all human beings in the room, and receding, leaving only an
elegant reliquary [a container for relics, as in medieval Catholic
churches] containing Eliezer's still-beating heart.
Pretty much, yes. There's a gruesome little story here if anyone wants to
write it.
Friendly AI, rule #7: "If the AI *wants* to violate Friendliness, you've
already lost."
If it gets to the point where explosives packed around the mainframe start to
look reassuring to the clueless, you are already screwed over so thoroughly
that a strategic nuke isn't going to help. Every non-nitwit safeguard happens
*before* a transhuman AI decides it hates you.
-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence