From: John Marlow (johnmarrek@yahoo.com)
Date: Fri Jan 12 2001 - 19:19:05 MST
**See below.
--- John Clark <jonkc@worldnet.att.net> wrote:
> John Marlow <johnmarrek@yahoo.com> Wrote:
>
> >Handing all of the major hardware to a single,
> INHUMAN
> >party is insanity.
>
> It's unlikely anyone will do so, however when a
> super intelligent machine
> is ready to take over it will not need to ask
> anybody's permission, it will
> just do so; nobody could stop it. It might be nice
> to us, it might not.
**Yah. That was, in a way, my point. Whether we hand
over control or it's taken from us--the result is not
good for us.
>
> >Anything purely logical would be impossible.
>
> >Anything emotional is itself unpredictable and
> dangerous.
>
> Well certainly. But I don't remember anyone
> guaranteeing that the singularity
> would be fun.
>
> >Marlow's Paradox
>
> Perhaps I'm missing something but, where is the
> paradox?
**We can't do either? What--you want me to call it
Marlow's Dilemma?
;)
**The paradox, dilemma, inherent contradiction,
whatever, is this--many seem to feel AI control is
what we should or must do, yet there are only two
choices (emotional AI, dispassionate AI), and we're
damned with the one and damned with the other. No
rational choice is possible; both are suicidal.
john marlow
>
> John K Clark jonkc@att.net