From: John Marlow (johnmarrek@yahoo.com)
Date: Fri Jan 12 2001 - 19:26:38 MST
Well, here's the deal with that, in my view:
relinquishment of nanotech is absurd; the benefits are
too great and therefore many will pursue it. One can
argue that it will destroy the world, and one can
argue that we'll control it. With an AI that's smarter
than we are, and actually sentient, the credits
disappear; there is nothing but debits. Whatever
benefits it brings are nullified by the certainty
that, sooner or later, it will be beyond our control.
Sheep do not labor to build better wolves.
john marlow
--- John Clark <jonkc@worldnet.att.net> wrote:
> John Marlow <johnmarrek@yahoo.com> Wrote:
>
> >I suggest to you that the entire effort to
> create and
> >empower a nanny AI can end ONLY in disaster.
>
> Might be true, might not be true, it doesn't matter
> because one thing is certain, if Eliezer
> doesn't make an AI somebody else certainly will.
> It's only a matter of time. Personally I'd
> rather Eliezer get there first, but your mileage may
> vary.
>
> John K Clark jonkc@att.net
This archive was generated by hypermail 2.1.5 : Sat Nov 02 2002 - 08:04:49 MST