Re: Paradox--was Re: Active shields, was Re: Criticism depth, was Re: Homework, Nuke, etc..

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Thu Jan 11 2001 - 23:20:55 MST


John Marlow wrote:
>
> Yeah but the point is, all power is not concentrated
> in a single individual. A leader who goes berserk can
> be stopped or killed. You hand all weapons of mass
> destruction to an orbiting AI, you got problems.

A transhuman AI doesn't *need* weapons of mass destruction. So we may as
well minimize our problems by keeping WMDs out of the hands of humans.

Obviously, I'm not advocating handing all the WMDs over to an "orbiting
AI" of human-equivalent or lesser capacity; *that* would be stupid.

-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
