From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Wed Jul 10 2002 - 01:26:51 MDT
Samantha Atkins wrote:
> spike66 wrote:
>>
>> The reason the other dangers might be more effective is that
>> a great majority of capital holders may not be convinced that
>> the singularity poses a real risk, or may believe that even if
>> a singularity is possible, it should not be avoided. Yet we
>> can all see that the danger of nuclear, bio and even nanotech
>> warfare and terrorism is ever increasing. (The Palestinians
>> and Israelis are trying to start the next world war, for instance.
>> They have been successful in drawing even peaceful-minded
>> extropians and transhumanists into their conflict, as many here
>> have chosen one side or the other. Don't give in to that.)
>
> You must be joking. Israel has acted in ways that we and the UN would
> jump all over if it were any other country (other than ourselves). But
> that conflict is incapable of starting a world war; at the very least,
> it is highly unlikely to.
Game, set, and match to Spike.
--
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence