Re: globalization of fear

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Wed Aug 14 2002 - 13:40:26 MDT


Eliezer S. Yudkowsky wrote:
> Brian D Williams wrote:
> >
> > So is a nuked Manhattan.
>
> Yes. For that reason, I will not nuke Manhattan.

Actually, let me rephrase that: I wouldn't nuke Manhattan except to
prevent something even worse, i.e., a 100% lethal engineered virus
getting loose, out-of-control replicating nanotechnology, or some other
existential risk of a class that is most effectively dealt with via
nuclear weapons. There are things much worse than nuclear weapons.
There are even things worse than all-out nuclear war. But in the simple
ordinary world of human politics, I agree that nuking Manhattan can be
regarded as "absolutely unacceptable".

It will probably happen anyway unless the Singularity happens first. I
don't see Homeland Security being able to stop it forever, although I
suppose an ultrafast infrahuman AI might be able to.

Incidentally, although I bear no ill will whatsoever to any Americans
who happen to be of Arab descent, I would like to offer the friendly
advice that if I thought there was any significant chance of Manhattan
being nuked by Jewish terrorists, I would get the hell out of the US
now. I don't consider myself Jewish in any way - if I converted to
Catholicism I wouldn't be a Jewish Catholic, and I see no reason why I
should be regarded as a Jewish atheist - but I also know damn well that
the enraged nation, looking around for someone to hit, wouldn't care.

-- 
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence


This archive was generated by hypermail 2.1.5 : Sat Nov 02 2002 - 09:16:06 MST