Re: TERRORISM: Is genocide the logical solution?

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Mon Sep 17 2001 - 10:52:05 MDT


Mark Walker wrote:
>
> Robert has taken a lot of grief for this post including being charged with
> the worst crime: engaging in "pro-entropy" (Anders) activities. With some
> reluctance I feel compelled to defend the logic of what Robert is saying
> because I think he raises an important question, namely: what are we willing
> to sacrifice now in terms of lives to advance the effects of the
> singularity?

You can be willing to sacrifice your OWN life. You don't get to decide
whether to sacrifice SOMEONE ELSE'S life.

There are known exceptions to this rule. I talked about one of them when
I said that I would, in theory, back a ground war in Afghanistan and Iraq
to remove their governments from power. (I can't back the current war, in
practice, because I expect our government to mess it up and do something
worse than useless, like "retaliatory air strikes".) But that isn't the
logic of Tit for Tat or even saving the maximum number of equally valued
lives; it's based on my belief that Earth's survival will be directly
threatened by the existence of terrorist groups, and that the transition
from "a world with state-sponsored terrorism" to "a world with no
state-sponsored terrorism and slightly weaker terrorists" is thus worth
fighting for even if a large number of casualties are involved. And this
holds whether the casualties are American civilians or Afghan civilians.

Tyrannical governments cannot be allowed to hide behind civilian shields;
it would give them infinite license and more people would be hurt in the
long run. If the US government were to go the way of Nazi Germany, then I
would understand the absolute necessity for foreign governments to attack
the US even at the cost of civilian lives, even including my own. (And if
I didn't agree with the need, they would have to attack anyway.) In an
entangled world it is possible for moral principles to directly conflict,
and so we cannot be morally comfortable this side of the Singularity, but
it's important to remember that all such conflicts are still bugs in the
Universe, and must be minimized rather than embraced.

-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
