From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sun Oct 01 2000 - 11:04:32 MDT
Eugene Leitl wrote:
>
> But if I make a mistake, we've got a bunch of dead researchers who
> wouldn't listen. If they make a mistake, we've got millions of
> lightyears of planets full of dead people. (Aliens are also people, of
> course). Dunno, these odds sound good to me.
Eugene, if you make a mistake, you've got a planet full of people who were
wiped out by grey goo. As for the scenario where millions of lightyears of
people get wiped out - if it were possible for one little planet to make that
kind of mistake, and there are that many aliens in range, someone *else* would
have already done it to *us*. It's just the Earth that's at stake.
And above all, if you make a mistake - you'll have made a mistake that
involved deliberate murder.
According to your theories, we'd have a world where everyone is free to murder
anyone who works on a technology that someone doesn't like. According to you,
the people destroying GM crops are not doing anything immoral; if they have a
flaw, it's that they aren't bombing Monsanto's corporate headquarters. If you
work on an upload project, then anyone who distrusts uploads and thinks AI is
safer has a perfect right to stop the uploading project by killing as many
people as necessary. *I* would have the right to kill *you*, this instant,
because you might slow down AI in favor of nanotechnology.
Do you really think that kind of world, that systemic process for dealing with
new technologies, would yield better results than the nonviolent version?
Even Calvin and Hobbes know better than that.
C: "I don't believe in ethics any more. As far as I am concerned, the
end justifies the means. Get what you can while the getting's good –
that's what I say. It's a dog-eat-dog world, so I'll do whatever I
have to do, and let others argue about whether it's right or not."
<Hobbes pushes Calvin into a mud hole.>
C: "Why did you do THAT?"
H: "You were in my way. Now you're not. The end justifies the means."
C: "I didn't mean for everyone, you dolt! Just ME!
-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence