From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Wed Jun 27 2001 - 16:56:20 MDT
"natashavita@earthlink.net" wrote:
>
> "The blood of billions of people is on the hands of people who oppose these technologies--what if you're wrong?" asked Robert Bradbury (referring to missed opportunities to develop life-saving biotechnologies) during a Q&A, drawing a round of applause.
>
> "Get yourselves into therapy," psychiatrist Dr. Jerry Lemler, medical director of Alcor Life Extension Foundation, advised these groups."
Lemler's quote is not usable. It is psychiatry abuse.
Robert Bradbury's quote needs to be reworked before it's a good soundbite
(no offense, Robert). Frankly, I have the feeling the whole "blood of
billions" line will work far better *against* us than for us. It's a poor
blaster that doesn't point both ways, as Salvor Hardin
once said, and right now people are still very much in the habit of
thinking of technology as a possible benefit and a possible risk, with
suppression being thought of as representing the status quo. You have to
*be* an Extropian before tech suppression sounds to you like a risk, or
before you think of life extension as an individual right rather than a
weirdling possibility.
A good quote is:
"Suppressing technologies can *kill* people."
If you want to use the phrase "blood of billions", it has to be in a
paragraph, and you have to work up to it, i.e. SIAI might say to an
audience: "We are trying to save the world, and Bill Joy is trying to
convince you to shut us down, and if he does, the blood of billions is on
his hands. And Bill Joy does not seem to take that risk seriously."
Furthermore, you need to be able to convey plausibly to the audience
that you have good reason to believe you are trying to save the world, or
that you have good reason to believe you can make a whole lot of people
immortal, or talking about the blood of billions just makes you look
insane.
By contrast, talking about technology suppression as deadly - conveying
the image of a loaded gun with a finger on the trigger - is something that
I think people will be able to get a lot faster.
"Technology suppression can *kill* people" is the short quote. A more
extended argument might go something like this: "Technology suppression
can kill people. It's as deadly as a loaded gun. If you suppress
biotechnology that's leading up to a cancer cure, then you have killed
people who are dying of cancer. If you suppress the 'golden rice'
technology, the vitamin-A enhanced rice that prevents children in
developing countries from going blind due to malnutrition, then you have
blinded those children as surely as if you personally burned their eyes
out. The Luddites don't think about that. They talk about the dangers of
technology but they never think about the dangers of what *they're*
doing. They talk about risk assessment but that's just a tactic to them.
They don't understand risks at all. They're basically people who don't
like technology, they think it's yucky, and so they try to suppress the
whole thing because *they* don't like it. They don't think about the
other people that might get hurt. The Luddites are not cautious people.
They want *other* people to do risk assessments. The Luddites themselves
are completely careless. They're like people waving a loaded gun around."
**
Technology suppression can *kill* people.
And the Luddites are waving a loaded gun around.
**
-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence