From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Tue Jan 22 2002 - 10:07:46 MST
Harvey Newstrom wrote:
>
> Eliezer S. Yudkowsky wrote,
> > Maybe I'm wrong, but I really don't think so. I don't know how many
> > people would turn down immortality if you asked them the question
> > theoretically at a cocktail party, but ask them for real! - I really do
> > think that the refusers will be in a minority.
>
> Haven't people already done this? Most people *do* refuse immortality.
> When life extensionists appear on TV or radio, most of the audience thinks
> that there is something wrong with extending lifespan. Religious people
> think it's evil. Naturists think it's unnatural. Luddites think it's
> dangerous. Self-serving, atheistic, technology-loving, far-thinking
> pioneers are very rare.
>
> We have life-saving technology today that people don't use. People fight
> against using condoms, wearing helmets, wearing seatbelts, quitting smoking,
> eating health foods, etc. Even when we know stuff can kill us, most people
> avoid life extending techniques.
Eating health food costs money. Quitting smoking costs willpower. There
are costs associated with lifespan extension. I agree that a fifty-year
Slow Singularity model, such as the one you appear to be using, might kill off
a lot of the currently existing six billion, even if they could have survived by
"just" giving up cigarettes. (Have you heard of Nancy Reagan's new
program to combat homelessness? It's called: "Just get a house.")
Eventually, though, even under a Slow Singularity model (and immediately,
under mine) immortality... and other things, such as not needing to work
all day in a life-force-draining environment... will reduce to a very
simple action: Saying "Yes" to the question "Do you want to be
immortal?" If anyone has the ability to voluntarily go on living their
messed-up life while watching other people become truly happy, it will be
because they aren't watching - because the Refusers are on Earth, and the
heirs to humanity's real destiny are elsewhere, so as to eliminate
undesired (non-volitional) mental anguish on the part of those who
remain. There are not many good reasons for keeping a human mind any
longer than necessary, but the ability to stay in interaction with the
undecided for a couple of years, and persuade as many people as possible
to follow the path of transhumanity, might be one of those rare reasons.
-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence