Re: Cryonics Thoughts

From: Keith M. Elis (hagbard@ix.netcom.com)
Date: Wed Dec 02 1998 - 10:48:08 MST


Eliezer S. Yudkowsky wrote:

> <OFFENSE="Objectivists">
> The Singularity requires your services. You can't freeze yourself and you
> can't commit suicide. You're just going to have to live through every minute
> of it, like it or not, because that is the logically correct thing to do.
> </OFFENSE>

Fortunately, I am not an Objectivist. :)

I read /Staring.../ and I understand your reasoning, but there *may* be
a difference between the logically correct answer and the correct
answer. What makes you think that this gap will no longer exist with
superintelligence? You do think this, no? If not, then how can you
possibly suggest that the Interim Meaning of Life is 'get to the
Singularity ASAP'? If you don't think that the uncertain gap between
what is logically correct and what is correct can *ever* be closed, then
what good is a singularity? If you do think it can be closed, then what
makes you think so?

>
> If all the smart people have themselves frozen, who exactly is going to
> develop nanotechnology?

To be totally pedantic, not all the smart people will freeze themselves,
because smart people disagree on the prospects of cryonics. There's just
enough uncertainty to keep everyone from moving from the a priori 'life
is good' to the a posteriori, empirical 'cryonics is good.' I am
suggesting that one who has already taken the first step of faith
(making 'cryonics is good' an a priori 'logical truth') might see
freezing oneself alive as a viable option. Then again, one might not.

Keith
