From: Brian Atkins (brian@posthuman.com)
Date: Tue Jan 08 2002 - 02:11:35 MST
Spudboy100@aol.com wrote:
>
> It seems that most people on this list believe that the Singularity will arrive
> in time to save them. That medical advances will assuredly keep them in fine
> fettle, so that when the technology is there, they will merely upload, à la
> Permutation City & Diaspora.
>
> What if this doesn't occur in time? What if both the hardware and the
> software to accomplish this sea-change in life on old planet earth, doth take
> its own sweet time?
> Will most of the Extropians hedge their bets and have themselves Alcor-ed?
>
I've been converted into a firm believer in eliminating unnecessary
risks to my long term survival. Especially relatively easy things to
do like signing up with Alcor. So yes, I am hedged. The two things you
are talking about are almost completely separate issues, and any
combo Extropian/Singularitarian would have to have a very interesting
reason not to sign up for cryonics. No matter how soon you think the
Singularity is arriving, you still have to anticipate accidents, diseases,
etc. You'd look pretty darn silly dying irretrievably a year before the
big event.
<rant mode on, aimed at no one in particular>
What's interesting is that the more effort I put into working on this
stuff, and the more I see achieved, and the closer we get, the more real
the end result feels to me. And the higher my level of dread at the idea
of somehow missing out on it all. If you're not feeling much dread when
you consider doing something unnecessarily risky, then you should be aware
that you probably don't have a very good appreciation of what you've got to
lose both right now and in the future. This kind of complacency generally
leads to making stupid mistakes that are quite embarrassing in hindsight
(if you should be so lucky as to still be alive). Unfortunately this is
one of those lessons that many people including myself have to run into
headfirst before finally "getting it". Sometimes you don't see the risk
or take it seriously until it is too late.
Our society (as demonstrated by the ease of the 9/11 attack) is extremely
complacent, and still very much so even now. Not only is this affecting
our security in general, but also it is a huge factor in limiting the
audience for transhumanism. No one who is content with their life and
perceived future will be very interested. Paradoxically, the benefits
that modern technology and prosperity bring to people also make them
blind to what is missing or what can be improved. Complacency
probably plays a role in taking down whole societies like the Roman
Empire. Don't let the same thing happen to us: go create some discontent
now! :-)
<rant mode off>
For me, the whole Singularitarianism thing (note: big difference between
this and just sitting around waiting and watching) is just
another way to reduce a reducible risk. The bonus is it does so for the
whole planet instead of just me. We've even made it simpler to accelerate
the Singularity than signing up with Alcor- just visit our donations page
for instructions...
-- Brian Atkins Singularity Institute for Artificial Intelligence http://www.singinst.org/
This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 13:37:33 MST