From: den Otter (neosapient@geocities.com)
Date: Thu Jul 22 1999 - 08:07:05 MDT
----------
> From: Spike Jones <spike66@ibm.net>
> Eliezer, why, what's the big hurry with this singularity thing?
> Aren't you a little curious to see how far we humans could
> go without *that* coming along and doing...who knows what?
> Maybe the new software really won't be interested in keeping
> us carbon units around. I *like* us. spike
Well, Eliezer is right about the Singularity being virtually
inevitable (the only way to stop it is by destroying or at
least massively retarding civilization, which is of course
unacceptable because we need technological progress
to survive), so we might as well accept it. The real issue
is *how* we should cause a Singularity. Eliezer seems to
favor the AI approach (create a seed superintelligence and
hope that it will be benevolent towards humans instead of
using our atoms for something more useful), which is
IMHO reckless to the point of being suicidal.
A much better, though still far from ideal, way would be
to focus on human uploading and, once the technology
is operational, upload everyone involved in the project
simultaneously. That way at least some people would
have a fighting chance to become posthuman. In
fact, I'm very surprised that so many otherwise fiercely
individualistic/libertarian people are so eager to throw
themselves at the mercy of some machine. It doesn't
compute.
IMHO, the "mission" of transhumanism should be to
develop uploading (and general intelligence amplification)
technology asap, while at the same time trying to curb the
development of "conscious" AI and other technologies
which may threaten our existence. We don't have to hold
them back forever; we just need to buy us some time to
level the playing field.
Let's not forget that (technological) progress is just a tool;
the goal is our continued survival and well-being.