From: Eliezer Yudkowsky (sentience@pobox.com)
Date: Sat Feb 08 1997 - 21:50:40 MST
[Max M. Rasmussen:]
> >BTW I don't think that the space program is very important right now. It
> >really ought to wait some decades until the technology is more refined.
> >There's a huge conscience-implosion going on right now, with most of the
> >world focusing on the internet, multimedia, communication, simulators, CAD,
> >etc. All of which will probably lead to some kind of singularity. So
> >sometime later we can probably fly to space in small private spacecraft
> >powered by Drexler-like diamond engines. Why not wait and let the
> >explosion happen then?
[E. Shaun Russell:]
> Because it might not happen. Although I agree with your post in
> spirit, it still has overtones of singularity-based reliance (which in my
> opinion is of the same calibre as a deus-based reliance). While I agree
> that there *is* a huge world focus upon computers and global-village
> communication, I still think that there should be attempts at maximizing
> our [society's] current potential in all the other areas...including space
> travel. Nothing beats a sure thing, and the only way to make things sure
> is to *kineticize* and maximize their potential.
Speaking as someone who thinks that the Singularity is unknowable and
that extrapolating beyond it is silly, let me also say that I explicitly
agree with E. Shaun on this one - simply on practical grounds. Our
attempts at space travel, however futile in direct terms, did a HELL of
a lot to advance the Singularity - at a guess, by a decade at least.
None of that would have happened if we'd waited for the Singularity to
save us.
I speculated on my Singularity site about how Powers might solve math
problems, for the simple reason that the speculation might prove
necessary for building first-stage AIs. Any science may, by pure
serendipity, advance the Singularity - even if, or especially if, it is
directed at goals so astronomical as to practically require a
Singularity to power it.
(Also, I don't believe the ends justify the means. Strictly from a
pragmatic viewpoint, humans aren't built to be pragmatists. When we
try, things go wrong, others stop trusting us, and we get our souls all
icky. So don't rely on the Singularity for moral justification.)
--
sentience@pobox.com    Eliezer S. Yudkowsky
http://tezcat.com/~eliezer/singularity.html
http://tezcat.com/~eliezer/algernon.html
Disclaimer: Unless otherwise specified, I'm not telling you
everything I think I know.
This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 14:44:09 MST