Re: Is cryopreservation a solution?

From: Geoff Smith (geoffs@unixg.ubc.ca)
Date: Sat Sep 20 1997 - 17:20:15 MDT


On Fri, 19 Sep 1997, Eliezer S. Yudkowsky wrote:

> Geoff Smith wrote:
> >
> > On Wed, 17 Sep 1997, Joao Pedro wrote:
> >
> > > It's my only platform for intelligence (I don't much like the word
> > > "consciousness"). I think there is intelligent life in the universe
> > > besides ours, and I think in the future we will build computers much
> > > more intelligent than us. For me, as an individual, my only platform
> > > is my brain.
> >
> > You're going to let computers become more intelligent than you? Haven't
> > you seen Terminator 2 ?!?
> >
> > When computers are more intelligent than you, your only usefulness will
> > be in a zoo or museum.
>
> Wrong.

Darn, I hate being wrong ;)
 
> > I don't plan on being obsolete.
>
> Tough luck.

bummer ;( There goes my plan... I guess I should have bought a crystal
ball!
 
> > geoff.
>
> You need to visit the Singularity website in my .signature below.

Well, I'm lacking web access right now, so why don't you explain?

How about I give you my logic, and then you can address my points
specifically.

-If superior computer power/neural nets appear, and I can integrate them
with my brain, I will.

-If I buy a computer to do menial tasks for me, and I can incorporate it as
a servile center of my brain, I will.

-I will not buy a computer that does more than menial tasks (including
pondering philosophy, managing people, etc.) unless it is an incorporated
or entirely servile part of my brain.

-I will do whatever I have to in order to stay competitive with transhuman
and other human-built and extraterrestrial intelligent agents who intend
to make my functions obsolete (by their own personal evolution).

I'm more intelligent than a computer right now (if you were really nasty,
you could argue this point), so why exactly will that change if I adhere
diligently to the points above?

geoff.



This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 14:44:56 MST