From: Barbara Lamar (shabrika@juno.com)
Date: Tue Sep 05 2000 - 17:58:19 MDT
On Tue, 5 Sep 2000 13:41:36 -0700 "Jason Joel Thompson"
<jasonjthompson@home.com> writes:
> Is it possible that in order for AI to be truly successful, we're
> going to have to give it the keys to the car?
This strikes me as one of the most important questions to ask, not
necessarily about AI in general, but about AI at the point where it
becomes more intelligent than the most intelligent human.
There's also the following related question: If humans create brilliant
"offspring" who can think circles around the human mind, would humans
WANT to impose their own intellectually inferior decisions on these
brilliant children of theirs?
Many people, probably the majority, want to think of our species as
continuing for many generations into the future: if not exactly as we
are now, then as enhanced humans, still recognizably human. Even those
who fully expect to die within what's now considered a normal lifespan
want to think of the species as continuing, although logically I can't
see why it would matter one way or the other to someone who's dead
(unless they expected to reincarnate in another human body).
I myself can't see any reason for the human species to continue in
anything like its present form when (I should say IF, recognizing the
uncertain and precarious nature of time travel [24 hours into the
future each day]) SI becomes a reality. Is this sad? I'm a little
surprised to note that I don't find it particularly sad; it's more
exciting than sad.
I'd be interested to know how others feel about the prospect of being
among the last members of the human species.
Barbara