Re: IA vs. AI was: longevity vs singularity

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sun Aug 01 1999 - 03:42:32 MDT


phil osborn wrote:
>
> (In fact, according to Conrad Schneiker - co-inventor of the
> scanning/tunneling array chip - Drexler also originally thought that
> nanotech itself could be kept under wraps and carefully developed under the
> wise and beneficent supervision of MIT technocrat types. It was only,
> according to Conrad, when Schneiker threatened to publish on nanotech
> himself that he managed to force Drexler's hand on this, which resulted
> in Engines of Creation, Foresight, etc.)

Where are you getting this from? I'd be extremely interested in any
biographical information about Drexler - he's the closest thing I have
to a role model.

> While there are inherent economic efficiencies in cooperation, and the
> universe does set upper limits on the size of a practical single-focus
> intelligence, there are even more challenging issues for a really smart AI.

No, there are no inherent economic efficiencies in cooperation. Any
time you have two independent minds that are remotely similar, there's
going to be duplication of function and thus duplication of processing.
Only silly human programmers with badly designed CPUs have to stoop to
that sort of thing.
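
To make the duplication point concrete, here's a toy sketch in Python
(purely illustrative - every name in it is hypothetical, nothing here
is from Phil's post): two independent minds that both need the same
subresult each pay for it separately, while one integrated mind
computes it once and shares it.

    from functools import lru_cache

    def expensive_subproblem(x):
        # Stand-in for any computation both minds happen to need.
        return sum(i * i for i in range(x))

    def independent_minds(task):
        # Two independent minds: each carries its own copy of the work.
        a = expensive_subproblem(task)  # computed here...
        b = expensive_subproblem(task)  # ...and recomputed here, redundantly
        return a, b

    @lru_cache(maxsize=None)
    def shared_subproblem(x):
        # One integrated system: the subresult is cached, so it is
        # computed exactly once no matter how many modules ask for it.
        return expensive_subproblem(x)

    def integrated_mind(task):
        first = shared_subproblem(task)   # computed
        second = shared_subproblem(task)  # cache hit: no duplicated work
        return first, second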

Even if the lightspeed limit requires local duplication of processing,
the resultant spectrum of local/global consciousness would almost
certainly NOT spew up a recognizable individual. Personalities do not
happen by coincidence; they are too complex to emerge as a side effect
of an optimized architecture. A deliberately designed system will not
give rise to individuals unless that is the designer's explicit intention.

> You yourself have undoubtedly run head-on into the problem of achieving
> "visibility." The farther out on the tail of the bell curve you are, the
> less likely you are to find a soul mate.

Or someone to take over if you get hit by a truck. But I suppose I echo
the general sentiment.

> It is only when you have a perceptual/emotional relationship with another
> consciousness in real-time that you can get a real mirror of your "soul."
> This is why people place such a high value on relationships.

I really think you're overestimating both the damage caused by lack of
relationships - nobody understands me, probably nobody ever will, big
deal - and the degree to which this damage is a necessary feature of
intelligence rather than a quirk of an evolved mind. Besides, any
mind-plus-environment system can easily be converted to a standalone
computer simulation. But the main thing is, I just don't see where the
cognitive problem is going to come from. Minds don't require external
environments - human minds, yes; generic minds, no. You can get
arbitrarily complex "sensory" inputs just through reflectivity.
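
For what the standalone-simulation claim amounts to, here's a minimal
closed-loop sketch (again illustrative - the names and dynamics are
mine, not any actual architecture): the "environment" is just another
function inside the program, and the mind can also take its own state
as a percept, which is what I mean by reflectivity.

    def environment_step(env_state, action):
        # The "external world" is just another function in the program.
        return (env_state * 31 + action) % 1_000_003

    def mind_step(mind_state, percept):
        # The mind's update can also consume its own state - "sensory"
        # input generated through reflectivity rather than from outside.
        reflective_input = hash((mind_state, percept))
        return reflective_input % 1_000_003

    def run(steps=10):
        env, mind = 1, 1
        for _ in range(steps):
            action = mind % 7            # the mind acts on the environment
            env = environment_step(env, action)
            mind = mind_step(mind, env)  # the percept comes from inside
        return mind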

-- 
           sentience@pobox.com          Eliezer S. Yudkowsky
        http://pobox.com/~sentience/tmol-faq/meaningoflife.html
Running on BeOS           Typing in Dvorak          Programming with Patterns
Voting for Libertarians   Heading for Singularity   There Is A Better Way

