Re: The AI revolution (Was: Re: >H ART: The Truman Show)

From: den Otter (neosapient@geocities.com)
Date: Tue Jun 23 1998 - 05:13:07 MDT


Daniel Fabulich wrote:
>
> On Tue, 23 Jun 1998, den Otter wrote:
>
> > Emotions are important for us because they help us to survive. AIs
> > don't need to fend for themselves in a difficult environment; they
> > get all the energy, protection & input they need from humans and
> > other machines. All they have to do is solve puzzles (of biology,
> > programming etc). If you program it with an "urge" to solve puzzles
> > (just like your PC has an "urge" to execute your typed orders), it
> > will work just fine. No (other) "emotions" are needed, imo. It's
> > like with the birds and planes: both can fly, only their methods
> > are different, and both have their specialties.
>
> I think you're approaching this question from a very different direction
> from many of those who subscribe to this list.

The majority isn't always right ;-)

> Many of us suspect that if
> and when we create an AI, it will not be any easier to "program" than
> people are; that on some level the architecture behind AI will
> be sufficiently similar to that within our own brains that we will not be
> able to selectively remove anything fundamental from this mix, including
> emotions, survival instincts, irrationalities and quirks.

That's just one way to develop AI. There might very well be others. In
any case, it wouldn't be very smart to just make an upgraded human
(with an attitude). Genie AIs are what we want: mindless servants.
 
> Unlike flight, intelligence may well require something as complicated as a
> bird or a person in order to work. If so, there's no reason to assume
> that we could strip emotions out of intelligence like we could take the
> feathers out of flight.

There is also no reason to assume that you *can't* strip (most)
emotions out of intelligence and still have something useful.
At this point we're all just guessing, of course, but I strongly
suspect that the idea that you can't have emotionless AI,
intelligence without will, is mainly based on a mystification
of the human mind. Our emotions aren't *that* special; they're
just tools for a certain environment, one mostly different from
an AI's. The goal of an AI is not survival and procreation (at
least it should never be), but solving problems for humans.
Fight/flight/love/hate/distrust/hunger for power/sadness etc.
would only get in the way of performance, and make the AI
unreliable and potentially dangerous.
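
To make this concrete, here's a rough sketch (Python-flavored,
purely illustrative; every name in it is made up for this post)
of the difference between a "genie" architecture and an
"upgraded human" one:

  # Illustrative sketch only -- all names invented for this post.
  # A "genie" AI: one fixed drive (solve whatever it is handed),
  # no survival instinct, no competing emotional subsystems.

  class GenieAI:
      def __init__(self, solver):
          # The only "urge": apply the solver to a given problem.
          self.solver = solver

      def serve(self, problem):
          # The whole control loop: take a problem, return a
          # solution. No fight/flight check, no self-preservation,
          # no status drive.
          return self.solver(problem)

  # e.g. genie = GenieAI(solver=some_theorem_prover)
  #      answer = genie.serve(problem)
  #
  # An "upgraded human" agent, by contrast, would interleave the
  # task with competing drives -- flee on threat, compete on
  # rivalry, rest on fatigue -- which is exactly the unreliability
  # I want to avoid.

The point isn't that real AI will be this simple, only that
nothing forces us to bolt the extra drives on.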

When people (scientists, for example) deal with a difficult
problem, they tend to focus fully on it, blocking out cluttering
emotions. That's what you want the AI to be: a focused thinker,
not distracted by the urge to mate or whatever.


