From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Thu Aug 21 1997 - 22:28:51 MDT
Darren Reynolds wrote:
>
> At 21:52 13/08/97 -0500, Eliezer S. Yudkowsky wrote:
> >Emotions aren't mysterious forces. They can't hide from us. Turing machines
> >are deterministic and wholly causal; any force operating on Deep Thought
> >would be explicable as the sum of its parts.
>
> Obvious question: why do you think we are any different?
We aren't. Our emotions aren't hidden - it's obvious to anyone, internally
and externally, that we have these things called "emotions". You were
proposing that Deep Thought had emotions and we didn't know about it. My
response was that emotions are not subjective philosophical constructs that we
can assign or deny at our whim; they are cognitive processes that objectively
exist - in all known cases, obviously so. From the medial forebrain bundle,
to the amygdala, to the mammillary bodies... the entire limbic system... there
are numerous parts of the brain specialized for handling emotions. Emotions
don't just happen, they are evolutionarily designed to serve specific
functions. Insofar as Deep Thought doesn't evolve, wasn't designed to feel,
and contains no hidden processes - which assumptions I debated in the previous
letter - it is not likely to have emotions remotely analogous to those which
take up so much explicit processing power in our own minds.
> >> When Deep Blue won the series recently, I wondered whether it felt a most
> >> primitive sense of achievement, in the same way that our own primitive
> >> ancestors might have felt a sense of achievement if they had lifted their
> >> first stone, and successfully hurled it at a passing antelope. Yes, the
> >> thing is just a mathematical algorithm. But so, apparently, are we.
> >
> >It didn't. Deep Blue may have emergent chess stratagems from the complex
> >physical-process-level chess exploration algorithms. I don't think there was
> >any form of feedback for successful and unsuccessful moves. In training
> >mode ...
>
> Yeah, but this misses the point. There IS feedback in the form of
> selection. If Deep Blue makes bad moves, IBM will trash it. There doesn't
> have to be anything in the code. It's the same feedback (and the only
> feedback) that our own evolutionary path had until very recently.
I am not interested in debating what evolution "really" is. The evolution you
name had no effect on Deep Blue's design. Deep Blue was designed by a bunch
of humans; it was not designed by selection of any type. The weightings may
have been evolutionary; the architecture was not.
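The distinction I'm drawing can be made concrete with a small sketch. Nothing below is Deep Blue's actual code; it's a hypothetical toy in which the *architecture* (a fixed linear evaluation over piece counts, written by a human) stays constant while the *weightings* are tuned by blind variation and selection:

```python
import random

# Toy illustration: the architecture (a linear sum of weighted piece
# counts) is human-designed and never changes; only the numeric
# weightings are tuned by mutation-and-selection hill climbing.

PIECES = ["pawn", "knight", "bishop", "rook", "queen"]

def evaluate(weights, piece_counts):
    """Fixed, designed architecture: a linear sum of weighted piece counts."""
    return sum(w * piece_counts[p] for p, w in zip(PIECES, weights))

def fitness(weights):
    """Stand-in for 'plays good chess': closeness to the conventional
    piece values (1, 3, 3, 5, 9). Purely illustrative."""
    target = [1, 3, 3, 5, 9]
    return -sum((w - t) ** 2 for w, t in zip(weights, target))

def tune(generations=2000, seed=0):
    rng = random.Random(seed)
    weights = [rng.uniform(0, 10) for _ in PIECES]
    for _ in range(generations):
        # Mutate one weight; keep the change only if fitness improves.
        candidate = list(weights)
        i = rng.randrange(len(candidate))
        candidate[i] += rng.gauss(0, 0.5)
        if fitness(candidate) > fitness(weights):
            weights = candidate
    return weights

if __name__ == "__main__":
    print([round(w, 1) for w in tune()])
```

The point: you could run `tune` a million times and the evaluation function would still be a linear sum over piece counts, because that part was designed, not evolved.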
> >Except from a functionalist perspective, there wouldn't be much
> >internal difference between "pleasure" and "pain" - just negative and
> >positive numbers.
>
> Right. Whereas in humans, the internal difference between pleasure and pain
> is ... er, what is the internal difference in humans exactly?
Pleasure is handled by the medial forebrain bundle... though this is only one
kind of pleasure; "satiation" pleasure is handled by an entirely different
area. I don't recall where the pain center(s) are, but they are *elsewhere*.
So in response, pleasure and pain are handled by entirely different sections
of the brain, feel entirely different from a subjective standpoint, have
wholly different effects, and in general are totally different subsystems.
Ask not how they differ; ask how they are alike.
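The functionalist point about the machine's side of the comparison can be shown in a few lines. In a hypothetical minimax-style evaluator (not Deep Blue's actual code), "good for me" and "bad for me" are not separate subsystems at all; they are one number with opposite signs:

```python
# In a toy chess evaluator, the "pain" of a bad position is literally the
# negation of the "pleasure" of a good one - one shared number, not two
# distinct subsystems as in the human brain.

def material_score(white_pieces, black_pieces):
    """Material balance from White's perspective; Black's is the negation."""
    values = {"pawn": 1, "knight": 3, "bishop": 3, "rook": 5, "queen": 9}
    white = sum(values[p] * n for p, n in white_pieces.items())
    black = sum(values[p] * n for p, n in black_pieces.items())
    return white - black

white = {"pawn": 4, "knight": 1, "bishop": 0, "rook": 1, "queen": 0}
black = {"pawn": 3, "knight": 0, "bishop": 1, "rook": 1, "queen": 0}

score_for_white = material_score(white, black)
score_for_black = -score_for_white  # same quantity, sign flipped
```

Contrast that with the brain, where the two are handled by different structures with different subjective feels and different effects.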
> >Emotions did indeed evolve because they are evolutionary advantages.
> >Although Deep Blue's weightings for piece value may be "evolutionary" in
> >some sense, I don't think the term can really apply in the sense you use
> >it. Linear numbers aren't complex enough to "evolve"; evolving sequences
> >of instructions, as in the TIERRA environment, are another matter.
>
> Again, I think that this misses the point. I argue that if humans alter a
> system in a way which leads to greater levels of reproduction for that
> system, then the system has evolved. The agent causing the evolution is
> irrelevant. The code doesn't have to learn from its mistakes in chess.
> There merely has to be an environment which prefers good moves, with
> penalties and rewards that affect reproductive success.
I disagree. Programs designed by humans tend to work one way. Programs
designed by evolutionary computation work a completely different way. This is
what makes the distinction between "evolved" and "designed" programs useful.
The narrowness and genuine distinguishing power of the term is what makes
evolutionary computation a science. What practical purpose is served by
broadening the term as you suggest?
> Precisely. You are interested only in your own emotions. That's all you can
> be. In fact, you ought (I don't mean this nastily at all, but
> scientifically) to be interested in only yourself, because you can't be
> sure that anything else exists.
I couldn't fail to disagree with you less! I can study emotions through
experiment. I can study the neural architectures of the areas of the brain
which subserve emotions. I can study subjective cost-benefit ratios, and how
our causal handling of them differs from Deep Blue's linear branching.
Emotions are REAL and objective, not subjective.
--
sentience@pobox.com    Eliezer S. Yudkowsky
http://tezcat.com/~eliezer/singularity.html
http://tezcat.com/~eliezer/algernon.html
Disclaimer: Unless otherwise specified, I'm not telling you everything I think I know.
This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 14:44:45 MST