Re: Emotions: The Easy Part

From: Darren Reynolds (extro@blue.demon.co.uk)
Date: Sat Aug 16 1997 - 12:37:41 MDT


At 21:52 13/08/97 -0500, Eliezer S. Yudkowsky wrote:
>Finally, although this is getting into the hard problem of conscious
>experience, I don't think the physiological stuff can explain the subjective
>aspect. Is adrenaline conscious?

Yeah, but you can say that about "consciousness" itself. The topic is
wandering off into the mist a bit here, but I do believe that the "I" in me
is an illusion; I'm not really there. It's the only way I seem to be able
to explain away reality. Not very satisfactory, I know, but until someone
offers me something better ...

>Emotions aren't mysterious forces. They can't hide from us. Turing machines
>are deterministic and wholly causal; any force operating on Deep Thought
>would be explicable as the sum of its parts.

Obvious question: why do you think we are any different?

>> When Deep Blue won the series recently, I wondered whether it felt a most
>> primitive sense of achievement, in the same way that our own primitive
>> ancestors might have felt a sense of achievement if they had lifted their
>> first stone, and successfully hurled it at a passing antelope. Yes, the
>> thing is just a mathematical algorithm. But so, apparently, are we.
>
>It didn't. Deep Blue may have emergent chess stratagems from the complex
>physical-process-level chess exploration algorithms. I don't think there was
>any form of feedback for successful and unsuccessful moves. In training
>mode ...

Yeah, but this misses the point. There IS feedback in the form of
selection. If Deep Blue makes bad moves, IBM will trash it. There doesn't
have to be anything in the code. It's the same feedback (and the only
feedback) that our own evolutionary path had until very recently.

Forget the physical substrate, and think only of the whole logical system:
inputs and outputs. If one system produces all the same outputs for a given
series of inputs as another, then the systems are identical. If they don't
consist of the same stuff, that doesn't stop the SYSTEMs being identical.
One could be plastic, silicon and Cobol whilst the other is water, fat and
DNA.
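
Here's a toy sketch of what I mean (the names and the arithmetic are
invented, of course; it's just an illustration). Two "systems" built out of
different stuff, giving identical outputs for every input. A caller that
only ever sees inputs and outputs has no way to tell them apart:

    #include <stdio.h>

    /* System one: computes the square arithmetically. */
    int system_one(int x) { return x * x; }

    /* System two: same outputs for the same inputs, but via a table. */
    int system_two(int x)
    {
        static const int table[] = {0, 1, 4, 9, 16, 25, 36, 49, 64, 81};
        return table[x];
    }

    int main(void)
    {
        int x;
        /* identical behaviour on every input in range */
        for (x = 0; x < 10; x++)
            printf("%d %d %d\n", x, system_one(x), system_two(x));
        return 0;
    }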

A fun discussion this. :o)

>Except from a functionalist perspective, there wouldn't be much
>internal difference between "pleasure" and "pain" - just negative and
>positive numbers.

Right. Whereas in humans, the internal difference between pleasure and pain
is ... er, what is the internal difference in humans exactly?

>If Deep Blue were reversed to make the worst possible move, it might
>not require much reprogramming... i.e., the distinction between pleasure and
>pain would be surface, rather than deep causal.

Hmm, that's a good point. But this may have more to do with our level of
understanding of the system than with the depth of the distinction.
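
To make your point concrete (with invented code, nothing like Deep Blue's
actual evaluator): reversing "pleasure" and "pain" in a move chooser can be
as small as flipping one comparison, which is what makes the distinction
look surface-level rather than deep causal.

    #include <stdio.h>

    #define N_MOVES 5

    /* Pick the highest-scoring move if maximize is non-zero, otherwise
     * the lowest. One flipped comparison does all the work. */
    int choose_move(const double score[], int n, int maximize)
    {
        int i, best = 0;
        for (i = 1; i < n; i++) {
            if (maximize ? score[i] > score[best] : score[i] < score[best])
                best = i;
        }
        return best;
    }

    int main(void)
    {
        double score[N_MOVES] = {0.3, -1.2, 2.5, 0.0, -0.7};
        printf("best move:  %d\n", choose_move(score, N_MOVES, 1));
        printf("worst move: %d\n", choose_move(score, N_MOVES, 0));
        return 0;
    }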

>Emotions did indeed evolve because they are evolutionary advantages. Although
>Deep Blue's weightings for piece value may be "evolutionary" in some sense, I
>don't think the term can really apply in the sense you use it. Linear numbers
>aren't complex enough to "evolve"; evolving sequences of instructions, as in
>the TIERRA environment, are another matter.

Again, I think that this misses the point. I argue that if humans alter a
system in a way which leads to greater levels of reproduction for that
system, then the system has evolved. The agent causing the evolution is
irrelevant. The code doesn't have to learn from its mistakes in chess.
There merely has to be an environment which prefers good moves, with
penalties and rewards that affect reproductive success.
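
A toy sketch of the kind of feedback I mean (all names invented; a single
piece-value weight standing in for a whole program): the code never learns
anything, but an environment that copies the better scorers into the next
generation and discards the rest still moves the population.

    #include <stdio.h>
    #include <stdlib.h>

    #define POP 8

    /* Stand-in for "how well this program plays"; in reality this would
     * be tournament results, or IBM deciding whether to keep funding it. */
    double fitness(double weight) { return -(weight - 3.0) * (weight - 3.0); }

    int main(void)
    {
        double pop[POP], next[POP];
        int i, gen;

        for (i = 0; i < POP; i++)            /* random initial "programs" */
            pop[i] = 6.0 * rand() / (double)RAND_MAX;

        for (gen = 0; gen < 20; gen++) {
            for (i = 0; i < POP; i++) {
                /* pick two at random; the better scorer reproduces,
                 * with a small mutation */
                int a = rand() % POP, b = rand() % POP;
                double parent =
                    fitness(pop[a]) > fitness(pop[b]) ? pop[a] : pop[b];
                next[i] = parent + 0.1 * (rand() / (double)RAND_MAX - 0.5);
            }
            for (i = 0; i < POP; i++)
                pop[i] = next[i];
        }

        printf("surviving weight after selection: %f\n", pop[0]);
        return 0;
    }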

>Deep Blue, again, is not that level of AI.

It doesn't have to be.

>So if I have a program,
>
>struct Wasp {
> float wing_tone;
> float air_speed;
> float sting_angle;
>};
>
>and I adjust the three variables you mentioned, the program has emotions?

I guess we'd have to ask it. How can you tell if I am angry?

>Any emotions
>that are observer-dependent, I am not interested in.

Precisely. You are interested only in your own emotions. That's all you can
be interested in. In fact, you ought (I don't mean this nastily at all, but
scientifically) to be interested in only yourself, because you can't be
sure that anything else exists. There are two types of entity: you, and
everything else. Other humans, dogs, wasps and computer programs are all
"everything else", and you really can't say very much about them. Your own
body probably falls into the category of "everything else", too, which is a
bit unnerving.

What do you think?

Regards,
Darren


