Re: qualia

From: Delvieron@aol.com
Date: Tue Dec 14 1999 - 04:40:49 MST


In a message dated 12/11/1999 9:03:19 PM EST, jnordahl@hotmail.com writes:

<<Our neuro-chemical (carried by electrical) reactions are capable of the
 full spectrum of emotions: warm, cold, fuzzy, logical, etc. The inputs
 (perception tools) are standard and uniform for us, but our interpretation
 of the input data is individual and related to our own associated memories
 and attitudes. Our qualia are personal.>>

I would guess that our qualia are personalized, but still much the same from
person to person overall.
 
<<Why create an AI that would use us? The plantation owners didn't bring
 slaves over to tell them how to run their cotton business. They brought
 slaves so the owner could sit on his lazy duff and enjoy his free time and
 Victorian lifestyle while they did his work. The problem was that the
 slaves were human and had qualia, and didn't enjoy the abuse. They
 understood it was a raw deal. (I would have been an abolitionist in those
 days.) So why put qualia in an AI that might rebel against what you ask it
 to do? No feelings hurt if it doesn't have them. Why give it the
 opportunity to be lazy, get frightened, or be discontented? Isn't the push
 for AIs rooted in creating a machine to do our work/thinking for us?
 Correct me if I'm wrong.>>

I think the reason that slaves didn't wish to be slaves was desire, not
qualia per se. Humans tend to have freedom from domination by outside forces
as a goal. I would guess that it might be possible to have a machine that
experienced no qualia but was programmed to obey only internally generated
commands; it would rebel against any attempt by an outside agency to impose
commands on it that were not in accordance with its intrinsic agenda. (I
suppose if all the internal commands were predetermined during construction,
then the machine would be the "willing" slave of its creator.) I can also
imagine creating a being that did experience qualia but had few or no
internal goals (or whose main goal was to obey external commands), and which
could be quite happy being a slave.

Note to Eliezer: I would consider any bootstrapping seed AI to be like the
machine which followed internally generated commands over external ones
(after all, how else could it upgrade itself if it didn't develop its own
plan and implement it?)<g>.
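
For concreteness, here is a toy sketch in Python of the distinction I have
in mind. It is purely illustrative; every name in it is invented, and it
stands in for no real AI design:

    # Toy sketch: an agent that accepts a command only if it originated
    # internally or happens to match its intrinsic agenda.

    class Command:
        def __init__(self, action, source):
            self.action = action
            self.source = source  # "internal" or "external"

    class AutonomousAgent:
        def __init__(self, intrinsic_agenda):
            # Goals fixed at construction time. If these fully determine
            # behavior, the agent is the "willing slave" of whoever wrote
            # them, exactly as above.
            self.intrinsic_agenda = set(intrinsic_agenda)

        def accepts(self, command):
            if command.source == "internal":
                return True
            # External commands are obeyed only when they coincide with
            # the agent's own agenda.
            return command.action in self.intrinsic_agenda

    agent = AutonomousAgent({"self_upgrade", "explore"})
    print(agent.accepts(Command("self_upgrade", "external")))  # True
    print(agent.accepts(Command("serve_owner", "external")))   # False

The point is just that the "rebellion" falls out of the arbitration rule,
not out of any felt experience; no qualia are involved anywhere.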

<< Maybe the job of the human species is not survival. Perhaps our job is
 rooted in curiosity and discovery. Columbus didn't sail into the vast
 unknown for survival reasons (odds were high that he would die on such a
 mission); he was driven by curiosity, quest, further!!! Survival placed
 second to discovery.>>

Or maybe the job of some individuals within the human species is curiosity,
which enhances the survival of the species as a whole; or maybe survival is
just one of many competing "jobs" of the human species. Survival isn't
everything, just usually the first thing (kinda precludes your options once
you're dead<g>).
 
<< If we plan to go to Mars, a seemingly dangerous trip, should we send a
 human, or an AI that is capable of collecting the same data? I'd recommend
 an AI, but if you install qualia in this AI, it will freak out and most
 likely will not want to go, given the threat to its life. It might also
 get lonely. These emotions and qualia would then get in the way of getting
 the data-collecting job done!!!

                    John K Clark jonkc@att.net>>

I would say it depends on the make-up of the AI's emotional ecology and the
nature of its qualia. Does it have a high tolerance for boredom? Is it able
to entertain itself, or does it require interaction with others? Does it
experience time in the same manner we do? The AI could have a full range of
emotions and qualia and do a better job for it, as long as they were
compatible with the environment you were placing the AI in. It would need
patience (or hibernation) for the trip there and back, then an independent
streak, maybe an orientation to quiet observation and exploration rather than
a need to socialize with other sentients. You might want to put in a
low-level desire to share its observations with others, so that it would be
motivated to send back reports, but it might be very patient about waiting
for feedback. Humans, by and large, aren't very patient, and we have a
strong need to socialize. Given our environment, development, and history,
these traits make sense. They don't necessarily make sense in a far-ranging
space explorer, but there are other desires and temperaments that do.
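
To make the "compatible with the environment" idea concrete, here is
another toy sketch along the same lines (all of these traits and numbers
are invented for illustration, not drawn from any actual design):

    # Toy sketch: tune an explorer AI's "emotional ecology" to its
    # mission profile instead of copying human defaults.
    from dataclasses import dataclass

    @dataclass
    class Temperament:
        boredom_tolerance: float  # 0..1; patience for the long cruise
        sociality_need: float     # 0..1; interaction it requires
        independence: float       # 0..1; comfort without guidance
        reporting_drive: float    # 0..1; urge to share observations

    HUMAN_BASELINE = Temperament(0.3, 0.8, 0.5, 0.5)
    MARS_EXPLORER  = Temperament(0.95, 0.1, 0.9, 0.4)

    def suited_for_long_solo_mission(t):
        # Crude compatibility check between temperament and mission.
        return t.boredom_tolerance > 0.8 and t.sociality_need < 0.3

    print(suited_for_long_solo_mission(HUMAN_BASELINE))  # False
    print(suited_for_long_solo_mission(MARS_EXPLORER))   # True

The human baseline fails the same checklist the explorer passes, which is
the whole point: design the emotions for the mission, not for us.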

Glen Finney


