Re: qualia

From: jeff nordahl (jnordahl@hotmail.com)
Date: Thu Dec 09 1999 - 23:30:45 MST


AIs without qualia will be strictly driven to complete tasks successfully
and efficiently: get from point A to point B in the quickest time. They
will be our slaves and will have no objection to it, because without qualia
they cannot be rewarded or punished for their activities and will therefore
be indifferent.

Qualia, to me, seem to be the experiencing of our own neuro-chemical
reactions (programmed and learned) that are launched by various perceptions.
For example: when I see bright orange, I feel excitement, which can be
associated with adrenaline. In nature, instinct has pre-programmed that
orange things are often dangerous: poisonous fish, fire, etc. So my initial
quale for orange is a rush or reflex of momentary danger (an adrenaline
release). Upon later inspection I may see that the orange object is a
pumpkin. The bright orange got my attention through the danger association,
but now the orange pumpkin launches a warm, festive memory of Thanksgiving
with my family. The danger quale has now subsided and transformed into a
dopamine release of pleasure while I reminisce about my safe, loving
family, and so on. The same qualia can be tied to sounds and their
associated instincts and memories (rattlesnake rattles and pounding drums).
It seems to me that qualia are the rebounding reactions of instinct and
memory with their associated chemical reward and punishment (safety and
danger) sensations.

Thus, AIs will only achieve qualia if we can devise a means to reward and
punish them, to scare them or make them feel secure. How to do this is a
mystery to me. Would rationing out electricity as treats, the way we ration
out fish when training dolphins, be a motivator?
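(A toy sketch of what I mean, with everything invented purely for
illustration -- the two actions, the reward numbers, the learning rate.
The "electricity treat" is just a scalar reward signal, and the agent ends
up preferring whatever earns it, roughly the way a dolphin ends up
preferring whatever earns fish.)

import random

# value estimates the agent learns for each action, from reward alone
values = {"work": 0.0, "idle": 0.0}
learning_rate = 0.1

def reward(action):
    # the "treat": a unit of energy budget only when the task gets done
    return 1.0 if action == "work" else 0.0

for step in range(100):
    # explore occasionally, otherwise pick the currently preferred action
    if random.random() < 0.1:
        action = random.choice(list(values))
    else:
        action = max(values, key=values.get)
    # nudge the value estimate toward the observed reward
    values[action] += learning_rate * (reward(action) - values[action])

print(values)  # "work" ends up valued near 1.0, "idle" stays near 0.0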

Also, what would we use AIs for anyway? For camaraderie or as slaves
(labor)? If you want an AI buddy to chat with, give it qualia. If you want
an AI slave, qualia are the last thing you'd want included in the
programming. Qualia are a distraction from getting the job done!!!

Jeff Nordahl

>From: Brent Allsop <allsop@fc.hp.com>
>Reply-To: extropians@extropy.com
>To: extropians@extropy.com
>Subject: Re: qualia
>Date: Tue, 30 Nov 1999 17:17:21 -0700 (MST)
>
>John K Clark <jonkc@att.net> asked:
>
> > Then will somebody please explain to me why evolution ever came up
> > with it!! Even a hint would be a wonderful advance.
>
> Right off the top there are two real obvious reasons. First,
>phenomenal qualia are fantastic for representing information. An
>abstract "1" or "0" has minimal diversity, while the various different
>qualia are far more diverse. Can you imagine how confused our conscious
>mind would be if the smell of a flower was anything like warm and
>red...? We'd become very confused very fast, just as mathematicians and
>computer programmers are very limited in the number of 1s and 0s they
>can keep in their heads at once.
>
> I bet when we discover what qualia are and how the brain uses
>them to consciously represent different kinds of information,
>artificial intelligence will make huge leaps and bounds using such
>phenomenal and robust information representation technologies. You've
>got to have lots of inefficient abstract ones and zeros to get
>anything close to what qualia can model.
>
> But an even more compelling reason might be motivation. How
>motivated are current abstract AI programs? It takes a lot of ones
>and zeros carefully crafted into complex and usually brittle control
>logic to come up with any kind of artificially motivated behavior,
>doesn't it? But the phenomenal qualities of qualia are fundamentally
>mostly motivational. Joys are constructed of qualia, and these are
>mostly what makes us want what we want with so much passion. Sure,
>you can simulate such things abstractly, but I'm betting that evolution
>uses qualia because they are fundamentally and robustly motivational.
>Take a "tired" sensation, for example. I think the fact that things get
>harder for us to do as we get tired is a very natural thing for
>qualia. Just try to model the same kind of "you should slow down"
>behavior with abstract logic. It gets complicated very fast. Sure,
>it's possible, but is it as easy, especially for nature or evolution?
>Commander Data on Star Trek wants to experience pleasure because
>without pleasure there is no purpose or reason to life, is there?
>
> How do these arguments sound? Now that I've expressed them,
>they don't sound all that bulletproof. What do you all think?
>
> Brent Allsop
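
(For what it's worth, here is a crude, purely illustrative sketch of
Brent's "tired" example above, done with explicit control logic instead of
a felt sensation: a fatigue number that damps each action and has to be
hand-tuned. The function, constants, and update rule are all invented for
illustration, and the hand-tuning is roughly the brittleness he points at.)

def act(task_effort, fatigue):
    # the higher the fatigue, the less of the intended effort gets through
    output = task_effort / (1.0 + fatigue)
    # working adds to fatigue; this update rule is one more thing to hand-tune
    fatigue += 0.2 * task_effort
    return output, fatigue

fatigue = 0.0
for step in range(5):
    done, fatigue = act(1.0, fatigue)
    print(f"step {step}: work done {done:.2f}, fatigue {fatigue:.2f}")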



