From: Zeb Haradon (zharadon@inconnect.com)
Date: Sun Dec 12 1999 - 00:24:21 MST
-----Original Message-----
From: Dan Fabulich <daniel.fabulich@yale.edu>
To: extropians@extropy.com <extropians@extropy.com>
Date: Saturday, December 11, 1999 9:10 PM
Subject: Re: q***** (and incorrigibility)
>
>Actually, I don't agree with the way you're using the word "thinks." I
>use the word in the functionalist sense: you're "thinking" if you're
>exhibiting the right sort of functional properties. You seem to be using
>the word in some "deeper sense."
>
>But you claim that the fact that you're Thinking, in whatever sense of the
>word you're using, shows that you must not be a zombie, and that this
>disproves my argument. But you DON'T have absolute knowledge that you are
>Thinking. You could be "thinking" (in the functional sense) but not
>Thinking in the spooky sense. How do you Know that you're Thinking when
>you could just "think" that you're Thinking?
>
If I were just thinking that I was Thinking, I wouldn't have the qualia
associated with it, which I do.
We seem to be arguing in circles. My position is that psychology and
philosophy have not yet developed a definition of cognition sufficient to
encompass consciousness and qualia. The sights, sounds, smells, and
thoughts that I experience are something more than the functional account of
the physical reactions which constitute my perception of them, as those
physical reactions are currently explained by science. There is an
experience of seeing blue, which is more than what we know of the physical
process of a path of neurons firing from the eye to the visual center. Do
you deny this?
Presumably thinking causes Thinking, or they are the same phenomenon
experienced from different angles. It would really surprise me to find out
that the consistent co-occurrence of certain brain states with certain
mental states is a coincidence.
>This is not an idle skeptical concern. I really DON'T think that you're
>Thinking. I think that you're "thinking."
But do you think that you're Thinking? Do you have experiences?
> If you're Thinking, then we as
>scientists and philosophers have a difficult problem on our hands,
no shit :)
It's a very difficult problem.
> and it
>casts some concern on whether uploading, AI, and a number of other handy
>technologies are really possible. If you're "thinking," then we've got no
>problems at all.
>
It doesn't cast much doubt on them - you don't need to know HOW something
works to know THAT it works, and I have a good argument to show THAT an
agent functionally equivalent to me should have the same conscious
experiences.
I am pretty sure that anything functionally equivalent to me would be
conscious, as I am. My argument is this: given that being conscious, by
itself (as distinguished from the hypothetical unconscious 'zombie' which is
100% equivalent in behavior), serves no evolutionary purpose (how could it,
as a zombie with the exact same behavior would have the exact same rate of
survival), we have to conclude that consciousness was not "engineered"
into us by evolution. That is - there was never a point where there was a
mutation among the zombie population which led to a conscious individual,
whose progeny have since reigned over the gene pool, except if by
astronomical coincidence. If consciousness offers no survival benefit, then
why do we have it? It must be a by-product of something. There are two
possibilities: 1) consciousness is a by-product of the physical stuff which
makes us up, or 2) it is a by-product of the processes which that stuff
goes through. I think we can discount #1 in favor of #2, by the fact that
changes in neuronal processes result in changes in consciousness. Further,
we have a lot of evidence that a physically different chemical which performs
the same function as a brain chemical will result in similar brain states. For
example, the morphine molecule is physically different from the endorphin
molecule, but it binds to the same receptor site (i.e., they are functionally
equivalent) and produces effects similar to those of the endorphin
molecule. This is not a perfect example, since morphine binds longer, but I'll
bet that a molecule which perfectly mimicked endorphins would result in
perfectly indistinguishable effects. The point: consciousness must be a
by-product of cognitive function. Implants into my brain which fulfilled
functions equivalent to the parts they replace should not result in any
difference in conscious experience.
As for AI - who cares whether an AI is conscious or not? I wish it were
possible to make unconscious AIs; they could be used as slaves with no moral
problem.
--------------------------------------------------------------------------
Zeb Haradon
My personal website:
http://www.acsu.buffalo.edu/~haradon
A movie I'm directing:
http://www.elevatormovie.com