Re: Qualia and the Galactic Loony Bin

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Wed Jun 23 1999 - 00:56:52 MDT


hal@finney.org wrote:
>
> As for the possibility that consciousness gradually fades away, this has
> problems of its own (Chalmers discusses this "fading qualia" example).
> If it is literally a matter of fading, where smells get less intense
> and reds become less red, then the person should notice this and comment
> on it. It is hard to see why he would not say something if he noticed
> his sensory impressions changing.

I don't *know* what would happen if you gradually chopped up the brain.
But, since I am of the opinion that consciousness is a weird physical
phenomenon, I can answer that you *cannot* divide a consciousness in two
while duplicating the I/O, any more than you can chop a singularity
(the black-hole kind) in half. If you gradually eliminated the
underlying mechanisms of consciousness, the person would notice.

And yes, I know what you're going to say about simulating black holes.
The point I'm trying to make is that only Turing phenomena have to be
divisible into discrete components. Once one steps sufficiently far
outside that, one can conceive of building a physical *thing*: not a
structure, not an interaction of components, but a substance as real as
a quark, or whatever the bottom level is.

Oh, hell, I don't have the words to describe it. What I'm trying to say
is that I think it's possible to construct complex, indivisible, real
things that have what we would interpret as cognitive characteristics,
including qualia and, one hopes, objective morality. Reality isn't an
is_real() function. Reality is a substance. As long as we think in
terms of is_real() and chains of cause and effect, we'll never even be
able to explain why anything exists in the first place. Sadly, that
Turing ontology is a basic part of human cognition.

Reality physics. You can keep your spacetime engineering and descriptor
theory. I know the toys the Powers play with.

-- 
           sentience@pobox.com          Eliezer S. Yudkowsky
        http://pobox.com/~sentience/tmol-faq/meaningoflife.html
Running on BeOS           Typing in Dvorak          Programming with Patterns
Voting for Libertarians   Heading for Singularity   There Is A Better Way

