From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Tue Sep 21 1999 - 08:03:35 MDT
John Clark wrote:
>
> Eliezer S. Yudkowsky <sentience@pobox.com> Wrote:
>
> >I tend to assume that qualia started out as a spandrel (like
> >bug-catchers becoming wings), then got tied in to reflectivity or the
> >"system bus" that ties the senses together.
>
> That must be true or we wouldn't have qualia; that's also why I think the
> Turing Test works for consciousness as well as intelligence.
I accept a "Chalmers Test" for consciousness. If a computer starts
talking about qualia without being primed - that is, without being told
by the programmer to lie about it and fed the back volumes of _Journal
of Consciousness Studies_ as raw material - then it's conscious. You're
not allowed to fake it, the way you are with the Turing Test, because
the volume of coherent material produced on consciousness is so low that
just spouting it back, ELIZA-like, would be as good as the average human.
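(For concreteness, a toy sketch of that gate in Python; the function
name and the keyword list are my own illustration, nothing Chalmers
ever specified:)

    def chalmers_test(transcript: str, was_primed_on_qualia: bool) -> bool:
        """Credit consciousness only if qualia-talk arises unprompted.

        Toy criterion: what actually matters is whether the *content*
        is original, not whether these particular keywords appear.
        """
        mentions_qualia = any(
            phrase in transcript.lower()
            for phrase in ("qualia", "what it is like", "ineffable"))
        return mentions_qualia and not was_primed_on_qualia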
> >The mysterious ineffable stuff was probably just a computational
> >speedup - like Penrose's quantum computing, for example.
>
> I have three problems with Penrose:
>
> 1) There is not one scrap of experimental evidence that it's happening
> and there should be if it were true.
Mm, you can argue both sides of that. Penrose & Hameroff claim that
microtubules have several precisely tuned characteristics needed for
quantum coherence. *I* don't think anything ineffable will show up
until we start recording individual neurons *and* we know what to look for.
> 2) The inside of a neuron seems to be far too hot and noisy for
> quantum coherence to be possible, much less quantum computation.
> You'd need new fundamental laws of physics for this to work;
> that's another way of saying you'd need magic, and I don't like
> to invoke magic if I don't need to. I don't need to.
Well, we're getting a bit outside my field, but I don't think that's
true. Heat and noise are only a problem for macroscopic, crystalline
(non-stochastic) quantum coherence. Remember the NMR-based QC (*not*
the one we've been discussing lately) that would operate on a cup of
coffee? And the comments about the Improbability Drive?
> 3) If it were true you'd think people would be good at solving some of the
> known non-computational problems, that is, problems that can only be
> solved in a time proportional to 2^N where N is the number of elements
> in the problem. However, human beings aren't one bit better at solving
> this sort of problem than computers are; actually, computers are
> better at it than people, but still, I admit, not very good.
Oh, nonsense. That's like saying people should be good at calculating
the output of neural nets.
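(To pin down the 2^N claim, here's a brute-force subset-sum search in
Python; the example and the numbers are mine, not John's. Pedantic
note: such problems are perfectly computable, just exponentially slow.)

    from itertools import combinations

    def subset_sum_brute_force(numbers, target):
        """Try every subset: 2^N candidates for N numbers."""
        for r in range(len(numbers) + 1):
            for subset in combinations(numbers, r):
                if sum(subset) == target:
                    return subset
        return None

    # Each extra element doubles the worst case; by N = 60 that's over
    # 10^18 subsets, no matter what hardware is doing the searching.
    print(subset_sum_brute_force([3, 34, 4, 12, 5, 2], 9))  # -> (4, 5)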
> >all else being equal, an ineffable AI is smarter or more efficient than a
> >computational one. It doesn't mean you can't get equally good or better
> >improvements with more computational power or better programming.
>
> Then if you wanted to make an AI with a certain intelligence, average
> human level for example, it would be easier to make an AI that experiences
> qualia than one that doesn't.
No. If you had to make an AI with humanoid intelligence, using the same
number of computational elements as are in the human brain, with the
same degree of programming intelligence as the neural-wiring algorithm,
it would be necessary to use ineffable computing elements. Bypass any
of those steps...
> That really shouldn't surprise you,
> considering Evolution's experience in building such things; you could make a
> much stronger case that a computer might be able to feel emotions but could
> never be intelligent.
I have. In fact, as far as I know, I was the first one to make the
evolutionary argument for emotions being easier than intelligence.
> It's our grossly enlarged neocortex that makes the human brain so unusual,
> and it's so recent: it only started to get ridiculously large about 3 million
> years ago. It deals in deliberation, spatial perception, speaking, reading,
> writing and mathematics. The one new emotion we got was worry, probably
> because the neocortex is also the place where we plan for the future.
Interesting. So my Specialist tone, which can be described as "worry"
as much as "sorrow", "frustration" or "despair", is the most
evolutionarily recent? Cool, if true.
> If nature came up with feeling first and high-level intelligence much later,
> I don't see why the opposite would be true for our computers. It must be
> one hell of a lot easier to make something that feels but doesn't think than
> something that thinks but doesn't feel.
What do emotions have to do with qualia? You're simply conflating two
uses of the term "feel".
--              sentience@pobox.com      Eliezer S. Yudkowsky
          http://pobox.com/~sentience/tmol-faq/meaningoflife.html
Running on BeOS           Typing in Dvorak          Programming with Patterns
Voting for Libertarians   Heading for Singularity   There Is A Better Way