From: John Stick (johnstick@worldnet.att.net)
Date: Tue Jul 24 2001 - 13:57:30 MDT
Emil Gilliam wrote:
What if:
(1) The mind does something noncomputable (in the Turing sense),
thanks to as-yet unknown physics;
(2) This noncomputability lies at the heart of qualia; and
(3) We must explicitly take advantage of this feat of physics in
order to produce an artifact capable of running "real AI"?
My first problem with this argument is that (2) is unnecessary and a red
herring. Qualia are sensory data as presented to consciousness. I assume
the noncomputability arises in the consciousness aspect and not in the sense
data per se -- we already know a lot about how the brain processes visual
images, and noncomputability has yet to rear its head. But consciousness of
sense data is not necessary to real AI, if by real AI we mean the ability to
write and improve programs, do math, or translate languages.
More abstractly but more realistically, one might ask whether
consciousness in general, or even self-consciousness, is necessary to real
AI. The short answer is: no one has a clue, because no one has the faintest
idea how consciousness arises from our brain activity. Philosophers try to
generate arguments that consciousness is essential or not, or that it
naturally arises from any cognitive activity or not, but the arguments on
either side that I've seen, such as Searle on the one hand and Dennett on
the other, beg all the important questions almost before they have started.
But assume that Eli's creature, or Ben's webmind, never develops
consciousness, as far as we can tell; it just does all the intellectual
tasks we are dreaming of. Will that slow the arrival of the singularity?
Not so far as I can see (though it would make uploading a less enticing
prospect).
The general question Emil is asking, it seems to me, is the same old big
question: is general intelligence computable? Bringing qualia or even
consciousness into it doesn't seem to sharpen the question at all (in the
way that Jerry Fodor's concern, that abductive reasoning, being holistic,
will be so massively difficult as to be effectively uncomputable for a long
time, at least points to a more focused problem). But the only real answer
to the general concern is to say we won't know until we try.
More tentatively, however, I must say that the idea that the problem
will be hardware (unknown physics in the brain that is in some mysterious
way uncomputable) rather than software (Fodor's worry) seems to me most
unlikely. It seems to me a fundamental confusion of categories. Once you
find a "new physics" phenomenon that is uncomputable, if you ever do, I will
bet the uncomputability can be manifested in a substrate employing only the
old physics as well. Uncomputability is ultimately a mathematical
phenomenon, not a physical one, and it will be independent of the details of
brain chemistry or physics.
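For a concrete picture of what "uncomputability is a mathematical
phenomenon" means, the standard illustration is the halting problem. The
minimal sketch below (in Python, with names of my own choosing; the
"oracle" is purely hypothetical, not any real library) runs through the
textbook diagonalization: if a halting predicate could be computed on any
substrate whatsoever, old physics or new, it would contradict itself, so
the impossibility is a fact of logic rather than of hardware.

    # Sketch of the classic halting-problem diagonalization.
    # 'halts' is a hypothetical oracle, not a real function; the point is
    # that no implementation of it can exist on any substrate, because the
    # contradiction below is purely logical.

    def halts(program_source: str, program_input: str) -> bool:
        """Pretend oracle: True iff the program halts on the given input."""
        raise NotImplementedError("no such oracle can exist")

    def diagonal(program_source: str) -> None:
        # Feed the program its own source code and do the opposite of
        # whatever the oracle predicts.
        if halts(program_source, program_source):
            while True:   # oracle says "halts" -> loop forever
                pass
        # oracle says "loops forever" -> halt immediately

    # Ask: does diagonal halt when given its own source? If halts says yes,
    # diagonal loops; if it says no, diagonal halts. Either answer is wrong,
    # so halts cannot be computed -- regardless of the physics underneath.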
John Stick