O'Regan, Emlyn wrote:
> I'm sure that consciousness is entirely bound up in the way we operate, but
> why is it necessary? I read Chalmers ("Facing Up to the Problem of
> Consciousness" - http://ling.ucsc.edu/~chalmers/consc-papers.html) on this,
> with his theory of information having a functional and a phenomenal aspect.
> I like it, but I still can't see what the point is of the phenomenal aspect,
> and I think that there is a point to consciousness. I also agree with the
> zombie objection to Chalmers' reasoning.
I highly recommend Douglas Hofstadter's "Gödel, Escher, Bach: An Eternal Golden Braid" for its treatment of this topic. His theory (in very simplified form) is that consciousness is a natural result of the inherent complexity of a sentient mind. To make a system that can act in an intelligent fashion, you have to build in all sorts of low-level self-reference, in the form of programs that modify other programs within the mind. Once the whole system reaches a certain level of complexity (a "critical mass" for sentience), these self-referential systems become sufficiently tangled (and therefore sufficiently complete) to support consciousness. Thus, making a human-level intelligence that is not conscious would be an exceedingly difficult project, and it may turn out to be impossible in principle.
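For what it's worth, the "programs that modify other programs" idea can be shown in miniature. This is just a toy sketch of self-modification, not anything from the book; all the names here are invented for illustration:

```python
# A toy illustration of self-reference: a rule table in which one
# rule can rewrite another. The names ("double", "meta_rule") are
# invented for this example.

rules = {
    "double": lambda x: x * 2,
    "increment": lambda x: x + 1,
}

def meta_rule(rule_table):
    """A rule that modifies another rule: after this runs,
    'double' actually quadruples its input."""
    old = rule_table["double"]
    rule_table["double"] = lambda x: old(old(x))

result_before = rules["double"](3)   # 6
meta_rule(rules)                     # the system rewrites one of its own rules
result_after = rules["double"](3)    # 12
```

A real mind would presumably involve vast tangles of such rewriting loops, including rules that rewrite the rewriters, but even this tiny case shows how behavior can change from the inside.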
Of course, all of this is still untestable, but he makes a much better argument than most people in this field ever manage.
Billy Brown, MCSE+I
ewbrownv@mindspring.com