From: Emlyn (emlyn@one.net.au)
Date: Tue Mar 27 2001 - 15:14:06 MST
> Robert J. Bradbury wrote:
>
> > The end point of this would seem to be the realization that
> > an inanimate collection of atoms *may* possess 'consciousness'
> > (*if* they are properly organized).
>
> Why do they have to be properly organised?
>
> Why require the atoms at all?
>
> BM
>
>
Absolutely. You are just requiring an isomorphism between some given
conscious brain and something else; a sufficiently complex mapping should
be derivable that maps any state of anything onto any state of anything
else.
Interestingly though, in such a truly unlikely mapping it is the mapping
itself that will contain all the necessary detail of consciousness; it'll be
hard-coded in.
At this point, we must question the usefulness of this idea; it degrades to
"we are everywhere and in everything", because anything can be isomorphic to
anything else. This is pointless; in practice we do seem to have some
kinds of limitations on what is what and who is who.
So consciousness, if it is anything, wouldn't seem to arise from mere
isomorphism to a "conscious" system, given that this allows every
conceivable state of any combination of matter (or anything else) to encode
that consciousness (you just need to know the right mapping).
Hence we need to restrict the level of isomorphism. What kinds of mappings
maintain consciousness, and what types lose it?
(I must quickly interject here and say that the mappings aren't between two
static snapshots of systems; they are between two systems changing over
time; a brain on the one hand, and any other arbitrary collection of stuff
on the other).
Does this line of reasoning strike anyone else as problematic?
Emlyn James O'Regan - Managing Director
Wizards of AU
http://www.WizardsOfAU.com
emlyn@WizardsOfAU.com
"Australian IT Wizards - US Technology Leaders
Pure International Teleworking in the Global Economy"
This archive was generated by hypermail 2.1.5 : Sat Nov 02 2002 - 08:06:43 MST