From: Robert J. Bradbury (bradbury@aeiveos.com)
Date: Tue Mar 27 2001 - 05:11:29 MST
On Mon, 26 Mar 2001, James Rogers wrote:
> Any equivalent representation/re-factoring of the program will therefore
> have the same consciousness characteristics of the original (there is more
> than one way to write any program). A lookup table is a perfectly equivalent
> programmatic form, if not particularly space efficient. Therefore, encoding
> any conscious program as a lookup table will still be conscious, trading
> space for speed. How can you dispute this given your premise?
James raises some very interesting points, which must take us back
to ground zero with regard to defining our terms. The discussion
arose from the distinction between zombies and real 'conscious'
humans (if I am recalling this thread correctly; it's gotten *very*
long). Now part of the "value" we place on human
consciousness seems to be its variability, creativity,
and interactivity. A lookup-table based instantiation of
consciousness (some would assume) lacks some of those aspects.
James takes us further into the "looks, walks, talks" arena
by proposing that it isn't active "computation" (e.g. requiring
CPU cycles) that dictates 'consciousness', but any *equivalent*
'expression' of the program. (We only need to look at the many
forms in which the DVD decryption program DeCSS has been
written to be very clear that humans can become *very*
creative in this regard!)
So, from this orientation, any "expression" of the *data*
(lookup tables) is equivalent to an executing ('thinking/conscious')
program. (This may have some interesting limits if one considers
whether or not the lookup tables can generate the range of
'behaviors' the executing code can generate).
> What is a "calculation"? I find the distinction between "calculation" and
> "lookup" to be meaningless.
This is a primary point. The space-speed tradeoff in programs
means that you can produce equivalent expressions with very
different memory/cpu-cycle requirements.
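(To make the trade concrete -- a minimal sketch in Python, with an
arbitrary toy function and finite domain standing in for the
'program' -- the computed and table-driven forms are behaviorally
identical:

  def compute(x):
      return (x * x + 1) % 97              # 'live' computation: cheap in space

  DOMAIN = range(97)
  TABLE = {x: compute(x) for x in DOMAIN}  # precomputed once: cheap in time

  def lookup(x):
      return TABLE[x]                       # no arithmetic at 'run time'

  assert all(compute(x) == lookup(x) for x in DOMAIN)

Either form, handed the same inputs, produces the same outputs; only
the memory/cycle budget differs.)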
However, taken to the limit, this implies that an arbitrary
collection of atoms, organized in a sufficiently complex
way that it contains all of the lookup 'states' a 'living' being
will go through in the time they are alive, *is* conscious.
(If you want to sharply constrain this, think of a baby born
prematurely that only lives a month -- not many 'behaviors'
are required and the lookup tables would be small. [If you say
the baby doesn't pass the 'mirror' (self)-consciousness test,
simply run the scenario forward to a child old enough to
pass the mirror test who nonetheless dies prematurely.])
The end point of this would seem to be the realization that
an inanimate collection of atoms *may* possess 'consciousness'
(*if* they are properly organized).
> Setting a memory word to zero and XOR-ing the same memory word a thousand
> times doesn't produce different results, just different efficiencies.
> [snip]
Precisely -- so if this approach is valid, then the "efficiency"
of the instantiation of the 'consciousness' is irrelevant -- the
real question is the 'information content' that allows it to
potentially function as a 'human' consciousness or a 'Drosophila'
consciousness.
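(James's XOR example can be made equally concrete -- again only a
sketch, with an arbitrary starting value:

  word_a = 0                    # clear the word directly

  word_b = 0xDEADBEEF
  for _ in range(1000):
      word_b ^= word_b          # x ^ x == 0, so the first pass already clears it

  assert word_a == word_b       # same final state, very different cycle count

The information content of the final state is identical; only the
work spent reaching it differs.)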
The consequences of this are interesting -- it means you can never
*assemble* a collection of atoms in a form that constitutes a
lookup-table for a consciousness and then disassemble them
without having committed 'murder'.
Are your atoms, morally, not even yours?
Robert