I had written:
>>your last 5 minutes of conscious experience is equivalent
>>to the right program going through a few billion states
>>on the cpu or biological organism of your choice.
>>Next, that is equivalent [it is alleged], in turn, to a
>>big lookup table [where] no information flows between
>>states! They are not linked by cause and effect!
On 28 Mar 2001 12:18:00, James Rogers wrote:
> This is incorrect, as all you've done is describe a finite state
> machine. You are only considering the lookup table as being a giant
> indexed data repository in the way a dictionary is, without any concept of
> context in the way a finite state machine has context. A lookup table that
> is an equivalent representation of a program contains entries that are data
> and/or code (the distinction being fuzzy) and all finite state machine
> programs are reducible to a lookup table. In the pathological "pure
> lookup" case, each lookup changes the state of the machine which causes it
> to do yet another lookup; in practice, most lookup tables are logically
> reducible to much more compact algorithms, but the concept is still
> valid. To reject the validity of a lookup model is to reject the
> possibility that the brain is a finite state machine.
I do not understand how you are disagreeing. To anticipate:
yes, some lookup table is _functionally_ equivalent to any
program. That is my point. Although functionally equivalent,
or "logically reducible" as you say, it turns out that in this
bizarre case, we do not have moral equivalence. I don't think
that I need another lecture on computation theory, so let me
try to get to the core of our disagreement, if there is one,
this way:
Suppose that God has made an incredible, enormous lookup
table that returns only what James Rogers does in each
millisecond between 2001/03/31/17:50 and 2001/03/31/17:55,
given one particular Earth exceedingly similar to ours.
Because the details may be important, I'll provide a few,
but it's very similar to the Eugene Leitl lookup table.
This is a "pure lookup" table, which God can make using
His infinite wisdom.
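In code, one step of such a pure lookup machine might look
like this (a toy sketch of my own, in Python, with made-up
states and inputs; God's table has the same shape, only
unimaginably larger):

    # Pure lookup: (current state, input) -> (next state, output).
    # Nothing is computed; every transition is a bare table entry.
    TABLE = {
        ("reading", "page turn"):   ("reading",    "eyes move down"),
        ("reading", "phone rings"): ("distracted", "looks up"),
        ("distracted", "silence"):  ("reading",    "returns to novel"),
    }

    def step(state, sensory_input):
        return TABLE[(state, sensory_input)]

    state = "reading"
    for sensory_input in ["page turn", "phone rings", "silence"]:
        state, action = step(state, sensory_input)
        print(state, action)

Each lookup changes the state, which causes yet another
lookup, exactly as you put it.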
Let's say that you expect to enjoy that interval of time:
during it, for example, you get to read the fascinating
conclusion to a murder mystery that has been absorbing you
for weeks. Now suppose it is known that you will be replaced
by the lookup table at that time: the mechanical causation
your brain now enjoys, each state being calculated from a
prior state by the laws of physics, gets replaced by a
system in which each state (and a tiny output action) is
rather arbitrarily linked to the next (plus input). Do you
still anticipate the enjoyment of that novel? Do you still
expect to be alive or aware during that interval of time?
Yes or no?
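To be concrete about the contrast (again a toy of my own,
with a 3-bit counter standing in for the laws of physics,
not a model of the brain): the two procedures below produce
identical histories, yet in one each state is calculated
from its predecessor, while in the other each state is
merely fetched.

    # Causal version: the next state is computed from the prior state.
    def computed_step(state):
        return (state + 1) % 8

    # Lookup version: the same transitions, frozen into a table
    # beforehand. As far as the running system is concerned, each
    # entry is an arbitrary fact, not a calculation.
    COUNTER_TABLE = {s: (s + 1) % 8 for s in range(8)}

    def lookup_step(state):
        return COUNTER_TABLE[state]

    a = b = 0
    for _ in range(5):
        a = computed_step(a)
        b = lookup_step(b)
    assert a == b   # functionally equivalent, step for step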
>> At each second of the hike, or of your own last 5 minutes
>> conscious experience, we merely form a 30 bit address made
>> from (i) 10 bits of input that you are getting at the
>> current millisecond (ii) 20 bits specifying the state
>> that you are in.
> The addressing is incorrect. You've confused the number
> of clock cycles executed during the last five minutes
> (which is small) with the astronomical number of possible
> states. You could create a hash table of all the states
> used in the last five minutes and discard the rest (so that
> you can meet your bit depth requirement), but then you
> would have a model that was perfectly useless.
What? Perhaps you misread me. Now I do need to apologize
to anyone reading this exchange: I was confusing powers of
ten with powers of two (not a mistake of principle). I will
reword, and try to supply reasonable numbers. I should've said:
At each millisecond of your last 5 minutes' conscious experience,
we merely form a 90-bit address made from (i) 30 bits of input
that you are getting from your physical surroundings and (ii)
60 bits specifying your current state. [I forgot that a human
can be in but about 10^20 different states, which corresponds
roughly to 60 bits, not 20.]
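Spelling out the arithmetic with those round figures (a
sketch only, in Python): 60 bits of state and 30 bits of
input concatenate into one 90-bit address; the table has
2^90 entries, of which five minutes at one lookup per
millisecond consults just 300,000. That, I take it, is the
gap between lookups performed and possible states that you
were pointing at.

    STATE_BITS = 60   # my round figure for the ~10^20 possible states
    INPUT_BITS = 30   # sensory input arriving in one millisecond

    def address(state, sensory_input):
        # High 60 bits hold the state, low 30 bits the input.
        return (state << INPUT_BITS) | sensory_input

    lookups = 5 * 60 * 1000                  # 300,000 actual lookups
    entries = 2 ** (STATE_BITS + INPUT_BITS) # about 1.2 * 10^27 entries
    print(lookups, entries)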
> You can't logically say that consciousness must run on a
> finite state machine AND have lookup tables not be able to
> produce consciousness. This is the same as saying that
> [functionally] equivalent programs are not equivalent where
> consciousness is concerned, which I have a very hard time
> swallowing (a=b and a=c, but b!=c?).
Here I think that you understand just how apparently outrageous
what I'm saying is. (Except the algebraic rendition, which is
silly of course.) Yes: two functionally equivalent programs
are not entirely the same; it means something to be one of
them, but not the other. That's why (because of this peculiar
detail concerning lookup tables) I'm not a functionalist anymore.
Lee Corbin