Nick Bostrom wrote:
>
> We presumably want to say
> that only the former gives rise to a consciousness. And the relevant
> difference seems to be that in the case of the process, the various
> states are causally connected, whereas with the spatial pattern that
> is not so.
This is almost exactly the point where I threw up my hands and declared for Penrose. The problem is that "causal connections", insofar as I can understand them, are intrinsically counterfactual; the problem with counterfactuals is that they are entirely subjective. Although I continue to try, I have not been able to devise a theory of instantiation that eliminates either of these problems.
Now let's say we run the playback, and, at each step, consult the original state transition diagram to find out what the result *would* have been, but then discard that result and load the recorded step from the tape. In other words, we compute each step, at each point along the recording, but we don't connect the steps causally to each other; we discard the computed result and load the recorded one, even though the two happen to be identical. Is that Turing machine conscious? Does it make a difference whether the record was generated by quantum randomness, by a methodical generation of all possible records, or by actually recording an identical Turing machine?
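To make the setup concrete, here is a minimal Python sketch of that playback machine. Everything in it is my own illustrative framing: "transition" stands in for the original state transition diagram as a pure function on machine configurations, and "recording" is the pre-made list of configurations, one per step; neither name comes from the scenario itself.

    def replay_machine(recording, transition):
        # Step through the recording; compute each successor but never use it.
        config = recording[0]
        for recorded_next in recording[1:]:
            computed_next = transition(config)  # what the machine *would* have done next
            del computed_next                   # ...computed, then discarded: causally inert
            config = recorded_next              # the recorded step is loaded instead
        return config

Every call to transition actually happens; its output simply never feeds the next call.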
Let's then say that we generate each step, compare it to the record, and, if they differ, replace the step with the data from the record. If the record is different in the slightest, it will prevail over the Turing machine. But, since data and process are identical, the Turing process continues untouched. Is that Turing machine conscious? What is the precise difference between that and the previous Turing machine?
How about if the data is loaded from the record, compared to the intermediate step, and the intermediate step used if it's different? (Call this "machine Q".) That's definitely conscious, right? But that would make use-the-record-if-different unconscious, right?
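In the same hypothetical Python framing (all names mine again), the two override variants differ only in which side wins a mismatch:

    def use_record_if_different(recording, transition):
        # Compute each step, but let the record prevail over any mismatch.
        config = recording[0]
        for recorded_next in recording[1:]:
            computed_next = transition(config)
            if computed_next != recorded_next:
                config = recorded_next   # the record wins
            else:
                config = computed_next
        return config

    def machine_q(recording, transition):
        # Load from the record, but let the computed step prevail over any mismatch.
        config = recording[0]
        for recorded_next in recording[1:]:
            computed_next = transition(config)
            if computed_next != recorded_next:
                config = computed_next   # the computation wins
            else:
                config = recorded_next
        return config

So long as record and computation never disagree, the mismatch branch is never taken in either function, and the two execute exactly the same sequence of operations; the entire difference lives in the branch that never runs.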
Since the intermediate step is never actually used, isn't that the same as discarding it? No? Yes? Why?
When we say that the Turing machine enters state X because it encountered a 1, we mean that it would have entered state Y if it had encountered a 0. If a 0 would also have led to state X, we would instead say that it moved left because of the 1, because a 0 would have made it move right. If both a 0 and a 1 lead to the same state and the same move, we say the machine is insensitive to the data.
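As a concrete illustration, a single row of the transition table might look like this (the state names and the dictionary layout are mine, purely for illustration; each entry maps the symbol read to a pair of next state and head movement):

    sensitive   = {1: ('X', 'L'), 0: ('Y', 'R')}  # the next state depends on the symbol
    move_only   = {1: ('X', 'L'), 0: ('X', 'R')}  # same state either way; only the move depends on it
    insensitive = {1: ('X', 'L'), 0: ('X', 'L')}  # nothing depends on the symbol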
Let's consider a Turing machine in which the state transition diagram is a series of double boxes; each 1-box and 0-box contains a little scrap of paper instructing the Turing machine what to do next. There's a Chinese restaurant owner who runs around following the instructions inside the boxes, moving counters on an infinitely long strip, keeping track of which double box is the current state, and generally implementing the Turing machine.
There's also a demon scurrying around who occasionally takes out one of the scraps of paper, temporarily replaces it with a duplicate of the scrap from the other box (of the same double-box pair), and then switches it back after a random time. Just for fun, we'll say the demon is quantum-random.
If the demon ever tampers with a box at the time the CRO (Chinese restaurant owner) is looking in it, and the instructions are different, then obviously the Turing machine will malfunction. And if the demon deliberately watches the CRO and only replaces boxes that the CRO never uses, it seems intuitively obvious that the Turing machine is instantiated. But what if, through pure quantum randomness, the demon happens to replace only unused boxes while the CRO is reading the other one? Has the causal connection been broken or not?
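To see how flat that question is at the level of operations actually performed, here is a toy simulation of the double-box machine with a randomly tampering demon. The data layout and every name are my own guesses at a formalization, nothing more: boxes maps each state to a two-element list of slips, where boxes[state][symbol] holds (next state, symbol to write, direction to move), tape is a dict from head positions to symbols, and the demon's random dwell time is collapsed to a single step for brevity.

    import random

    def run_with_demon(boxes, tape, state, head, steps, rng=None):
        rng = rng or random.Random()
        tampered_while_read = False
        for _ in range(steps):
            # The demon picks a random box and temporarily overwrites its slip
            # with a duplicate of the slip from the other box of the same pair.
            d_state = rng.choice(list(boxes))
            d_symbol = rng.randint(0, 1)
            saved_slip = boxes[d_state][d_symbol]
            boxes[d_state][d_symbol] = boxes[d_state][1 - d_symbol]

            # Meanwhile the CRO reads the slip for the symbol under the head.
            symbol = tape.get(head, 0)
            if (d_state, d_symbol) == (state, symbol):
                tampered_while_read = True      # the CRO read a swapped slip
                                                # (a malfunction if the two slips differ)
            next_state, write, move = boxes[state][symbol]

            boxes[d_state][d_symbol] = saved_slip   # the demon switches it back
            tape[head] = write
            head += 1 if move == 'R' else -1
            state = next_state
        return state, tape, tampered_while_read

Whether tampered_while_read ends up False because the demon deliberately watched the CRO, or because the random draws just happened to miss, is invisible in the sequence of operations the CRO actually performed; only the counterfactual structure distinguishes the two runs.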
It seems to me that a theory of instantiation would have to dispense with such counterfactuals entirely and rely on pure continuity of isolated data, but I don't see any way to create such a theory and bind it to physical reality. And note in particular that such a theory looks like it would make machine Q unconscious, which seems unintuitive in the extreme.
--
sentience@pobox.com       Eliezer S. Yudkowsky
http://pobox.com/~sentience/AI_design.temp.html
http://pobox.com/~sentience/singul_arity.html
Disclaimer: Unless otherwise specified, I'm not telling you everything I think I know.