Harvey Newstrom, <newstrom@newstaffinc.com>, writes:
> The replay is not conscious. To get the replay to work, the Turing Test
> administrator has to do the exact same test. If the question is delayed a
> second, the "brain" will answer a question that was not asked. If the
> speaker to the "brain" burned out, it will answer without hearing the
> question. If the questions are not identical, the wrong answers will be
> given. The really-conscious brain would correctly react to these
> situations. The fake-conscious brain will fail to react.
Are you saying that if you allow a brain to run normally, it is conscious, *except* that if it happens to be given the same inputs it had on an earlier run, then it is not?
Or would you say that a brain is conscious any time it is functioning normally and processing information, even if it happens to be repeating an earlier run? If so, go back and read Emlyn's and Eliezer's messages and see what you think of them.
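To pin down what I mean by a replay: I take you to be describing something like a tape that plays back recorded answers and never looks at the question actually being asked, as opposed to a brain re-running the same computation on the same inputs. A rough sketch, purely my own illustration (the names and code are mine, not yours):

    # Illustrative only: contrast a "replay" (a tape that ignores the
    # current question) with a live responder that computes each answer
    # from the question it actually receives.

    class LiveBrain:
        def answer(self, question):
            # Stand-in for whatever computation a real brain performs.
            return "considered reply to: " + question

    class Replay:
        def __init__(self, recorded_answers):
            self._tape = list(recorded_answers)
            self._pos = 0

        def answer(self, question):
            # Note: 'question' is never consulted; the tape just advances.
            reply = self._tape[self._pos]
            self._pos += 1
            return reply

    original_questions = ["What is 2 + 2?", "Do you dream?"]
    live = LiveBrain()
    tape = [live.answer(q) for q in original_questions]

    # Identical questions in identical order: the replay is indistinguishable.
    replay = Replay(tape)
    print([replay.answer(q) for q in original_questions])

    # Perturb the first question and the replay answers the question that
    # was asked last time, not the one asked now (your failure case).
    perturbed_questions = ["What is 3 + 3?", "Do you dream?"]
    replay = Replay(tape)
    print([replay.answer(q) for q in perturbed_questions])

My question above is about the other case: a brain that really does the computation, but whose inputs happen to match an earlier run.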
> I would say that the second Turing Test is invalid, because the tester must
> cooperate to make the test work. The whole point of the Turing Test is that
> the tester cannot tell that the subject is not human. In the replay
> example, the tester not only can tell, but must carefully craft the
> questions to make the subject appear to answer. Any failure on the part of
> the tester will make the fake brain fail, but the same failure on the part
> of the tester will not make a real brain fail.
Hal