From: hal@rain.org
Date: Mon Apr 05 1999 - 09:51:57 MDT
Darin Sunley, <umsunley@cc.umanitoba.ca>, writes:
> It seems to me that this whole debate becomes a little clearer when
> stated in terms of the ontological levels of the agents involved.
This is an interesting way to look at it, but the specific analysis
you present doesn't address the examples we have been discussing.
> Isn't the whole idea of a Turing test that it be done between agents on
> the same ontological level? When we program a computer to attempt a
> Turing test we are equipping it with sensors and knowledge about our
> ontological level, and the ability to communicate to our ontological
> level. We are, in short, attempting to keep the computer's processing in
> the same ontological level as the hardware, instead of doing processing
> in a domain one ontological level down.
I don't think it would necessarily have to be done that way.
Theoretically one simulated being could question another at the same
ontological level. Suppose we have a thriving AI community, generally
accepted as being conscious, and a new design of AI is created.
Suppose they all run a million times faster than our-level humans,
so that a Turing test by one of us against one of them is impractical.
The other AIs could question the new one, and all are at the same level.
> Consciousness is a label assigned by one agent to another.
I would say, rather, that consciousness is an inherent property of some
class of agents. Some agents may believe that others are conscious or
not, but they may be mistaken.
> Postulate an ontological level containing two agents, each of whom
> believe the other to be conscious. Let them make a recording of their
> interactions, to the greatest level of detail their environment allows,
> to their analog of the Heisenberg limit. Let one of them program a
> simulation, containing two other agents. Neither of the first two agents
> believes that either of the agents in the simulation is conscious.
Your intention is that the simulation replays the interaction between
the original two agents? This is different from what we have been
considering, because the recording in your example is effectively
one level down from the original interaction. From the point of view
of the agents, the original interaction occurred in "the real world,"
but the playback takes place in some kind of simulation.
What we were discussing (as I understood it) was the case where the
recording was of an interaction one level down from us. We would then
play back the recording in some kind of machine, also one level down
from us. So there is no difference in the levels between the original
interaction and the playback.
You have introduced an extra variable: not only must we distinguish
between recording and playback, but we must also contend with
issues introduced by having the playback be at a lower level than the
recording. This complicates the problem and will make it more difficult
to identify the fundamental issues.
> From OUR point of view, one ontological level up from these agents,
> neither seems conscious. Both are, from our point of view, completely
> deterministic, and we see no meaningful distinction between them and
> their recording.
Why would we see these agents as unconscious? Just because they are
deterministic? Where did that come from? There's no reason I can see
that determinism rules out consciousness. Some might say that it rules
out free will, but even if they don't have free will they could still
be conscious, couldn't they?
> From THEIR point of view, however, the recording seems dramatically less
> conscious than they are. Both agents in the original level would pass
> Turing tests administered by the other. Neither of the agents in the
> recording would pass a Turing test administered by agents from the
> original level.
Not everyone would agree that the recording is less conscious than they
are, just because it is one level down. That is the issue we are trying
to understand better. Some people might argue that recordings are just
as conscious as the original, no matter what level they are instantiated
at. Not everyone would hew to the strict Turing test you are using here.
Hal