From: Eugene Leitl (eugene.leitl@lrz.uni-muenchen.de)
Date: Fri May 05 2000 - 18:01:16 MDT
White, Ryan writes:
> Assertion 1: I disagree.
Sure.
> Assumption 1: Presently, we are all carbon-based 'persons', that take up
> space.
That was clearly not my assumption. I was talking about strictly
deterministic (pseudorandom, not random) systems in the same state
with strictly identical inputs. Nondeterministically noisy nonlinear
dynamical systems (strongly divergent state space Hamiltonian) are
absolutely not covered by that.
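A toy illustration of the distinction (the logistic map and the noise
level are my own choice, purely for demonstration): two copies of a
strictly deterministic iteration started from the same state stay
bit-identical forever, while injecting even tiny noise into one copy
makes the trajectories fly apart after a few dozen steps:

    import random

    def logistic(x):
        # Strictly deterministic chaotic map.
        return 3.9 * x * (1.0 - x)

    a = b = c = 0.123456789
    rng = random.Random(0)
    for _ in range(100):
        a = logistic(a)
        b = logistic(b)                               # same rule, same state: stays identical
        c = logistic(c) + rng.uniform(-1e-12, 1e-12)  # "noisy" copy: diverges fast
    print(a == b)      # True -- deterministic clones never diverge
    print(abs(a - c))  # the 1e-12 perturbation has been amplified by many orders of magnitude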
> Assertion 2: It is therefore relevant to consider copies of both
> matter-based 'persons' and software-based 'persons'.
Copies of matter-based persons are clearly not clones but individuals,
and hence irrelevant in this discussion's context. Only computational
models are covered in that particular case. Since the situation is highly
contrived/pathological (synched computational models with identical
inputs), it is quite irrelevant to reality (unless somebody wants to
demo something, that is).
I did not realize that needed to be said, but apparently it wasn't as
obvious as I thought.
> Assertion 3: I don't believe the matter can be addressed satisfactorily
> without introducing definitions of 'self' or 'person'.
>
> Definitions: (IMO), 'selves' and 'people' are processes. They are
> carbon-based (so far) functions. These functions are in a constant state of
Rather, they are based on naturally evolved nanotechnology (self-rep
capable systems built from linear solvated biopolymers and
self-assembled biomolecular assemblies).
A 'process' is usually a narrowly defined CS term. Physical processes
are a superset of that (i.e. a finite state machine falling along a
sequence of states is a CS process; a computer running the above is a
physical system engaged in a process of pushing electrons around,
dissipating heat, producing a few photons, transporting cooling fluid
and generating noise).
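To make the narrow CS sense concrete (the state and input names here
are arbitrary, just a sketch): a process in that sense is nothing more
than a machine falling through a well-defined sequence of states.

    # A trivial finite state machine: the CS notion of a "process" is just
    # this falling-through-states, with nothing said about electrons or heat.
    TRANSITIONS = {
        ("idle", "start"): "running",
        ("running", "tick"): "running",
        ("running", "stop"): "halted",
    }

    def run(state, inputs):
        for symbol in inputs:
            state = TRANSITIONS[(state, symbol)]
        return state

    print(run("idle", ["start", "tick", "tick", "stop"]))  # -> "halted"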
> change, on both 'hardware' and 'software' levels. As processes, 'people'
The distinction between hardware and software is not meaningful when
talking about a biological organism.
> are not only defined by their 'present state' (arrangement of matter, mode
> of thought), but also by their 'present external conditions', since a
> significant aspect of their function is to react to their environment. I
Why don't we just describe the organism as a physical system with a
state, and its interaction with environment as I/O?
> consider these 'conditions' to be: the variables (whether they be local or
> remote, accessible to that 'self's' awareness or not) that affect the
> 'person', at that instant in time. Of possible interest is the implication
> that a 'previous self' (the 'person' you were a moment before) both was the
> 'process' and is a 'present external condition'. Why 'external'? Because
> that instant in time is 'external' to the present instant in time. All
> 'previous selves' are thus, in that sense, external to the 'present self'.
Maybe I'm just dense, but you're kinda difficult to understand.
> Justification for Assertion 1: Since a 'copy' of a carbon-based 'person',
> even if physically perfect, cannot occupy the same space as the 'original';
> from the exact moment of bifurcation, the 'copy' and the 'original' are
> subject to differing 'external conditions'. As reactions to the 'external
> conditions' (whether by strict determinism, or by a loose framework of
> patterns based on random events), since the 'conditions' inevitably differ
> thereafter, the 'functions' will diverge in similarity from that time
> forward. Even if both the original and the copy occupy the same room, for
> example, since they cannot occupy the same space, they will each see a
> different perspective of 'the room' around them. Since the input is
> different for each, 'the room' will be processed differently for each,
> potentially invoking different thoughts and different (re)actions. Since
> these (different) thoughts and actions change and further define each
> 'person', and since each 'person' is also a function of it's 'previous
> self', it is safe to assume that they will grow increasingly different as
> time extends away from the moment of bifurcation.
Yes, all very true (in fact internal noise and chaotic dynamics will
make you diverge very soon anyway), but this is not the point of the
original discussion. You can't clone macroscopic physical systems in
exactly the same quantum state, and even if you could, you would not
be able to provide them with exactly the same input, even ignoring QM
randomness.
> Assertion 3: Things that become different are not the same. ;)
Duh. Sure.
> Questions: While considering the result of copying a 'self' that is not
> based in organic matter (a software-based person), I observe that you make
> the following assumption; the process of copying places both resulting
> 'copies' in exactly similar 'external conditions'. Is this necessarily the
> case? Is this even possible? Can two virtual persons occupy the same
> 'virtual space' at the same time?
Hey, we're talking about a computational system. We set them both to
be in the same state at the beginning of the simulation. The virtual
space thing is strictly irrelevant, as is the physical location of the
hardware boxes (unless you have to make relativistic measurements as
to the state of synch of both systems). Same state, same input, keep
clock ticks/iterations synched, that's it.
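A sketch of what I mean by keeping them synched (the toy state update
and names are mine, only to show the protocol): hold both instances to
the same iteration count, feed them the same input stream, and their
states are indistinguishable at every tick.

    import hashlib, pickle

    def advance(state, inp):
        # Some deterministic state update; what it computes is irrelevant here.
        return [(s * 31 + inp) % 104729 for s in state]

    def digest(state):
        # Compare full states cheaply via a hash.
        return hashlib.sha256(pickle.dumps(state)).hexdigest()

    clone_a = [1, 2, 3, 4]                      # identical initial state...
    clone_b = list(clone_a)                     # ...for both instances
    input_stream = [7, 0, 42, 13, 99]           # identical inputs
    for tick, inp in enumerate(input_stream):   # clocks/iterations kept in step
        clone_a = advance(clone_a, inp)
        clone_b = advance(clone_b, inp)
        assert digest(clone_a) == digest(clone_b), f"diverged at tick {tick}"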
> Justification for Assertion 1 (virtual context): The suggestion that it is
> knowable whether the 'original' was even copied in the first place (i.e. I
> copied that 'person', there are two now, and that redundancy is going to
> save my ass) implies that there is some context in which the two are
> differentiable. I assert that that contextual difference, whatever it may
Err, what? Of course it is knowable: I have to have access to the full
state of an object during the copy, and this implies that during the
copy the object state is frozen/suspended, i.e. the simulation clock is
not ticking. I also need a communication channel between these boxes to
be open, and have to make corrections for the synching latency between
two physical systems in areas of differing spacetime curvature. The
latter point is quite irrelevant in practice when we're talking about
two current-generation computers in physical proximity to each other.
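As a sketch of the copy operation itself (toy code, with hypothetical
names): the simulation clock stops, the full state is snapshotted and
shipped over, and only then do both instances resume ticking from the
identical state.

    import pickle

    def copy_person(sim):
        # 1. Freeze: the simulation clock stops ticking during the copy.
        sim["running"] = False
        # 2. Snapshot the *full* state; pickling stands in for the channel
        #    between the two boxes (latency corrections omitted).
        snapshot = pickle.dumps(sim["state"])
        # 3. Instantiate the clone from the snapshot.
        clone = {"running": False, "state": pickle.loads(snapshot)}
        # 4. Resume both; from here on, identical inputs keep them in lockstep.
        sim["running"] = clone["running"] = True
        return clone

    original = {"running": True, "state": {"memories": [1, 2, 3], "mood": "fine"}}
    duplicate = copy_person(original)
    assert duplicate["state"] == original["state"]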
> be, represents a difference in 'conditions', and will therefore potentially
> generate differences among the 'persons'. If the virtual 'copy' is in NO
> way differentiable from the 'original', such as if, as you write, they
> occupy exactly "overlapping state space trajectories", then under what
> conditions could you possibly kill one and not all of them?
You can kill all of them but one without actually destroying
information, provided you ignore the potential-future-bifurcation
argument. Keeping the clones in the same state is an effort,
particularly on noisy, fast hardware in remote locations. If you look
closely enough at the physical layer, it wants to make them
bifurcate/start diverging.
If I were a perfectly synched clone, I'd object to other instances of
me being terminated without my consent, since this would rob them of
an opportunity to diverge. In fact, I would probably tell the
experimenter (the callous asshole) that I want other copies of me to
become individuals, and voice my indignation about being subjected to
such an experiment without my consent.
> Ryan v23.9