Eugene Leitl writes:
> Heck, you are your clone. The clone is you. The act of knowing it
> changes nothing. You can't tell the clone whether it is going to be
> terminated or not, because the act of telling would cause a
> bifurcation. So you don't know whether you or not-you is going to
> meet the /dev/null in the sky. Even better, you don't have a 50%
> chance of dying, because no one dies as long as one copy of the
> clone crowd is still present. There is no difference between you
> both. Both state space trajectories overlap perfectly. If one clone
> is terminated, the trajectory still continues.
>
> The whole discussion is about as meaningless as whether it is better
> to be dead or alive. The comparison is invalid, because death is a
> non-state. If there's no you, you can't decide.
>
> Monkey business. Playing with words. Getting trapped in mental
> states. A new iteration of the Achilles/tortoise pseudo-dilemma.
Assertion 1: I disagree.
Assumption 1: Presently, we are all carbon-based 'persons' that take up
space.
Assertion 2: It is therefore relevant to consider copies of both
matter-based 'persons' and software-based 'persons'.
Assertion 3: I don't believe the matter can be addressed satisfactorily
without introducing definitions of 'self' or 'person'.
Definitions (IMO): 'Selves' and 'people' are processes. They are
carbon-based (so far) functions. These functions are in a constant state of
change, on both the 'hardware' and 'software' levels. As processes, 'people'
are defined not only by their 'present state' (arrangement of matter, mode
of thought), but also by their 'present external conditions', since a
significant aspect of their function is to react to their environment. I
take these 'conditions' to be the variables (whether local or remote,
accessible to that 'self's' awareness or not) that affect the 'person' at
that instant in time. Of possible interest is the implication that a
'previous self' (the 'person' you were a moment before) both was the
'process' and is a 'present external condition'. Why 'external'? Because
that instant in time is 'external' to the present instant. All 'previous
selves' are thus, in that sense, external to the 'present self'.
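
To make that definition concrete, here's a toy Python sketch of what I
mean (all the names are mine and purely illustrative, not a claim about
real minds; 'state' here is just accumulated history):

    # A 'person' as a process: at each instant, the next 'self' is a
    # function of the previous 'self' plus the external conditions.
    # Toy model only -- state is simply the accumulated input history.

    def step(previous_self, conditions):
        """React to the environment, producing the next 'self'."""
        return previous_self + [conditions]

    def live(initial_self, condition_stream):
        """Run the process forward through a stream of conditions."""
        self_now = list(initial_self)
        for conditions in condition_stream:
            self_now = step(self_now, conditions)
        return self_now

    # e.g. live(['born'], ['saw X', 'heard Y'])
    #      -> ['born', 'saw X', 'heard Y']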
Justification for Assertion 1: Since a 'copy' of a carbon-based 'person',
even if physically perfect, cannot occupy the same space as the 'original',
from the exact moment of bifurcation the 'copy' and the 'original' are
subject to differing 'external conditions'. Since the 'functions' are
reactions to those 'external conditions' (whether by strict determinism or
by a loose framework of patterns based on random events), and since the
'conditions' inevitably differ thereafter, the 'functions' will diverge
from that time forward. Even if the original and the copy occupy the same
room, for example, they cannot occupy the same space, so each will see a
different perspective of 'the room' around them. Since the input is
different for each, 'the room' will be processed differently by each,
potentially invoking different thoughts and different (re)actions. Since
these (different) thoughts and actions change and further define each
'person', and since each 'person' is also a function of its 'previous
self', it is safe to assume that they will grow increasingly different as
time extends away from the moment of bifurcation.
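
Here's the divergence argument as a runnable toy, in the spirit of the
sketch above (again, the names and details are mine): start two
bit-identical states and feed each a slightly different view of 'the
room'. Because each next state folds in the previous one, the difference
compounds.

    import hashlib

    def react(previous_self, condition):
        # The next 'self' is a function of the previous 'self' + input.
        return hashlib.sha256(
            (previous_self + condition).encode()).hexdigest()

    # Bit-identical at the moment of bifurcation...
    state_a = state_b = "exact state at the moment of bifurcation"

    for t in range(3):
        # ...but they cannot occupy the same space, so the input differs.
        state_a = react(state_a, f"the room from spot A at t={t}")
        state_b = react(state_b, f"the room from spot B at t={t}")

    print(state_a == state_b)   # False: the trajectories have diverged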
Assertion 4: Things that become different are not the same. ;)
Questions: While considering the result of copying a 'self' that is not
based in organic matter (a software-based person), I observe that you make
the following assumption: the process of copying places both resulting
'copies' in identical 'external conditions'. Is this necessarily the case?
Is it even possible? Can two virtual persons occupy the same 'virtual
space' at the same time?
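
For what it's worth, software suggests one answer: two 'names' can alias
one and the same object, which is about as close to sharing a 'virtual
space' as it gets. A toy Python illustration (mine, purely for argument's
sake):

    # Two 'virtual persons' in the same 'virtual space'? In software,
    # two names can refer to one object -- in which case there is no
    # second person at all, only a second name.
    original = {"state": "one trajectory"}
    copy = original            # no new object; just another name

    print(copy is original)    # True: one object, two names
    copy["state"] = "changed"
    print(original["state"])   # 'changed' -- there was only ever one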
Justification for Assertion 1 (virtual context): The suggestion that it is
knowable whether the 'original' was even copied in the first place (i.e., I
copied that 'person', there are two now, and that redundancy is going to
save my ass) implies that there is some context in which the two are
differentiable. I assert that that contextual difference, whatever it may
be, represents a difference in 'conditions', and will therefore potentially
generate differences between the 'persons'. If the virtual 'copy' is in NO
way differentiable from the 'original', such that, as you write, their
state space trajectories "overlap perfectly", then under what conditions
could you possibly kill one and not all of them?
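
To sharpen that last question, one more toy sketch (mine, illustrative
only): if nothing differentiates the copies, the 'clone crowd' reduces to
a count keyed by the one shared state, and there is no handle by which to
kill 'one and not the other'.

    from collections import Counter

    # If the copies are in NO way differentiable, the only handle you
    # have is the shared state itself: the 'crowd' is a count, not a
    # list of individuals.
    crowd = Counter()
    state = "one perfectly overlapping state-space trajectory"
    crowd[state] += 1   # the 'original'
    crowd[state] += 1   # the 'copy' -- same key; nothing tells them apart

    # 'Terminating one' can only mean decrementing the count...
    crowd[state] -= 1
    print(crowd[state])  # 1 -- and the trajectory still continues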
Ryan v23.9