Re: The nature of obligation

From: Dan Fabulich (dfabulich@warpmail.net)
Date: Wed Oct 30 2002 - 04:04:10 MST


Lee Daniel Crocker wrote:

> The interesting question, then, is how do we, the other sentients,
> handle cases where we have contracted with the not-yet-duplicated me for
> some future plans?

I argue that the answer is that we should treat all the
copies/dupes/forks/etc. as if they acted as a single moral entity. I
don't want to base my argument on the claim that they all *are* a single
entity, actual or otherwise; rather, I just want to show that we should
treat them as if they were.

Of course, if you think that all of the entities *are* a single
individual, then you'll have no trouble believing that the individual
should be treated as a single moral entity. My argument is trivial in
that special case; it's a harder sell if you're in the group that doubts
that multiple copies constitute a single individual. It is therefore
towards this latter group that I'll direct my argument.

If, like some here, you think that there's nothing more interesting about
the relationship between you and a copy than there is about the
relationship between you and your identical genetic twin, then you'll see
my argument as something akin to the claim that we should treat these
special "families" of individuals the way many tribes treat other tribes:
the sin of an individual foreign tribe-member is the sin of his entire
tribe, and vice versa.

Now, we don't normally like such arguments in the modern day. Nowadays we
are mostly opposed to punishing an entire tribe (an entire country?) for
the crimes of one of its members. So why should we take this argument any
more seriously in the case of copies?

The answer, I think, lies in the fact that people could *choose* to make
copies of themselves. To the extent that this is possible, we wouldn't
want to let people get away with forking off a copy to do something that
they don't want to be held accountable for, then reaping the benefits of
their forked copy's actions without paying any penalty. The reason for
this has nothing per se to do with personal accountability, but
everything to do with incentives and consequences: if people knew they
could make copies and use their copies to get away with Bad Things [tm],
then we should expect people to do this more often than they otherwise
would.

Consider the worst-case scenario in a system in which copies are not
responsible for each other's actions: suppose my fork and I walk into a
room with a victim; one of us murders the victim brutally, leaving no
witnesses. Suppose we're brought to trial, one by one, and our attorney
argues: "Look, you can't punish my client, because in each case there's a
reasonable doubt [50% is plenty reasonable!] that he did not commit the
crime." In that case, supposing that I were the sort of person who would
kill somebody if I thought I would suffer no consequences, I'd be
incentivized to bring about exactly this situation: I'd benefit greatly
from creating copies to serve as plausible alibis, thereby preventing
either copy from being punishable under the law.
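
To make the incentive arithmetic concrete, here's a minimal sketch in
Python. The gain, the penalty, and the conviction threshold are my own
illustrative assumptions, not anything established above:

    # Hypothetical expected-penalty sketch for the fork-and-alibi scenario.
    # All numbers here are illustrative assumptions.

    def expected_penalty(num_copies, penalty, joint_liability,
                         conviction_threshold=0.9):
        """Total penalty the forking person can expect to bear."""
        if joint_liability:
            # Copies are treated as one moral entity: the act is certainly
            # attributable to that entity, so the penalty is paid in full.
            return penalty
        # Otherwise each copy is tried separately.  The probability that any
        # particular copy is the one who did it is 1/num_copies, which drops
        # below "beyond a reasonable doubt" as soon as there are two copies,
        # so no individual copy can be convicted.
        p_guilt_per_copy = 1.0 / num_copies
        return penalty if p_guilt_per_copy >= conviction_threshold else 0.0

    gain = 100.0     # assumed benefit of the Bad Thing [tm]
    penalty = 500.0  # assumed penalty if convicted

    for copies in (1, 2, 5):
        for joint in (False, True):
            net = gain - expected_penalty(copies, penalty, joint)
            print(f"copies={copies} joint liability={joint} "
                  f"net payoff={net:+.1f}")

With a single copy the crime doesn't pay; with two or more copies and no
joint liability, it does, which is exactly the incentive the attorney's
argument creates.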

This is an obvious radical failure of legacy legal guidelines to handle a
case brought about by new technology... maybe it's so obvious that it
doesn't need mentioning. But the point stands that something similar
*would* happen under *any* system in which copies aren't responsible
for each other's behavior, even if people considering whether or not to
fork *knew* that one of their tines would wind up paying the price for
the crime, unlike my disaster scenario in which neither branch pays the
price. I say: some people wouldn't care; so long as they mostly get away
with it, they'd do it in a heartbeat.

Now, you may think that it would be a *big mistake* to split, allowing
just one of your splits to suffer the painful consequences of his actions
while the others benefit, but even so, it's a "mistake" that you would
expect a rather large number of people to make. There's also the
not-to-be-overlooked point that the system itself would *incentivize*
making this "mistake", so long as copies were not held accountable for
each other's actions.
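
As a rough illustration of that incentive (again with assumed numbers of
my own), even when exactly one tine is certain to pay the full penalty,
the pre-fork decision looks better and better as the number of tines
grows, unless all tines are held liable:

    # Hypothetical payoff sketch: every tine keeps the gain, but only one
    # tine pays the penalty unless copies are jointly liable.  The numbers
    # are illustrative assumptions.

    def total_net_payoff(num_tines, gain_per_tine, penalty, joint_liability):
        total_gain = num_tines * gain_per_tine
        punished = num_tines if joint_liability else 1
        return total_gain - penalty * punished

    gain, penalty = 100.0, 150.0
    for tines in (1, 2, 4, 8):
        solo = total_net_payoff(tines, gain, penalty, joint_liability=False)
        joint = total_net_payoff(tines, gain, penalty, joint_liability=True)
        print(f"tines={tines}: net if one tine pays {solo:+.1f}, "
              f"net if all pay {joint:+.1f}")

Under the one-tine-pays rule the total payoff turns positive as soon as
there are two tines; under joint liability it stays negative no matter
how many copies you make.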

So, if a moral/legal rule that says we shouldn't hold
originals/dupes/etc. responsible for the actions of their fellows gives
us a system which incentivizes criminal behavior, then we shouldn't have
that moral/legal rule at all. Instead, we ought to close off that
incentive by making copies responsible for each other's actions,
regardless of their metaphysical/ontological relationship.

-Dan

      -unless you love someone-
    -nothing else makes any sense-
           e.e. cummings


