From: Bostrom,N (pg) (N.Bostrom@lse.ac.uk)
Date: Tue Feb 22 2000 - 17:21:19 MST
This should have gone to the list a couple of days back, but it only went to
Anders (courtesy of the mischievous reply-to-sender button). It will be
interesting to think more systematically about these problems when I get
time; they seem quite complex, as several people have remarked.
(BTW, we did the studio debate today for the BBC programme on the ethics and
feasibility of life-extension. Don't know exactly when it's going to be
aired yet.)
Anders writes:
>I assumed the future mind would be me minus the awareness of the
>experiences being simulation. I don't know if that really can be viewed
>as a different person than me, after all the difference is just a lack
>of knowledge. Do I have the right of putting myself through torture?
>Obviously yes.
As long as the decision-maker and the pain-experiencer are "the same", there
does not seem to be a moral problem.
But same in what sense? Same as in "same consciousness"? That doesn't seem
to work, because we want to say that my present time-segment can morally
choose to give pain to my time-segment of tomorrow, although my present
segment cannot feel the pain. Or maybe we assume that there is no moral
problem in this case only because we have evolved to care about our future
segments, so the problem doesn't arise much in the real world. So for
practical purposes, the present time segment can be the coercing guardian of
your future time segments. But I wonder how this might change if we
construct "perverse" beings who don't have this degree of care for their
future selves. If a present segment doesn't care about a future segment, do
we still want to say that they both belong to the same person, provided only
that they have a lot of memories in common? Do we want to give coercive
rights to the earlier segment over the later segment simply because it
happens to be at an earlier temporal position? This is not at all obvious,
and well worth thinking about further.
Imagine the political campaigns... "One segment, one vote!", "End temporal
discrimination now!"
>Perhaps. Remember, this isn't my own idea, I was just repeating it from a
>friend (Andreas Svedin). But I think he has a point with his thought
>experiment, not just because it involves interesting ethical questions.
>Adding new knowledge to a mind isn't just a question of downloading it,
>it has to be integrated into the network of connections that forms our
>worldview, our personality, our selves.
Sure, in general I agree with that. But when it comes to "knowledge of
pain", well, it doesn't seem to contain a whole lot of information content
(the pain, as opposed to the various sensations that might accompany it).
Rather, I suggest, the people who talk about being wiser after having
experienced a lot of pain are referring to emotional changes that have
taken place, not so much to bits of information that have been added to their
network. And it might be much more feasible to bring about these emotional
changes directly (maybe through chemicals or gene therapy) than it would be
to download "knowledge of thermodynamics", for instance.
Nick Bostrom
Dept. Philosophy, Logic and Scientific Method
London School of Economics
Email: n.bostrom@lse.ac.uk
Homepage: http://www.analytic.org