From: Anders Sandberg (asa@nada.kth.se)
Date: Mon Feb 21 2000 - 10:17:01 MST
"Bostrom,N (pg)" <N.Bostrom@lse.ac.uk> writes:
> Regarding what one can morally do to a future segment of oneself, that is a
> trickier issue. You have to specify in what sense this future mind would be
> "you". In the case you describe, with this mind having no memory of who the
> present self is, and no control over what decisions the other direct
> continuations of your present self are making, it seems hard to see in what
> sense this mind is you. In this case, it seems you have no more right to
> coerce this other mind than you would have to coerce a genetic clone of
> yourself.
In this scenario I assumed the future mind would be me minus the
awareness that the experiences are a simulation. I don't know if that
can really be viewed as a different person from me; after all, the
difference is just a lack of knowledge. Do I have the right to put
myself through torture? Obviously yes. Do I have the right to
temporarily erase my knowledge of French Classicist drama and then
subject myself to torture? That seems very much the same as
before. What if I erase my knowledge that the torture is simulated
and bounded in time, and then subject myself to it? What is the
difference here?
> > Maybe this kind of extreme experience cognitive engineering is going
> > to be a popular thing among the posthumans? A kind of dare, passage
> > rite/education or just entertainment
>
> A silly rite of passage that would be. Why not do something difficult but
> pleasant instead? And if it is education we are after, couldn't we get
> that information in a less painful way, say by downloading the memory of the
> pain, or just learning about pain in general? Presumably, you think that the
> pain might somehow "ennoble" us emotionally; but why not find a way of
> changing our emotional state into that "noble" state - of humility and
> resignation or whatever - directly, rather than go through the inefficient
> and horrendous process of getting to that state through a long series
> of painful experiences? That would be a lot more transhumanist in my book!
Perhaps. Remember, this isn't my own idea; I was just repeating it
from a friend (Andreas Svedin). But I think he has a point with his
thought experiment, not just because it involves interesting ethical
questions. Adding new knowledge to a mind isn't just a matter of
downloading it; it has to be integrated into the network of
connections that forms our worldview, our personality, our selves
(sorry, no clean RPM modules :-). Just adding memories isn't possible
unless we adopt an extremely different mental architecture. This means
that experiences, real or simulated, will be important for changing us
in desired directions. These experiences can involve not just
traditional sensory information but also direct neural modifications;
either way, they have to integrate the new information into our
networks.
The way we achieve the end state is not important; I agree completely
with you in that respect. The methods we use can be more or less
crude, and are limited by available resources and technology. Assuming
this pain experience has a worthwhile result (which remains to be
seen), different methods have to be judged by their efficiency and
perhaps personal aesthetics. I don't see any reason to call one method
more transhuman than another.
-- 
-----------------------------------------------------------------------
Anders Sandberg                                      Towards Ascension!
asa@nada.kth.se                            http://www.nada.kth.se/~asa/
GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y