From: Anders Sandberg (asa@nada.kth.se)
Date: Sat Mar 17 2001 - 04:57:31 MST
"Robert J. Bradbury" <bradbury@aeiveos.com> writes:
> In short -- what do you do in the simulation where you get to the point
> where you cannot explore any further without 'offing' someone against
> their will?
...
> Note, there may be a form of "moral" extropianism where you explore
> the phase space as fully as possible, but you do it with as little
> "pain & suffering" as possible. This gets extraordinarily tricky
> as for example would be the case where one wants to determine whether
> pain & suffering can drive people to self-enlightenment such that
> they realize pain is something that "they" have and can choose
> to experience in a variety of ways.
A quick sketch (I'm actually off to a seminar on classical Greek
philosophy, which we will apply to transhumanism, in a few minutes :-)
would be that if we just set up extropy as a core value and treat it
as something fungible, then we might fall into the utilitarian trap of
trying to maximize it globally with no concern for each individual
("If it would make most people happy, then I see nothing wrong with
turning all professional philosophers into sausages", as the Swedish
utilitarian Torbjorn Tannsjo said).
But if we accept that extropy has a somewhat diffuse definition that
involves a certain amount of subjectivity, and/or that it is tied to
each mind's experience of growth, freedom and complexity, then we get
somewhere. Maximising extropy cannot be done globally; it is up to
each being (and group of beings) to try to maximise it. When two
beings come into conflict over this (say, Robert and I both want
Jupiter for our respective M-brains), they could of course use force
to settle it. But if I did that, it would decrease extropy for Robert
(however odd I might find his version of extropy), and it would also
carry a significant risk of the conflict spreading (for example by
involving our friends, allies and PPLs), which would make less extropy
likely even for me (and others).
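Purely as an illustration of that argument (this is my own toy sketch,
with made-up numbers, not anything Robert or I have actually computed),
one can encode it as a small payoff model in Python: two agents each
choose to seize a contested resource by force or to trade, and any use
of force carries an assumed risk of escalation that costs both sides.

# Toy illustration only: arbitrary "extropy" payoffs for two agents who
# each either seize a contested resource by force or negotiate/trade.
# payoffs[(my_move, their_move)] = (my_extropy, their_extropy)
payoffs = {
    ("force", "force"): (1, 1),   # open conflict
    ("force", "trade"): (6, 2),   # I grab the resource, they lose growth
    ("trade", "force"): (2, 6),
    ("trade", "trade"): (5, 5),   # cooperation / trading the limited resource
}

escalation_risk = 0.5  # assumed chance that force drags in allies and PPLs
escalation_cost = 3    # assumed extropy lost by *both* sides if it does

def expected(my_move, their_move):
    """Expected extropy for (me, them), discounted for escalation when force is used."""
    mine, theirs = payoffs[(my_move, their_move)]
    if "force" in (my_move, their_move):
        mine -= escalation_risk * escalation_cost
        theirs -= escalation_risk * escalation_cost
    return mine, theirs

if __name__ == "__main__":
    for me in ("force", "trade"):
        for them in ("force", "trade"):
            print(me, "vs", them, "->", expected(me, them))

With these (invented) numbers, unilateral force yields me an expected
4.5 against 5 for mutual trade, and mutual force yields -0.5: coercion
only pays if escalation somehow never happens, which is exactly the
point of the argument above.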
One can develop the above argument into a more watertight ethical
system, but it seems to point toward a situation where extropianism
would not lead to a war of everybody against everybody over the last
scrap of computronium, but rather to everybody trying to work together
to find a way out of the box, or at least to trade the limited
resources.
--
-----------------------------------------------------------------------
Anders Sandberg                                      Towards Ascension!
asa@nada.kth.se                            http://www.nada.kth.se/~asa/
GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y