From: Anders Sandberg (asa@nada.kth.se)
Date: Thu May 31 2001 - 03:08:14 MDT
On Thursday 31 May 2001 10:37, Robert J. Bradbury wrote:
> This looks like it's going to be interesting...
> So Emlyn wrote:
> > I'm really curious as to why we would want to overcome a drive
> > toward self preservation? To what end?
>
> Ah, well it depends on whether you are a "true" extropian or
> just wearing the cloth. If Eliezer and/or people working
> on similar efforts are successful and a transcendent AI/SI
> develops, then it would be moderately to extremely unextropian
> to seek to preserve yourself in the face of this. If you
> look into the depths of a "god" and realize that it is
> far far far superior at evolving itself into the greatest
> complexity and self-expression possible (relative to you,
> hanging onto outmoded notions of preserving your former self,
> not slashing and burning your ineffective copies due to remnants
> of human moral beliefs, etc.) -- then you clearly have to justify
> hanging onto your matter and consuming energy in what is clearly
> a sub-optimal state from the perspective of combating entropy.
I think the problem with this claim is that extropy is not in itself about
combating entropy. That goal could be served just as well by turning off all
the stars and freezing the universe, which I think would go against most of
our visions and be exceedingly boring. Rather, extropianism seeks to increase
*value* in the universe. But value is not something you can easily measure on
a scalar scale, so you cannot apply the utilitarian approach of equating the
value of one SI to (say) a billion human values. An SI plus a human is worth
more than either the SI or the human alone, but that partial order is the
best we can do. Hence we must instead look at which systems of rights and
interactions best allow agents to increase their value, hopefully with
maximal efficiency.
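To make the partial-order point concrete, here is a minimal sketch (the two
dimensions and all the numbers are invented purely for illustration, not a
real measure of anything), treating value as Pareto dominance over
incommensurable dimensions:

```python
# A toy model of value as a partial order rather than a scalar.
# Dimensions and numbers are invented purely for illustration.

def dominates(a, b):
    """True if a is at least as good as b on every dimension
    and strictly better on at least one (Pareto dominance)."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

si            = (100, 1)    # (raw computation, diversity of perspective)
human         = (1, 100)
si_plus_human = (101, 101)

print(dominates(si_plus_human, si))     # True:  SI + human > SI alone
print(dominates(si_plus_human, human))  # True:  SI + human > human alone
print(dominates(si, human))             # False: the SI and the human
print(dominates(human, si))             # False:   are simply incomparable
```

No scalar utility function can reproduce those last two lines: the reals are
totally ordered, so any scalar assignment forces a verdict between the SI and
the human that the partial order refuses to give.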
> (Henceforth, I will refer to this as the "Extropians Dilemma".)
> [I believe that I've stated this in prior emails or presentations
> with the quote, "You must give up everything you are for what
> you might become."]
As David Zindell put it (constantly): "To live, I die".
> So the problem is far far worse than you might suspect. Not
> only must you give up your "attachment" to your resident
> consciousness in your current biological body, you must also
> give up your attachment to your consumption of matter, energy,
> and ideas that consume either of those resources. Rejection
> of these ideas immediately places you in the Bill Joy camp --
> "here we go and we shall go no further".
I think you demand a bit too much extropian orthodoxy (what a nice
contradiction in terms!) here. If I have the highest ambitions (every morning
I chant the affirmation "I am evolving closer and closer to the Omega Point"
while showering :-) then I must also give up nearly everything, as you say.
But suppose my ambitions are less dramatic, just seeking to fulfill some
needs (let's say becoming something akin to a Greek god)? I would definitely
not say the transhumanist seeking just immortality, beauty and creative power
is in the same camp as Bill Joy.
The difference lies in how we interpret "I won't do that". Bill Joy thinks it
implies "Nobody should do that", while a liberal transhumanist interpretation
would be "I won't do that, but go ahead if you want to. Just don't come back
crying to mommy when you find out that you didn't like being a sentient
internet protocol!".
> So here is the challenge -- can anyone come up with an argument
> that justifies the preservation and continued "operation" of
> oneself in the face of clear evidence that more efficient
> means (i.e. means that consume less matter & energy) are
> available to "execute" ones consciousness? Furthermore,
> can anyone present an argument that the occupation and execution
> of said means is justifiable if more efficient means to generate
> information content (extropization) are available?
I did a rough sketch above, based on the subjectivity of value. It can of
course be refined, but I don't have the time or skill at the moment. But I
think the argument has been made very eloquently in much of the liberal and
libertarian literature (where it of course merely dealt with humans): unless
everybody has adopted the same standards of what constitutes value, the best
way of ensuring that value develops (as experienced by the different agents)
is to give them maximal freedom to do so, including the very important right
to continue existing.
If everybody and everything agreed that only information production matters,
then the conversion of the universe into computronium would be a trivial
issue. But it still doesn't follow that one SI produces information as
efficiently as a large number of HIs (modulo some overhead, which can be
fixed by shared libraries and handwaving :-), especially if variety in the
information is regarded as valuable.
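A toy sketch of that variety point (all the idea spaces and sample counts
here are invented for illustration only): give the single SI and the
population of HIs the same total output budget, let each HI explore its own
niche, and count distinct ideas produced:

```python
# Toy model: if value counts *distinct* ideas, many diverse producers can
# beat one fast homogeneous producer at the same total throughput.
import random

random.seed(1)

# One SI: 100,000 outputs drawn from a single narrow idea space.
si_ideas = {random.randrange(1_000) for _ in range(100_000)}

# 10,000 HIs: 10 outputs each (also 100,000 total), but each HI
# samples its own niche of the idea space.
hi_ideas = set()
for agent in range(10_000):
    niche = agent * 100
    hi_ideas.update(random.randrange(niche, niche + 100) for _ in range(10))

print(len(si_ideas))  # ~1,000 distinct ideas (the whole narrow space)
print(len(hi_ideas))  # ~95,000 distinct ideas from the same total output
```

Whether real SIs and HIs look anything like this is of course exactly the
open question; the sketch only shows that "more computation" and "more
information" need not coincide once variety carries value.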
> You have "inalienable" rights by virtue of being born a human.
> Are those rights "inalienable" if you choose to be an extropian?
I would suggest extending these rights into "posthuman rights", that hold for
any entity that is an ethical subject.