From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Mon Jun 24 2002 - 06:20:05 MDT
Eugen Leitl wrote:
> On Mon, 24 Jun 2002, Eliezer S. Yudkowsky wrote:
>
>> I try my best. That's not what I'm asking, however. I am asking
>> *you*, specifically, one individual with one viewpoint, whether you
>> consider your present sympathy with humans to be irrational, and if so,
>> why you keep on doing it; or, if it is not irrational, why you expect
>> that you would lose this sympathy as an eventual Power. I am not
>> asking you whether you believe
>
> There is no dichotomy. I currently profit from transactions with people
> because they're mutually profitable. If one party undergoes a
> transformation completely breaking that initial symmetry, the transaction
> becomes essentially unilateral, since there is no meaningful
> reciprocation possible, and hence carries negative fitness.
Okay, so as I understand it, you do not see other sentient beings as ends in
themselves, but rather as means to an end; by cooperating with others you
increase your own benefit. You may take into account such game-theoretical
considerations as trust and reputation, but the "supergoal", if I may use
such a term, is your own survival and accumulation of personal resources
(and reproduction)? As a Power, your goals would continue to be pure
personal survival, but you would no longer have any use for other human
beings. Hm. I don't understand why you were talking about empathy earlier,
then, since you don't seem to be invoking it within your own morality.
Incidentally, we have several people on this list who claim to be
"rationally selfish" in this sense, so if this *isn't* your opinion, I'm not
trying to be ad hominem in asking.
>> that all sentient beings must think the same way, or all humanborn
>> Powers; I am asking whether *you personally* think that you are
>> presently being immoral by the correct standards of your future self,
>> or whether your future self is being immoral by the correct standards
>> of your present self. Correct from whose viewpoint? Why, yours, of
>> course. You can talk about the observer-dependence of morality all you
>> like, but I assume that you personally have a morality of some kind.
>> Is your future self being immoral? Or are you being immoral? You can't
>> both be moral under any single moral standard. So, under your single
>> moral standard as a single individual, who's wrong?
>
> You are, because you postulate a single viewpoint-invariant moral
> standard. Both me-current and me-future are acting morally (in the
> rational selfishness sense); it's the metric which has shifted smoothly.
Under rational selfishness, I don't see why you're saying that the moral
metric has shifted. Both you and your future self are concerned only with
your personal survival as an end in itself, with other actions being simply
means. Now, if you were to regard other sentients as ends in themselves
whose survival you intrinsically desired, as I thought you were saying
earlier, then the Power-your-future-self would be acting immorally under
your current standards, period, regardless of how the Power might see the
matter.
> Populations exert pressure on the decisions of an individual. Most people
> don't spend all day playing with bugs. Symmetric-transaction expectations
> when dealing with asymmetric transactions are *not* a selection-neutral
> trait. You do that long enough, you lose.
>
> I'm a bit at a loss that I need to explain this. Isn't this obvious?
Not really. Our lives may be dominated by emergent properties of Darwinian
selection, but post-Singularity I doubt that Darwinian selection will be
much of a force compared to intelligent planning. Frankly, it's not clear to
me under your model why you or any other mind would ever reproduce, or why
reproducers would outcompete resource-absorbers, or why resource-absorbers
couldn't cherish any number of sentients inside them if that was their
favorite pastime.
Your world doesn't seem to allow for any fun whatsoever post-Singularity.
Every computing cycle in existence is devoted to plotting how to get a lock
on every available resource that opens up, murder any entity that leaves an
opening, and prevent oneself from being murdered. Anyone who devotes less
than every available computing cycle quickly falls behind and is
murdered. There's no possibility of having enough thought to spare even to
cherish a few uploads inside you. Frankly, I'm not sure I see much pragmatic
difference between a post-Singularity world like this and a world of
unintelligent grey goo. I don't see any personal continuity either. What
matters is the number of computing cycles and resources available to do
meaningful things, such as (possibly) sentients having fun. Eliminating
every meaningful computing cycle in the Darwinian motivational catastrophe
you postulate doesn't strike me as much different from eliminating every
meaningful computing cycle with a supernova.
--
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence