Re: When Programs Benefit

From: Wei Dai (weidai@eskimo.com)
Date: Mon Jun 03 2002 - 12:08:14 MDT


On Sun, Jun 02, 2002 at 09:08:59AM -0700, Lee Corbin wrote:
> As people are programs too, are you saying that this is also true of
> human beings? That is, that someone who dies before achievement of his
> or her goals hasn't benefited from living at all? Surely not.

I said "may not". Some people would consider themselves as having
benefited, others would not. I don't think you should assume that everyone
would.

> This touches an important component of values, either conscious or
> unconscious, that perhaps many people have. It's an extreme form
> of the principle "the ends justify the means". We have to return
> to this issue later, and in perhaps greater generality.

Hmm, I wonder if you misunderstood me. What I meant is that if a program
paid a high subjective price (i.e., experienced various hardships)
trying to reach a goal, without knowing that it could be halted at any
time, and you halt it against its will just before it reaches that goal,
that seems pretty bad. I'm not sure what the connection to "the ends
justify the means" is.

> I agree with that ethical rule, but not for the same reason.
> If I'm living in a simulation, why wouldn't someone wish
> to inform me of the fact, as well as how much resources
> I have at my disposal?

Someone might want to simulate a society of people who don't know they're
living inside a simulation. Obviously if you tell people they're living
inside a simulation they're going to behave very differently.

> A piece of the moon, which is non-living, should be taken over
> by the nearest life that is capable of doing so.

You can't literally mean "nearest", since the moon is moving and a
different person is nearest to it every second. You must mean that
whoever is able to take it over first should do so. But how do you
draw the line between
"taking over" and "just visiting"? Suppose some astronaut visited the
moon, claimed it for himself, then returned to Earth. Should everyone else
then respect his "property right" over the moon?

> After it is
> sentient, and infinitely vaster and more advanced algorithms
> show up to take over, they should do their best to observe the
> Meta-Golden rule: to that sentient life they discover, throw a
> few crumbs of run time, so that when in turn they are overcome by
> an even more advanced life, they too won't simply be discarded.

Your Meta-Golden rule only works if everyone believes there's an
infinite hierarchy of more and more advanced life, which seems
unlikely. If there is a most advanced life, it has no incentive to
follow the Meta-Golden rule; knowing that, the second most advanced
life has no incentive to follow it either, and so on down the
hierarchy by backward induction.
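
To make the unraveling concrete, here's a minimal backward-induction
sketch in Python. This is a toy model of my own, not anything you
proposed: agents 0..n-1 are ordered by advancement, and I assume an
agent gains from sharing run time only if the agent above it would
reward cooperation.

    # Toy model: each agent shares run time only if the agent above
    # it rewards cooperation; the most advanced agent has no one above.
    def meta_golden_choices(n):
        choices = ["?"] * n
        choices[-1] = "discard"  # top agent: no incentive to share
        for i in range(n - 2, -1, -1):
            # Agent i+1 discards no matter what agent i does, so
            # sharing buys agent i nothing.
            choices[i] = ("discard" if choices[i + 1] == "discard"
                          else "share")
        return choices

    print(meta_golden_choices(5))
    # -> ['discard', 'discard', 'discard', 'discard', 'discard']

For any finite hierarchy the cooperation unravels from the top down;
the "share" branch is never taken. That's why the rule seems to need
the (unlikely) infinite hierarchy to get off the ground.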

> In the general case, as evidenced in the painful I-word thread,
> that property should never become self-owned. (Present humanity
> has culturally evolved an exception: when a child has had a few
> years of run time, it ceases in human communities to be the
> property of its parent processes, and obtains legal rights and
> citizenship---because human/primate Earth history showed that
> the cooperative bands of humans achieved progress faster with
> citizens having legal rights. But this last *explanation* is
> of course only my conjecture. All we know for sure is that
> legal rights and freedom worked for human societies, but not
> exactly why.)

I don't understand this part. Why do you think the current human cultural
norm should be the exception rather than the rule? It seems to work for
humans, so why not more generally?

BTW, if I seem to be criticising without providing alternative answers
of my own, it's because I don't know what the answers are, and I'm
hoping that you, or someone else, does.


