From: Lee Corbin (lcorbin@tsoft.com)
Date: Fri Nov 22 2002 - 23:01:34 MST
Eliezer writes
> > I hope that I'm not just quibbling, but it's possible
> > that your apparent difficulty in coming up with a good
> > example is significant.
>
> It is, in fact.
Your honesty and candor are always refreshing.
> It was pretty hard to find something of which I disapproved
> rather than something MORALLY WRONG, since I tend to regard
> mere disapproval as a free variable that I can choose to
> overwrite in the service of less flexible moral conclusions.
> This being the case, it tends to get overwritten.
Yes.
> > Your disapproval of murder is also almost universal, as is
> > mine, and as is my disapproval of people undergoing suffering.
>
> Gosh. Sounds like it is an empirical fact that more than one human can
> cooperate on projects to do something about it.
Yeah ;-) I'm sure that it's a fact, and empirical to boot!
> > I guess that it only remains to determine what you mean by
> > "X is wrong", if it is possible, that is, to say the same
> > thing in different words. Can you do it?
>
> Let's see:
>
> 1) It's an empirical fact that I [Eliezer Y.] will make
> decisions that attempt to eliminate or minimize X.
Yes. An objective analysis of the solar system at this
time would confirm that.
> 2) It's an empirical fact that my present-day attempt to eliminate or
> minimize X will not automatically respect variance of my future self's...
Yes.
> 3) Empirical fact (2) is a consequence of the way that X's negative
> "desirability" - desirability here being the computed quantity that
> controls the decision between alternatives - is derived by reasoning
> from cognitive representations that contain no mention of "subjective
> desirability" as a hypothetical future reflective perception.
Yes, that's the way systems making judgments work, so far as I can tell.
> 4) Insofar as the undesirability of murder is a consequence of reasoning
> from premises that are shared between humans, the conclusion may also be
> shared between humans.
How is the undesirability a consequence of reasoning
from premises shared between humans? Moreover, what
if it turns out that among "most humans" (say, all
Homo sapiens from 200,000 B.C. to the present) only
a minority would hold that murder is wrong? As for
me, God Himself along with all the philosophers could
announce that torturing kittens is fine, but it would
not be fine with me. Yet I cannot for the life of me
imagine how They could prove their case, or I mine.
> This expectation plays a significant role in the choice
> to describe X as "morally wrong", i.e., knowably morally
> wrong given an expectably shared set of moral premises.
Yeah, but the "expectably shared set of premises" is where
it all becomes kind of arbitrary (in a sense), isn't it?
That is, certain bad guys just don't share the premises
that we do.
> This doesn't really deal with the question of whether "right" and "wrong"
> can really be said to "refer" to anything. "Truth", for example, is not a
> physical property of a belief, yet turns out to have a ready
> interpretation as the correspondence between a belief and reality.
Yes.
> Numbers are not physical substances, but I would not regard 2 + 2 = 4
> as subjective.
No, that's completely objective, in the sense of not depending
on how anyone is thinking of it.
> Morality might turn out to have the same kind of interpretation.
> For example, if we accept that, for whatever reason, murder is
> "undesirable" - maybe undesirability is somehow a physical
> substance inherent in the murder event...
I have a friend who maintains that pain and pleasure are
extensional physical variables computed by our universe.
But I regard all this as rather speculative, to say the
least.
> Sorry for the misphrasing, but my compliment stands: you
> are unexpectedly self-consistent.
Why, thank you. Consistency is my most prized objective
(since I worry that there is little else that can be truly
accomplished).
> I'd still say force was justified to defend a simulation
> of almost anyone but you - or possibly even you, under
> your definition...
;-) One also waxes skeptical of "justificationism". Alas,
I'm getting to be of the Newspeak school, dismissing any
sentences or concepts that utilize the forbidden notions.
I will just say LOUDLY and OFTEN to anyone who is within
earshot, I will support the use of force to rescue a
simulation, unless that would violate some other even
more powerful tradition (e.g., private property). I
will in any case LOUDLY and OFTEN severely scold those
who torture sentients, and probably won't do business
with them.
> ooh, now *there's* an interesting moral question. If you found you were
> a simulation, would you refrain from doing anything that might damage the
> computer on which you run, since it is someone else's private property?
If the owner knew all the facts of the situation, then I
would consider myself a guest. But just as if I were a
guest at your castle and found myself mistreated, I
would wish to escape. Moreover, if I found that you were
treating me unkindly, then I would try to retaliate.
(Someone might ask, "but Lee, what would give you a right
to retaliate?", and I would just go ballistic again,
denouncing their use of the word "right".)
> Or would you want to be free? Or is the social compact
> of the simulators not binding on the simulation?
Interesting: "social compact". I wonder if that's what I've
been defending when I defend private property. Private property
is a *social compact*, I guess, that has shown itself to be
utterly necessary for the evolution and progress of civilization.
Its lynchpin, if you will.
But to ask just who is bound by it is to ask a very good
question. The usual situation with regard to private property
seems to apply easily to the case where you are running your
own simulations inside your own castle, and we are on the
outside wondering what you are doing. But from the inside, well,
this is something quite new evolutionarily. Trying to draw on
tradition, we note that God gave dominion of the Earth to Man,
and by analogy, I get dominion of as much of the hardware inside
the simulation as I can get hold of. Hmm. Quite interesting.
I'm not at all sure that I have achieved consistency here.
> If you were the simulation, would you want me to break you
> out? Or would you ask that I not deface the private
> property of the person holding you prisoner?
Well, that's certainly putting the screws to my position!
I think that I would probably want you to rise above
principle in this case. But I long ago conceded that
I have *two* value systems that cannot be made completely
consistent: what's good for the universe, and what's
good for me. I used to spend a fair amount of time
torturing myself with strange scenarios that put the two
in conflict.
Lee