From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Thu Mar 10 2005 - 18:53:38 MST
Hal Finney wrote:
>
>>The modesty argument is important in one respect. I agree that when two
>>humans disagree and have common knowledge of each other's opinion (or a
>>human approximation of common knowledge which does not require logical
>>omniscience), *at least one* human must be doing something wrong.
>
> I'd put it a little differently. There's nothing necessarily wrong
> when two humans disagree and have common knowledge. You have to add one
> more ingredient. The people have to both be rational and honest, and,
> most importantly, they each have to believe that the other is rational
> and honest (and, I think, this has to be common knowledge).
Suppose that one party is not rational. I would call this, "doing
something wrong".
We presume that both parties are honest because otherwise they are not
"disagreeing" in the sense that I mean it, i.e., assigning different
truth values.
Suppose that one party is rational and the other party fails to realize
this. Then the second party has failed to arrive at the correct answer
on a question of fact. Again, "something wrong".
If they just haven't figured it out yet, then they aren't necessarily
doing something wrong. They may be doing something right that takes
time to accumulate evidence and computationally process it. If they
take too long or demand too much evidence, the beisutsukai sensei shouts
"Too slow!" and whacks them on the head with a stick. Speed matters in
any martial art, including rationality, the martial art of thinking.
> I would imagine that many cases of disagreement can be explained by
> each party privately concluding that the other is being irrational.
> They're just too polite to say so. When they say, I guess we'll have
> to agree to disagree, they mean, you're being unreasonable and I don't
> want to argue with you any more because there's no point.
>
> But actually, we can sharpen Aumann's result. It doesn't require
> assumptions about two people. It is enough for one person to satisfy
> the conditions.
>
> Aumann basically says (neglecting the part about priors) that it is
> impossible for a rational person to believe that he has a persistent
> disagreement with another person whom he believes to be rational, where
> the other person also believes the first person is rational.
I am not sure this is correct. Maybe there is an extension of Aumann
that says this, but it's not in Aumann's original result, which presumes
rationality (i.e., irrationality is not considered as an option).
Aumann-ish results, as far as I can see, tend to be about Bayesians
treating other Bayesians' opinions as bearing a specific evidential
relationship to the question at hand - the signals wouldn't have to be
beliefs; they could as easily be flags that waved with a certain
likelihood ratio. In fact, what else is a Bayesian's belief, but a kind
of cognitive flag that waves at only the right time?
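To make that concrete, here is a minimal sketch in Python of the "flag
with a likelihood ratio" picture (the numbers are made up purely for
illustration; this is my gloss, not anything in Aumann's paper):

    def update_odds(prior_odds, likelihood_ratio):
        # Posterior odds = prior odds * likelihood ratio (Bayes, odds form).
        return prior_odds * likelihood_ratio

    # Made-up numbers: suppose the other Bayesian raises his "I believe H"
    # flag with probability 0.8 when H is true and 0.2 when H is false.
    # His flag then carries a likelihood ratio of 0.8 / 0.2 = 4, just like
    # any other signal.
    prior_odds = 1.0                    # I start at 1:1 odds on H
    flag_likelihood_ratio = 0.8 / 0.2
    posterior_odds = update_odds(prior_odds, flag_likelihood_ratio)
    posterior_prob = posterior_odds / (1.0 + posterior_odds)
    print(posterior_prob)               # prints 0.8: given those numbers,
                                        # I end up as confident as he is

The whole content of my "respect" for him is carried by that likelihood
ratio; his belief enters my update like any other piece of evidence.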
> Aumann is giving us a non-obvious piece of logic which we can follow in
> our own thought processes, independent of what anyone else does. I can't
> fool myself into believing that I can agree to disagree with another
> person, while respecting him as a rational and honest person who offers
> the same respect towards me. For me to hold this set of beliefs is a
> logical contradiction. That's the lesson I draw from this set of results.
I agree. But I regard rationality as quantitative, not qualitative. I
can respect an above-average rationalist while still occasionally
wanting to shout "Too slow!" and whack him with a stick.
> In a way, then, Aumann can be read as giving you license to feel
> contempt for others. He's saying that it is mental hypocrisy (if that
> means anything!) to try to adopt that generous and polite stance I just
> described. When we try to convince ourselves that we really believe
> this noble fiction (that the other person is rational and honest), we are
> lying to ourselves. It's another case of self-deception. The truth is,
> we don't respect the other person as rational and honest. If we did,
> we wouldn't be ignoring his beliefs! We think he's a fool or a knave.
> Probably both. We're not so damn nice as we try to pretend to be,
> as we try to convince ourselves we are.
Or you think that he's good and you're better. That is also a
self-consistent position to hold. And if that is your position, you'd
best not hide it from yourself - though based on my experience so far, I
can't claim there will be any benefits forthcoming from public honesty
about it.
--
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence