hal@finney.org wrote:
>
> Eliezer S. Yudkowsky, <sentience@pobox.com>, writes:
> >
> > If a 100,000:1 genius is interacting with a 10,000,000:1 genius, but
> > neither of them knows the other's percentile, both will rationally assume
> > that they are more likely to be rational than the other person. However,
> > Robin's paper does prove that in *most* cases, rather than in the rarer
> > instances where two geniuses unknowingly interact, people must be
> > overestimating their own rationality relative to others, or else must not
> > be using rigorous Bayesian reasoning with respect to what they are
> > licensed to conclude from their own thoughts.
>
> It seems that if both parties clearly explain their understanding about
> this paradoxical result, each would be forced to accept that the other
> had made at least a prima facie case for being rational enough for the
> result to apply. They should then feel mutually bound to reach agreement.
Actually, I have this wacky idea that says it's ethically prohibited to
apply Bayesian priors to people. In other words, you should judge a
person's properties - intelligence levels, for example - only by
information that is directly revealing of those properties, and not
through other properties that happen to be weakly or strongly associated,
especially if those properties happen to be outside the person's direct
manipulative control. For example, if a nine-year-old were to walk up to
me and ask a question that I would take seriously if an adult had asked
it, I am obliged to answer as I would an adult, and am not permitted to
take any action predicated on my knowledge of chronological age until the
person in question makes some statement that is actually revealing of
nine-year-old characteristics. (Not merely a statement that could be
interpreted as revealing youth, but one that would *only* have been
generated by a youth and not by an adult, or a very high-probability
equivalent of same.)
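
The contrast between the two policies can be put in plain Bayesian terms: condition on the base rate associated with age, or start from a symmetric prior and update only on the directly revealing evidence (the question itself). A minimal sketch, with all numbers hypothetical and chosen purely to illustrate the mechanics (the post gives none):

```python
# Two updating policies for the nine-year-old example.
# H = "the questioner can engage with the question at an adult level".
# All probabilities below are made up for illustration.

def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Bayes' rule: P(H|E) = P(E|H)P(H) / P(E)."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Evidence E: the question asked, one an adult might ask seriously.
# Suppose it is ten times likelier from someone who can engage at
# that level than from someone who cannot.
p_e_h, p_e_not_h = 0.8, 0.08

# Policy 1: prior taken from a (hypothetical) base rate for nine-year-olds.
prior_from_age = 0.05

# Policy 2, the rule proposed above: symmetric prior, age ignored.
prior_symmetric = 0.5

print(posterior(prior_from_age, p_e_h, p_e_not_h))   # prior drags posterior down
print(posterior(prior_symmetric, p_e_h, p_e_not_h))  # evidence dominates
```

Same evidence, very different conclusions: under the age-conditioned prior the question is largely discounted, while under the symmetric prior the question itself settles the matter, which is the behavior the rule above mandates.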
So if someone says "I'm smarter than you, Eliezer!" I would certainly feel
obliged to assume (or rather, act under the assumption) that the person in
question is as likely to be smarter than me as I am to be smarter than
them, at least until the person makes some other statement revealing of
actual intelligence levels, which generally doesn't take too long.
-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2b30 : Mon May 28 2001 - 10:00:02 MDT