From: Robin Hanson (rhanson@gmu.edu)
Date: Wed May 02 2001 - 08:25:14 MDT
At 03:17 AM 5/2/2001 -0400, Curt Adams wrote:
>I mean it. Read the damn paper. http://hanson.gmu.edu/deceive.pdf or .doc
>(Robin, could you make a Web version?)
Thanks Curt! Maybe we will make a web version after the next revision,
coming soon, once we integrate the current round of comments.
>I agree with the conclusion of the paper - that people form ideas for their
>personal benefit, not in search of truth. ...
This is of course easier to conclude about others than about yourself.
>I think though, that the paper overlooks other limitations on human Bayesian
>behavior, primarily limited computational ability. I would say most of the
>non-Bayesian behavior you outline can be explained by human inability to be a
>good Bayesian, as well as by the fact that people aren't trying to be good
>Bayesians in the first place. ...
The section on Bayesian wannabes, and the longer paper that section summarizes,
are intended exactly to address this. They consider agents who would be
Bayesian if they could do the computations, but cannot. Even for such
wannabes, it seems irrational to disagree.
>First, as I've said before, people aren't good Bayesians. Being a good
>Bayesian requires logical omniscience. ...
Btw, that is actually not quite true. A famous paper by Garber long ago
showed how to be a Bayesian who is not logically omniscient. But admittedly
even this sort of Bayesian requires heroic computational abilities.
>Resolving disputes by commonizing prior is a cost-benefit negative. Changing
>priors is horrendously expensive, in the sense that all those mind-expensive
>probability calculations must be redone. At the same time there's no benefit
>- one prior is as good as another. ...
The whole idea of being a truth-seeker is that you should be willing to pay
some cost for truth. You must be willing to spend some effort trying to
overcome inherited biases. It is not clear to me that such efforts must be
horrendously expensive, though I agree that they have some expense.
A simple, cheap strategy is to say "Oh, I'm disagreeing with someone. That
wouldn't happen if we were both truth-seekers, so unless I have reason to
think they are more likely to be irrational than me, I would move my opinion
in their direction."
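To make that strategy concrete, here is a toy sketch (my own illustration, not anything from the paper): each side shifts its probability estimate toward the other's, weighted by how rational it judges the other to be relative to itself. With equal weights, repeated exchange pulls both opinions together.

```python
def move_toward(mine: float, theirs: float, their_weight: float = 0.5) -> float:
    """Shift my probability estimate toward the other person's opinion.

    their_weight = 0.5 means I have no reason to think either of us is
    more likely to be irrational; 0.0 fully discounts them, 1.0 defers
    entirely.  (The parameter name is mine, chosen for illustration.)
    """
    return mine + their_weight * (theirs - mine)

# Two people start far apart and repeatedly apply the rule to each other.
a, b = 0.9, 0.3
for _ in range(10):
    a, b = move_toward(a, b), move_toward(b, a)
# With equal weights the opinions converge (here to the midpoint),
# without either side ever transmitting its underlying evidence.
```

The point is not that this particular rule is optimal, only that an opinion-adjustment step like this is computationally trivial, so the cost of being a truth-seeker here need not be "horrendous."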
>The John and Mary example also assumes John and Mary can efficiently exchange
>information. This doesn't match human experience. It's very hard to
>communicate experience ... John hasn't time to get all
>Mary's data, or vice versa.
The whole point of the analysis is that all they need to exchange is their
summary opinions. They do *not* need to exchange their detailed data.
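Here is a toy illustration of why swapping raw data is unnecessary (a hypothetical coin-flipping setup of my own; note the paper's actual result is stronger still, since there mere opinions suffice): with a shared Beta(1,1) prior on a coin's bias, an agent's entire evidence is captured by two numbers, the head and tail counts, so exchanging those short summaries yields exactly the posterior that pooling every individual flip would.

```python
def beta_update(alpha: int, beta: int, heads: int, tails: int):
    """Conjugate Beta-Binomial update: add observed counts to the prior."""
    return alpha + heads, beta + tails

# John and Mary privately observe independent flips of the same coin.
john_flips = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]   # 1 = heads
mary_flips = [0, 0, 1, 0, 1, 0, 0, 0]

def summarize(flips):
    return sum(flips), len(flips) - sum(flips)  # (heads, tails)

john_summary = summarize(john_flips)
mary_summary = summarize(mary_flips)

# Route 1: pool all the detailed data.
pooled_posterior = beta_update(1, 1, *summarize(john_flips + mary_flips))

# Route 2: exchange only the two-number summaries.
a, b = beta_update(1, 1, *john_summary)
summary_posterior = beta_update(a, b, *mary_summary)
# Both routes land on the identical posterior.
```

So the "John hasn't time to get all Mary's data" objection does not bite: what must cross the channel is a compact summary, not the experience itself.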
>Sorry my comments are so negative. I really liked the paper. But, you know,
>I gain little from accepting your conclusions and a lot if I get to put out
>something successful of my own ... :-)
I think of your comments as constructive - thanks!
Robin Hanson rhanson@gmu.edu http://hanson.gmu.edu
Asst. Prof. Economics, George Mason University
MSN 1D3, Carow Hall, Fairfax VA 22030-4444
703-993-2326 FAX: 703-993-2323
This archive was generated by hypermail 2.1.5 : Sat Nov 02 2002 - 08:07:26 MST