Re: Any strong belief is a chain.

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Tue Dec 22 1998 - 16:38:36 MST


Robin Hanson wrote:
>
> Eliezer S. Yudkowsky wrote:
> >Of late, the opinion has been expressed that it's okay to believe
> >strongly in something as long as you have evidence for it - in other
> >words, "dogma" describes a belief that is held strongly in the absence
> >of evidence. I disagree. Any strong belief is a chain. ...
> >Any strong belief causes the opinion to persist in and of itself and
> >slows reaction to new information. Translation: It makes you stupid
> >and slow.
>
> Bayesian decision theory gives a nice account of how one's degree of
> belief in some claim should vary as evidence accumulates. It allows
> for assigning probabilities close to one when the evidence is strong.
> I presume you mean something else by "strong belief."
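The Bayesian updating Hanson alludes to can be sketched in a few lines. This coin example and its numbers are my own illustration, not from the thread: we update the probability that a coin is biased toward heads as evidence (flips) accumulates, and the posterior does indeed approach one when the evidence is strong.

```python
def update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """One application of Bayes' rule: return P(H|E) from the prior P(H)
    and the likelihoods P(E|H) and P(E|not-H)."""
    numer = prior * p_e_given_h
    denom = numer + (1 - prior) * p_e_given_not_h
    return numer / denom

# Start agnostic (prior 0.5) about whether a coin is biased toward heads
# (P(heads) = 0.8 if biased, 0.5 if fair), then observe ten heads in a row.
belief = 0.5
for _ in range(10):
    belief = update(belief, 0.8, 0.5)
print(round(belief, 3))  # -> 0.991
```

Each heads multiplies the odds by the likelihood ratio 0.8/0.5 = 1.6, so ten of them take even odds to about 110:1, a probability close to one, exactly the situation Hanson describes.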

Not really. I have no problem with assigning probabilities close to
one. When I refer to a "strong belief", I am referring to a cognitive
event over and above that involved in making a statement about
probabilities; I think that the best phrasing I can apply now would be
that a "strong belief" means that you care. I assign a pretty high
probability to "the sky is blue", but if it were orange, so what? By
contrast, many people would be deeply disturbed to learn that there
is/is not a God. They probably assign roughly the same probability to
the two statements "sky is blue" and "my religion is correct", but the
behaviors of the beliefs differ. Strong beliefs are "sticky"; it's hard
to change them.

Let's take my own belief that "strong beliefs are bad" as a case in
point. If I had a strong belief that "strong beliefs are bad", I
wouldn't be able to consider circumstances under which "strong beliefs
are good". It may well be that having a strong belief activates various
cognitive abilities, particularly social abilities, but possibly even a
different type of search for evidence, or an expanded search for effects
caused by the believed-in object.

So I think people should train themselves out of strong belief, but I
would anathematize any attempt to neurosurgically disable the "strong
belief" capability... without a much better idea of what's going on,
anyway. This is not something I would say if I had a "strong belief",
even though I still think the probability that a strong belief will
screw you up in any given situation is close to 1 (except in
deliberately constructed special cases).

-- 
        sentience@pobox.com         Eliezer S. Yudkowsky
         http://pobox.com/~sentience/AI_design.temp.html
          http://pobox.com/~sentience/sing_analysis.html
Disclaimer:  Unless otherwise specified, I'm not telling you
everything I think I know.


This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 14:50:05 MST