The problem is simple: Neurons can't multiply.
Logic says a millionth chance of losing a thousand lives works out to one
thousandth of a life lost.
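Spelled out in executable form, a quick Python sketch using the numbers
above:

    # Expected loss = probability of the disaster times its size.
    p = 1e-6          # a millionth chance
    lives = 1000      # of losing a thousand lives
    print(p * lives)  # 0.001 -- one thousandth of a life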
Our cognitive risk-analysis mechanisms aren't wired up that way. Neurons
can't multiply.
My guess is that there are seven states of probability: impossible,
improbable, unlikely, unknown, likely, probable, and certain. Likewise,
disasters come in seven grades: apocalypse, horror, death, torture, pain,
discomfort, and twinge. Make a "multiplication table" out of these and it
won't look anything like linear, or even like N equations in N unknowns.
Not only that, but when it comes time to "balance" risks, it will turn out
that neurons can't even add.
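To make the guess concrete, here's a toy sketch in Python. Every number,
label weight, and rule in it is my own illustrative invention, there only to
show the shape of a lookup table that never multiplies:

    # A toy contrast between the lookup-table guess above and real
    # expected-value arithmetic.  Every number and rule here is an
    # illustrative invention, not a measurement of anything.

    PROBABILITY = {"impossible": 0.0, "improbable": 1e-6, "unlikely": 0.01,
                   "unknown": 0.5, "likely": 0.9, "probable": 0.99,
                   "certain": 1.0}
    SEVERITY = {"twinge": 1, "discomfort": 10, "pain": 100, "torture": 1e4,
                "death": 1e6, "horror": 1e8, "apocalypse": 1e10}

    def rational_risk(prob, harm):
        # What logic says: multiply chance by size.
        return PROBABILITY[prob] * SEVERITY[harm]

    def perceived_risk(prob, harm):
        # What I'm guessing the wetware does: consult a fixed 7x7 table.
        # "Impossible" zeroes out anything, "apocalypse" saturates anything
        # that isn't impossible, and the rest collapses into a few buckets.
        # No multiplication anywhere.
        if prob == "impossible":
            return "ignore"
        if harm == "apocalypse":
            return "panic"
        return "worry"

    print(rational_risk("improbable", "death"))   # 1.0
    print(perceived_risk("improbable", "death"))  # worry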
> I really wonder what would happen if we could rewire our brains to
> become rational risk-analysts. I think society would turn *very*
> different...
I think that rational risk-analysis, despite the moniker, is an evolutionary
disadvantage. The way we work these models was optimized over a *very* long
time to account for rationalization, repression, phobia, greed, and all the
other forces that act to distort our "rational" analysis of the model. Make
us rational risk-analysts without making us rational, and society would indeed
turn very different - but perhaps not in the direction you wanted.
As a simple example: How can the chances of the probe malfunctioning be a
million to one? Do you really believe NASA on this one? Probably the
engineers are guessing something like three hundred to one, and management is
substituting its own elaborate calculation, built up from the assumed
reliability of each individual screw.
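If you want to see how a number like that gets manufactured, here's a sketch;
the per-screw figure and the part count are invented to make the answer come
out to a million to one, they aren't NASA's numbers:

    # Multiply enough assumed per-part reliabilities together and you can
    # "prove" whatever odds you like.  Both figures below are invented to
    # make the answer come out to a million to one.
    p_screw_fails = 1e-11   # invented per-screw failure odds
    n_parts = 100000        # invented part count
    # Pretend failures are independent and any one dooms the probe:
    p_mission_fails = 1 - (1 - p_screw_fails) ** n_parts
    print(p_mission_fails)  # ~1e-06: "a million to one", by construction
    print(1 / 300.0)        # ~0.0033: the engineers' gut estimate, ~3000x worse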
Read Feynman on the Challenger disaster. I'm practically quoting him.
--
sentience@pobox.com                          Eliezer S. Yudkowsky
http://tezcat.com/~eliezer/singularity.html
http://tezcat.com/~eliezer/algernon.html
Disclaimer: Unless otherwise specified, I'm not telling you everything I
think I know.