Re: Yudkowsky's AI (again)

From: Lee Daniel Crocker (lcrocker@mercury.colossus.net)
Date: Mon Mar 29 1999 - 16:23:43 MST


> This kind of thinking weakens you. This is not the way to see reality
> clearly. On a battlefield, in business, or anywhere, the one who sees
> clearly wins. Our way of thinking (“calibration”) is exemplified by the
> geniebusters site. It strengthens us. It does lead to clear perceptions.

I can think of a much better measure of clear-headed thinking: poker.
In war, technology and physical skills have a big impact, and the
game is very negative-sum. Business is so positive-sum that even those
with fuzzy minds can make money. Poker, on the other hand, is a pure
zero-sum contest of minds rationally evaluating the exact odds of
possible outcomes, the gain and loss of each, investing ("raise")
and liquidating ("fold") as appropriate, winner take all. In many
years of experience--including 6 months as a pro--I can confidently
state that all misconceptions, superstitions, and emotional attachments
are quickly punished by the clear-headed.
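To make the "exact odds" point concrete, here is a minimal sketch of the
pot-odds arithmetic a caller runs through before putting chips in. The pot
size, call cost, and win probabilities are made-up illustrative numbers,
and the function name is just one I've chosen for the example:

    # Expected value of calling a bet: win the whole pot with
    # probability p_win, lose the cost of the call otherwise.
    def call_ev(pot_total, call_cost, p_win):
        return p_win * pot_total - (1 - p_win) * call_cost

    if __name__ == "__main__":
        pot_total, call_cost = 100.0, 20.0   # chips in the pot, cost to call
        # Breakeven point: call_cost / (pot_total + call_cost) = 20/120 ~ 16.7%
        for p in (0.10, 0.167, 0.30):
            ev = call_ev(pot_total, call_cost, p)
            action = "call" if ev > 0 else "fold"
            print(f"p(win)={p:.1%}  EV of calling={ev:+.1f} chips  -> {action}")

Anything above the breakeven probability justifies the call; anything
below it is a fold, regardless of how attached you are to the hand.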

--
Lee Daniel Crocker <lee@piclab.com> <http://www.piclab.com/lcrocker.html>
"All inventions or works of authorship original to me, herein and past,
are placed irrevocably in the public domain, and may be used or modified
for any purpose, without permission, attribution, or notification."--LDC

