From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Mon Nov 05 2001 - 17:01:56 MST
Spudboy100@aol.com wrote:
>
> Borrowing from the Yudkowsky mailing list, "Friendly SI-AI SYSOP, bless America, our home sweet home.." It kind of crunches down to the same thang. Sans, the supernatural (defined by me as permanently inexplicable to technical analysis) stuff.
No. For me the defining quality of rational action - as opposed to
mysticism - is realizing that our actions, but not our wishes, affect
reality. Or maybe I should say that only actions affect external reality,
since the inside of the mind is a more plastic place.
Even under a post-Singularity Sysop Scenario a billion years in the
future, it's not your wishes alone that alter reality... it's the actions
taken by the six billion pre-Singularity humans ever so long ago - the
actions they took back then so that your wishes might be granted now.
There's no such thing as a free lunch. There are some lunches that are
very, very cheap when amortized over a few billion years and a few
septillion sentient beings, but it still costs in the here-and-now.
And there's no point at all in praying to a Friendly mind of any kind,
Sysop-level or otherwise; no Friendly mind would demand reverence or
obedience or faith or anything of the sort.
The division between rationality and thinking any damn thought that pops
into your head is valuable and dearly bought. So is the distinction
between technology and magic, or between a world of Similarity and
Contagion and a world of cause and effect. We shouldn't throw these
distinctions away so cheaply by casually conflating a Friendly
superintelligence with a psychotic superpowered three-year-old in the sky.
-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence