Singularity Fun Theory (was: Ethical basics)

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Fri Jan 25 2002 - 18:33:10 MST


    How much fun is there in the universe?
    What is the relation of available fun to intelligence?
    What kind of emotional architecture is necessary to have fun?
    Will eternal life be boring?
    Will we ever run out of fun?

To answer questions like these... requires Singularity Fun Theory.

    Does it require an exponentially greater amount of intelligence
(computation) to create a linear increase in fun?
    Is self-awareness or self-modification incompatible with fun?
    Is (ahem) "the uncontrollability of emotions part of their essential
charm"?
    Is "blissing out" your pleasure center the highest form of existence?
    Is artificial danger (risk) necessary for a transhuman to have fun?
    Do you have to yank out your own antisphexishness routines in order
not to be bored by eternal life? (I.e., modify yourself so that you have
"fun" in spending a thousand years carving table legs, a la "Permutation
City".)

To put these anxieties to rest... requires Singularity Fun Theory.

Behold! Singularity Fun Theory!

http://sysopmind.com/essays/funtheory.html

-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence


