From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Wed Nov 24 1999 - 12:36:23 MST
I don't understand why you think AIs would have "interests", or develop
instincts for self-preservation. In humans, those drives are the
result of millions of years of evolution. Selfishness is natural to
humans, not natural to minds in general. Complex functional adaptations
do not simply materialize in source code, any more than three-course
meals grow on trees.
--
sentience@pobox.com    Eliezer S. Yudkowsky
http://pobox.com/~sentience/tmol-faq/meaningoflife.html
Running on BeOS           Typing in Dvorak           Programming with Patterns
Voting for Libertarians   Heading for Singularity    There Is A Better Way