From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Tue Sep 07 1999 - 11:06:52 MDT
Any system that can't think its way out of a local optimum has
insufficient reflectivity or self-awareness to be successful as a seed AI.
If the system can't say, "Gee, I guess I'm at a local optimum, better
think my way out of it," then it probably lacks the basic cognitive
faculties to understand its own design at all, much less improve it.
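[Illustrative aside, not part of the original post: the distinction being drawn is roughly that between a plain hill climber, which quietly stalls at the first local peak it finds, and one with a trivial bit of self-monitoring that notices the stall and deliberately tries somewhere else. The toy objective function and random-restart policy below are my own assumptions, sketched in Python.]

    import random

    def objective(x):
        # Toy multimodal function with more than one peak; purely illustrative.
        return -((x - 2) ** 2) * ((x + 1) ** 2) + x

    def hill_climb(x, steps=1000, step_size=0.05):
        # Plain local search: keeps any improving step, has no notion of
        # being "stuck", and simply polishes whatever peak it lands on.
        for _ in range(steps):
            candidate = x + random.uniform(-step_size, step_size)
            if objective(candidate) > objective(x):
                x = candidate
        return x

    def restarting_climb(steps=1000, restarts=20):
        # Minimal "reflective" variant: rather than grinding on one peak,
        # it treats each stalled run as a signal to start over elsewhere
        # and keeps the best result seen across restarts.
        best = None
        for _ in range(restarts):
            x = hill_climb(random.uniform(-5.0, 5.0), steps)
            if best is None or objective(x) > objective(best):
                best = x
        return best

    print("plain climber from x=0:", hill_climb(0.0))
    print("climber that restarts when stuck:", restarting_climb())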
--
sentience@pobox.com          Eliezer S. Yudkowsky
http://pobox.com/~sentience/tmol-faq/meaningoflife.html
Running on BeOS            Typing in Dvorak           Programming with Patterns
Voting for Libertarians    Heading for Singularity    There Is A Better Way