From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sat Feb 27 1999 - 20:17:54 MST
Billy Brown wrote:
>
> Now, I wouldn't abandon the whole foundation of my moral system overnight,
> and I don't expect the AIs to do it either.
I do. I can recall two occasions (separated by several years) when my
entire moral system crashed and was rebuilt along totally different
lines, over the course of five seconds. That is the actual time scale,
not an exaggeration. It was obvious both that the old philosophy was
wrong and how the new philosophy ought to work. From my perspective
the shock, the fall, and the rebuild occurred without conscious effort
or iterative thought, in a single step.
These changes occurred when I read the following two sentences:
"You are not the person who speaks your thoughts, you are the person who
hears your thoughts." (-- ?)
"It's a problem we face every time we consider the creation of
intelligences greater than our own." (-- True Names, p. 47)
--
sentience@pobox.com          Eliezer S. Yudkowsky
http://pobox.com/~sentience/AI_design.temp.html
http://pobox.com/~sentience/sing_analysis.html
Disclaimer: Unless otherwise specified, I'm not telling you everything I think I know.