From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Fri Jan 12 2001 - 10:31:25 MST
John Marlow wrote:
>
> I suggest to you that the entire effort to create and
> empower a nanny AI can end ONLY in disaster. The thing
> will have no allegiance to us, no dependence on
> us
If a human had no dependence on us, s/he would have no allegiance to us.
This, again, is a non sequitur for AIs. An AI has allegiance to whatever
it has allegiance to.
> --will no more "relate" to us than we do to insects
Again, you must realize that the leap from material ascendancy to social
ascendancy is one that only makes sense if you evolved in a
hunter-gatherer tribe.
> or, perhaps more appropriately, to the descendants of
> those more primitive life-forms from which many
> believe we have evolved.
Do you question this theory?
> Tell me--when you turn on the hot-water faucet, do you
> think about the bacteria in the drain being scalded to
> death?
The bacteria didn't write my source code!
-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Sat Nov 02 2002 - 08:04:48 MST