From: J. R. Molloy (jr@shasta.com)
Date: Sun Sep 24 2000 - 15:00:09 MDT
Eliezer S. Yudkowsky writes,
> If, as seems to be the default scenario, all supergoals are ultimately
> arbitrary, then the superintelligence should do what we ask it to, for
> lack of anything better to do.
That sounds like you're putting yourself in the AI's shoes.
--J. R.
"You can't put yourself in the AI's shoes!"
--Eliezer S. Yudkowsky <sentience@pobox.com>
Sunday, September 24, 2000 12:29 AM
This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 15:31:09 MST