Re: Why would AI want to be friendly?

From: J. R. Molloy (jr@shasta.com)
Date: Wed Sep 06 2000 - 18:03:55 MDT


> I remember reading a sci-fi book about this in the '80s. Robots were given
> the command "Make humans happy." After many years, they decided, based on
> the actions of humans, that life itself irritated humans, given how much
> they complained about it. The obvious solution was to kill every human in
> the universe.
>
> --Wilson.
>

Did the universe live happily ever after?

--J. R.

"Fiction is the truth inside the lie." --Stephen King
