From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Wed Sep 06 2000 - 09:22:44 MDT
Wilson wrote:
>
> > From: "Eliezer S. Yudkowsky" <sentience@pobox.com>
> >
> > It would be nice, for instance, to have SIs that are made happy by making
> > humans happy.
This is not a quote from me!
> I remember reading a sci-fi book about this in the '80s. Robots were given
> the command "Make humans happy." After many years, they decided, based on
> the actions of humans, that life itself irritated them, given how much they
> complained about it. The obvious solution was to kill every human in the
> universe.
Yes, this comes under the "Why the SI needs a grain of common sense"
department.
-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence