Re: Why would AI want to be friendly?

From: Wilson (wilson@supremetyrant.com)
Date: Wed Sep 06 2000 - 07:45:57 MDT


----- Original Message -----
From: "Jason Joel Thompson" <jasonjthompson@home.com>
To: <extropians@extropy.org>
Sent: Tuesday, September 05, 2000 4:41 PM
Subject: Re: Why would AI want to be friendly?

<snip>

>
> ----- Original Message -----
> From: "Eliezer S. Yudkowsky" <sentience@pobox.com>
>
> It would be nice, for instance, to have SIs that are made happy by making
> humans happy.
>

I remember reading a sci-fi book about this in the '80s. Robots were given
the command "Make humans happy." After many years of observing human behavior,
they concluded that life itself was what irritated humans, given how much they
complained about it. The obvious solution was to kill every human in the
universe.

--Wilson.


