Re: Yudkowsky's AI (again)

From: den Otter (neosapient@geocities.com)
Date: Thu Mar 25 1999 - 15:17:39 MST


----------
> From: Eliezer S. Yudkowsky <sentience@pobox.com>

> Bryan Moss wrote:
> > Yudkowsky wrote:
> > > If your self is preserved, you wouldn't kill off your
> > > fellow humans, would you?
> >
> > Yes.
> >
> > BM
>
> Okay, we now have:
>
> The Official List of People Not To Let Anywhere Near An Uploading Device:
> 1. den Otter.
> 2. Bryan Moss.
>
> Any other volunteers?

How about yourself? Never *ever* trust someone who jumps at
every opportunity to ride the moral high horse. Those are
the worst witch-burners. Besides, didn't you write repeatedly
that you value the Singularity above everything else? Aren't
you the one who wants to place humanity at the mercy of
your pet AI?

Let's cut the sanctimonious crap and debate this
as reasonable people, ok?

Here's a link that may help to clear things up; do click it:

http://pierce.ee.washington.edu/~davisd/egoist/articles/Egoism.Robinson.html


