Re: Why would AI want to be friendly?

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sun Sep 24 2000 - 22:50:21 MDT


Samantha Atkins wrote:
>
> "J. R. Molloy" wrote:
> >
> > You'd need the most talented leadership in the entire world, because the entire
> > world is more interested in Olympic games and making money than it is interested
> > in the most important job. The most important job appeals only to the most
> > intelligent and conscientious brains.
>
> Amen. You first need to convince a sufficient number of people that
> your diagnosis of the most important thing really is the most important
> thing and the only hope, and that your design is fundamentally sound. This is
> not a trivial task, and no, not all people of sufficient caliber to be
> useful to the work will get it from the start.

Yes, I used to think that way. I can remember when I used to think that way.
I can remember when the possibility of losing even one person was so horrible
that it overrode everything else in the mental landscape. But we don't need
every single person of sufficient caliber. Let's say that the raw info will
get 80% of the PoSCs (people of sufficient caliber), and nicey-nice phrasing
will get 85%. Is the difference
really all that significant? Is it worth a 300% increase in authorial time
expenditure?

-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence


