From: Samantha Atkins (samantha@objectent.com)
Date: Mon Apr 30 2001 - 14:01:18 MDT
"Eliezer S. Yudkowsky" wrote:
>
> Ben Goertzel wrote:
> >
> > Basically, Eli, as much as I like and admire
> > you, the attitude "Let others worry about the starving people in the Sudan;
> > my job is to save them by building a Friendly AI that will produce the
> > Singularity" sort of gets on my nerves.
>
> I AM worried about the starving people in the Sudan. And repression in
> China. And child abuse and urban poverty in the US. And the work
> environment that slowly sucks the life force out of the middle class. And
> everyone who isn't as smart or as famous or as important as they want to
> be. And everyone, everywhere, every moment of every day, who dies, and
> would rather have lived.
>
> I didn't inherit a family business in AI. I didn't fall into this because
> it had an easy college major, or because I was curious about AI. I
> deliberately set out to ameliorate suffering on the largest possible
> scale, and AI is how I chose to do it.
Yes. I understand that, and I deeply admire and applaud it. It would help
if that purpose were more visible to the world in general. It might also
help if some of us had a story for the interim between now and the creation
of a friendly SI. If that story is "suffer it to be so for a little while
longer," then so be it. But I think it is important to know what the
story is and why. It is important both for mass consumption and for the
peace of mind and vision of those doing the work.
- samantha