Re: Why would AI want to be friendly?

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sun Sep 24 2000 - 23:01:14 MDT


Samantha Atkins wrote:
>
> "Eliezer S. Yudkowsky" wrote:
> >
>
> > Okay, so why would I necessarily need more than a finite and limited amount of
> > charisma to handle that? If I needed incredibly skilled and talented hackers
> > to act as janitors, then yes, I (or someone) would need a lot of charisma.
> > But attracting people to the most important job in the entire world? How much
> > leadership talent do you need for *that*?
>
> Little would be needed if it were as obvious to others as it appears to be
> to you. A great deal is needed to make it that obvious in actuality. There
> is no reason to believe that all persons bright enough and motivated enough
> to be useful to the work will "get it" the first time around.
>
> >
> > Actually herding the cats once you've got them, now, that's another issue.
> > So's PR.
> >
>
> OK.
>
> >
> > > > I wish I knew more about dealing with people, but I no longer give it as high
> > > > a priority as I once did.
> > >
> > > How can that be anything but a mistake when you require people (they
> > > are the only intelligences available for getting this thing off the
> > > ground) and their resources in order to produce the Seed?
> >
> > My purpose, above all else, is to design the Seed. Other people can
> > persuade. I have to complete the design. If being charismatic requires
> > patterns of thought that interfere with my ability to complete the design, or
> > even if it starts taking up too much time, then forget charismatic. I'll stay
> > in the basement and someone else will be charismatic instead.
> >
>
> That's fair enough. As long as you have some people to do the necessary
> job of explanation and persuasion, all should go reasonably well on that
> score.
>
> > > Do you really
> > > believe that all of those you need will just automatically think enough
> > > like you or agree enough with your conclusions that little/no effort is
> > > necessary on your part to understand and deal with them further? What
> > > kind of model leads you to this conclusion?
> >
> > Past experience, actually. The people I need seem to Get It on the first try,
> > generally speaking. I'm not saying that they don't argue with me, or that
> > they don't ask questions. Mitchell Porter has been right where I have been
> > wrong, on a major issue, on at least two separate occasions.
> >
>
> Or at least all the people you know you can work with, and/or who are on
> board today, got it quickly (or at least it seems so in hindsight).
>
> > The difference is pretty hard to put into words. I am not the judge of who is
> > or isn't asking "intelligent questions", and that's not what I'm trying to
> > say. What I'm trying to say rather is that there is a pattern. Mitchell
> > Porter groks the pattern; if he says, "Eliezer, you're flat wrong about X",
> > then at least we're both arguing within the same pattern. People who Get It
> > may agree or disagree with me, but they understand the pattern. Rarely, if
> > ever, do I see someone who didn't get the pattern suddenly get it after long
> > and acrimonious argument; the only person I can ever recall seeing do that is
> > Eric Watt Forste, which still impresses me.
>
> But it is certainly within the possible that the basic pattern itself
> has some flaws or at least reasonably questionable spots.

It's certainly within the possible that *I* have some flaws or at least
reasonably questionable spots, and indeed people like Mitchell Porter have
spotted flaws and questioned questionable spots.

That the pattern is wrong - that Mitchell Porter and I are both totally
off-base and J.R. Molloy is right - is rather less likely. Imagine you toss
three physicists back into the fourteenth century. Some alchemist spots them
arguing and decides to propound his own theory about a world built on five
mystical elements. The physicists may argue with each other, but they share a
single pattern, and the alchemist doesn't have it - and either you see it or
you don't.

> For myself, I think one reason I question this so much is that it is both
> something that strongly appeals to me for several reasons and something
> that simultaneously repels me.

I'm not sure I can help you with that one, but it sounds to me like you're
making the decision a lot harder than it is.

-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence


