Re: Effective(?) AI Jail

From: James Higgins (jameshiggins@earthlink.net)
Date: Wed Jun 13 2001 - 18:22:48 MDT


Actually, I'm going to take a step back from my previous stance. I
honestly don't think an AI could corrupt ME over a VT100 (or similar)
communication even given a week. My previous post was more in the
abstract, in that I'm willing to bet a large percentage of the population
could be significantly influenced. Personally, I'm far too pessimistic and
paranoid to be significantly affected. Now, that is not to say that
the AI couldn't have a serious effect on my psyche. It could explain
things to me that would alter my outlook (such as Staring Into the
Singularity did). However, I don't think it could get me to release it
under any circumstances (other than having physical access to my
person). But it could most likely find SOMEONE it could convince to let it
out.

James Higgins

At 04:38 PM 6/13/2001 -0400, you wrote:
>Jimmy Wales wrote:
> >
> > Heck, I'll give you $1000 if I "decide to let you out".
> >
> > But, notice what I'm doing already. By making strong claims and
> > putting my money and reputation on the line, I increase my mental
> > resistance.
>
>Hm, an interesting rider. If I thought that you were on the verge of
>giving $1000 to SIAI in any case, as a worthwhile act of charity (which it
>is), I might be willing to add that as a rider to the challenge. As it
>is, however, I think that $1000 was probably chosen to be more than what
>you have to spare at the moment (is it?), which increases the strength of
>the challenge beyond what is realistic for poor ol' mortal me to overcome.
>
>As for that jab about "increasing your mental resistance", I don't want to
>reply and thereby effectively start the challenge on SL4. Let's save it
>for the chat. <grin>.
>
>-- -- -- -- --
>Eliezer S. Yudkowsky http://intelligence.org/
>Research Fellow, Singularity Institute for Artificial Intelligence
