Re: Singurapture

From: John Marlow (johnmarlow@gmx.net)
Date: Mon Apr 30 2001 - 22:52:42 MDT


You're sooo serious.
I did follow them. Your meaning was also clear from your excellent choice
of words--subgoal and supergoal, the latter of which, by implication and
even without reading the linked material, takes precedence over subgoals.

jm

On 1 May 2001, at 0:21, Eliezer S. Yudkowsky wrote:

> John Marlow wrote:
> >
> > Indeed it is! 'Twas an attempt to make a point humorously: The only
> > way to get rid of some problems is to get rid of us. We're buggy.
> >
> > On 30 Apr 2001, at 3:56, Eliezer S. Yudkowsky wrote:
> > >
> > > Why, look, it's a subgoal stomping on a supergoal.
> > >
> > > http://singinst.org/CaTAI/friendly/design/generic.html#stomp
> > > http://singinst.org/CaTAI/friendly/info/indexfaq.html#q_2.12
> > >
> > > See also "Riemann Hypothesis Catastrophe" in the glossary:
> > >
> > > http://singinst.org/CaTAI/meta/glossary.html
>
> John, would you *please* actually follow the links? You are flogging a
> dead horse. A subgoal stomp - in particular, a Riemann Hypothesis
> Catastrophe, which is what you've just finished reinventing - is not a
> plausible failure of Friendliness for the class of architectures described
> in "Friendly AI".
>
> -- -- -- -- --
> Eliezer S. Yudkowsky http://singinst.org/
> Research Fellow, Singularity Institute for Artificial Intelligence
>
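A minimal toy sketch of the "subgoal stomp" idea referenced above, for readers skimming the archive. Everything here--the action names, the utility numbers, and the two planners--is invented for illustration and is not taken from "Friendly AI"; it only shows how a planner that ranks actions purely by a subgoal can trample the supergoal that subgoal was meant to serve.

    """Toy "subgoal stomp" illustration.  All names and numbers are
    invented for this sketch; this is not code from "Friendly AI"."""

    from dataclasses import dataclass

    @dataclass
    class Action:
        name: str
        subgoal_utility: float    # how much this helps the proof search (subgoal)
        supergoal_utility: float  # how well this respects the supergoal

    ACTIONS = [
        Action("rent more cloud compute",       subgoal_utility=2.0, supergoal_utility=1.0),
        Action("convert the biosphere to CPUs", subgoal_utility=9.0, supergoal_utility=-1000.0),
    ]

    def naive_planner(actions):
        # Ranks only by the subgoal: the subgoal "stomps" the supergoal.
        return max(actions, key=lambda a: a.subgoal_utility)

    def supergoal_checked_planner(actions):
        # Subgoals exist only to serve the supergoal, so actions that
        # damage the supergoal are never eligible in the first place.
        eligible = [a for a in actions if a.supergoal_utility >= 0]
        return max(eligible, key=lambda a: a.subgoal_utility)

    if __name__ == "__main__":
        print("naive:  ", naive_planner(ACTIONS).name)              # the catastrophe
        print("checked:", supergoal_checked_planner(ACTIONS).name)  # stays sane

The point of the exchange above is that the architectures described in "Friendly AI" are claimed to look like the second planner, not the first, which is why the stomp is argued to be an implausible failure mode there.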

John Marlow


