From: Ben Goertzel (ben@webmind.com)
Date: Sun Feb 04 2001 - 20:24:45 MST
Typically, one distinguishes
-- evolution, which occurs in populations of entities
from
-- adaptation, which occurs in a single entity
It seems like what Christian is talking about is really adaptation, not
evolution.
There is a large CS literature on adaptive learning, which is powerful and
often competitive with evolutionary learning.
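To make the distinction concrete, here is a minimal toy sketch of the two
loops (the quadratic fitness function and all parameters are my own
illustrative assumptions, not anything from that literature):

import random

def fitness(x):
    # Toy objective: maximize -x^2, so the optimum is at x = 0.
    return -x * x

def adapt(x=5.0, steps=200, step_size=0.1):
    # Adaptation: a single entity perturbs itself and keeps improvements.
    for _ in range(steps):
        candidate = x + random.uniform(-step_size, step_size)
        if fitness(candidate) > fitness(x):
            x = candidate
    return x

def evolve(pop_size=20, generations=100, step_size=0.5):
    # Evolution: a population of replicators, random mutation, a fitness
    # function, and resource-limited survival (only the top half breeds).
    population = [random.uniform(-10.0, 10.0) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        survivors = population[:pop_size // 2]        # limited survivability
        offspring = [s + random.uniform(-step_size, step_size)
                     for s in survivors]              # mutation + replication
        population = survivors + offspring
    return max(population, key=fitness)

print("adapted:", adapt())
print("evolved:", evolve())

Both loops climb toward the same optimum; the difference is just whether
the variation-and-selection happens across a population or within a
single entity.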
Of course, adaptive learning, in general, also has the potential to go awry.
(Just like evolution.)
Eliezer's claim is that self-modification of a superhumanly intelligent
system is a special kind of
adaptive learning that's very unlikely to go seriously awry, if it's
initiated in the right way.
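Schematically, the claim has this shape (a hypothetical sketch only;
verify_friendliness() below is a stand-in for whatever invariant-checking
such a system would actually do, and none of these names come from
Eliezer's design):

import random

def verify_friendliness(candidate):
    # Stand-in invariant check: refuse any self-modification that
    # would break the system's goal structure.
    return candidate["preserves_goals"]

def propose_change(system):
    # The system designs its own modification; unlike random mutation,
    # the proposal is engineered, though it may still be flawed.
    candidate = dict(system)
    candidate["capability"] += 1
    candidate["preserves_goals"] = random.random() > 0.1  # occasional flaw
    return candidate

def self_modify(system, rounds=10):
    for _ in range(rounds):
        candidate = propose_change(system)
        # A flawed change is never deployed, so no "selection pressure"
        # can act on it: the filter is part of the fitness metric itself.
        if verify_friendliness(candidate):
            system = candidate
    return system

print(self_modify({"capability": 0, "preserves_goals": True}))

The point of the sketch is just that the mutation mechanism and the
selection step are the same intelligent process, which is what makes this
adaptation rather than evolution over a population.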
He has not proved this precisely, but his argument seems plausible to me.
(My only remaining major disagreement with him is over whether
superhumanly intelligent systems can be expected to retain a significant
interest in humans.... But I have not yet hit upon any really scientific
way to resolve this.)
ben
> -----Original Message-----
> From: owner-sl4@sysopmind.com [mailto:owner-sl4@sysopmind.com] On Behalf
> Of Eliezer S. Yudkowsky
> Sent: Sunday, February 04, 2001 9:12 PM
> To: sl4@sysopmind.com
> Subject: Re: Beyond evolution
>
>
> Discussing things at this level of abstraction is pointless. If you
> believe you've found a specific selection pressure that will necessarily
> produce specific behaviors in a singleton seed AI that undergoes
> successive rounds of self-modification, then say so, describe why, and
> explain how you are extending the phrase "selection pressure" to usefully
> apply in the absence of a population of replicators.
>
> Be concrete.
>
> Christian Weisgerber wrote:
> >
> > You can't outrun evolution. Not in this universe at least.
>
> How would you know? Humanity is a very, very young species.
>
> > It is a very fundamental principle, more fundamental than physical law.
>
> Evolution certainly is more fundamental than our physical laws, since you
> can get evolution under a variety of physical-law scenarios; however, all
> we know about evolution is that we find a lot of it in the absence of
> control by intelligence. This is not enough to make deductions about a
> technological world.
>
> > - A population of replicators.
> > - Mutation.
> > (In its widest meaning, i.e. some change to the replicators.)
> > - A fitness function.
> > - Limited survivability, typically by resource limitation.
> >
> > This may look like a lot of conditions, but good luck trying to
> > find circumstances where they don't apply.
>
> No population of replicators.
>
> > All you are suggesting with your "casting aside evolution" is a
> > replacement of the mutation mechanism, from random change to
> > engineered change by the replicators themselves.
>
> If mutations that lead to undesirable effects are deliberately excluded by
> an intelligent mutation mechanism, then selection pressures for that
> behavior are irrelevant - or, more accurately, the apparent selection
> pressures do not exist, since the fitness metric includes "survivability"
> under the observing mutation mechanism, as well as any trials in the
> external world.
>
> This is all we care about from the Friendly AI perspective, so whether or
> not you can really call it "evolution" is wholly irrelevant.
>
> > This may affect
> > the mutation speed, but it doesn't change one iota about the
> > applicability of the principle of evolution.
>
> Evolution has certain characteristics by which it can be recognized. If
> designed things look absolutely nothing like it, then it doesn't matter
> whether or not you've expanded "evolution" to describe everything in the
> Universe; it just means that you've expanded "evolution" to the point
> where it becomes useless.
>
> Evolution describes a specific subset of the Universe. It describes
> bacteria and people, but not stars (random) or surge protectors
> (designed). Calling something "evolved" is a useful statement, especially
> in Friendly AI, because it enables us to predict certain characteristics
> that appear in evolved things but not random things or designed things.
> If you mutate the term beyond its fitness, it will die.
>
> -- -- -- -- --
> Eliezer S. Yudkowsky http://intelligence.org/
> Research Fellow, Singularity Institute for Artificial Intelligence