Re: (meta-)meta-design (was: Coding a Transhuman AI 2.0a)

From: Eugene Leitl (eugene.leitl@lrz.uni-muenchen.de)
Date: Tue May 23 2000 - 16:28:27 MDT


Anders Sandberg writes:
>
> The problem with evolving GAs, in my experience (I have played a
> little bit with it, and with evolving neural networks and learning
> rules for them), is that in general convergence is slow. Fitness is a

Yes, so let's use evolvable hardware.

> stochastic variable you need to find the expectation of when you try
> to evaluate the fitness of the meta-population, but that takes a lot
> of repeat evaluations - which is the costly part. Without getting rid
> of the noise the meta-level GAs will instead have a very random
> fitness landscape, and convergence seems uncertain.
 
Of course, the space of evolutionary algorithms is a superset of what
we're using today. In a conventional GA the mutation and crossover
operators, as well as the encoding of the system itself, are not
modified in the course of the run. Such restrictions are ad hoc, and
they prevent modification of the algorithmic framework itself, which
guarantees that the algorithm is and remains suboptimal.
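
To make that concrete, here is a minimal self-contained sketch (my own
illustration, not code from the thread) of the smallest step beyond a
fixed framework: a GA in which each individual carries its own mutation
rate, so an operator parameter co-evolves with the solutions. The
OneMax toy fitness and all constants are assumptions for the example.

import random

GENOME_LEN = 64      # bits per individual (OneMax toy problem, assumed)
POP_SIZE = 50
GENERATIONS = 200

def fitness(bits):
    # OneMax: number of 1-bits; stands in for any scalar fitness
    return sum(bits)

def make_individual():
    # genome = (bit string, its own mutation rate)
    return ([random.randint(0, 1) for _ in range(GENOME_LEN)],
            random.uniform(0.001, 0.1))

def mutate(ind):
    bits, rate = ind
    # self-adaptation: perturb the mutation rate first, then apply it
    rate = min(0.5, max(1e-4, rate * random.lognormvariate(0.0, 0.2)))
    bits = [b ^ (random.random() < rate) for b in bits]
    return (bits, rate)

pop = [make_individual() for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    pop.sort(key=lambda ind: fitness(ind[0]), reverse=True)
    parents = pop[:POP_SIZE // 2]            # truncation selection
    pop = parents + [mutate(random.choice(parents)) for _ in parents]

print(max(fitness(ind[0]) for ind in pop))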

> So if you have a population of N individuals in the meta-population
> and M individuals in the population and do k evaluations for each
> individual in the population, you have kNM evaluations to do per
> generation in the meta-population. Doing it on higher meta-levels is
> even worse, even small populations quickly become very slow. Time to
> start running clusters...

Indeed. Clusters, and EHW.
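
To put rough numbers on the blow-up (purely illustrative figures of
mine, not from the thread): with N = 20 meta-individuals, M = 100
individuals and k = 10 repeat evaluations per individual, one
meta-generation already costs kNM = 20,000 base-level fitness
evaluations, and every additional meta-level multiplies that again by
its own population size.

N, M, k = 20, 100, 10   # meta-population, population, repeats (assumed)
print(k * N * M)        # 20000 evaluations per meta-generation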


