Steve Witham <sw@tiac.net> writes:
> Mmm. But there is that phrase from Popper, I think, "Letting our ideas
> die in our steads." The idea being something like, we can evolve faster,
> not by holding onto ideas as organisms hold onto their genes, dying or
> surviving-to-reproduce with them, but by letting our ideas die or survive
> and spread on their own... The funny thing is that until now I thought
> of that as a model for long life without obsolescence. But in this
> context it could be interpreted to mean that Gaia is more flexible if
> She doesn't hold onto us too tightly.
I think one could have a hierarchy of evolution here. More complex and
important systems shouldn't change too quickly, and would also want to
retain their existence in some form. Lesser systems aren't as conscious
about existing and can be evolved much faster. So our ideas can evolve
at a quick pace, feeding a slightly slower memetic evolution (only the
good, or apparently good, ideas survive long enough to become memes),
which in turn guides genetic evolution, which in turn drives
large-scale biospheric change.
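
To make the layering concrete, here is a toy Python sketch of my own
(all the names and numbers are arbitrary illustrations, nothing from
the discussion above): "ideas" churn in a fast inner loop, and only
ideas that survive the churn get promoted into a slowly culled "meme"
pool.

    import random

    def fitness(x):
        # Stand-in quality measure: closeness to an arbitrary target value.
        return -abs(x - 42.0)

    def evolve_ideas(pool, rounds=100, mutation=5.0):
        # Fast layer: mutate every idea, keep only the better half each round.
        for _ in range(rounds):
            mutants = [x + random.gauss(0, mutation) for x in pool]
            pool = sorted(pool + mutants, key=fitness, reverse=True)[:len(pool)]
        return pool

    meme_pool = []
    for generation in range(10):    # the slow layer ticks once per 100 fast rounds
        ideas = evolve_ideas([random.uniform(0, 100) for _ in range(20)])
        meme_pool.append(ideas[0])  # only the best surviving idea becomes a meme
        meme_pool = sorted(meme_pool, key=fitness, reverse=True)[:5]  # gentle culling

    print("meme pool:", [round(m, 2) for m in meme_pool])

The point of the two loops is just the rate difference: the idea layer
is mutated and culled a hundred times for every single update of the
meme layer, so the slow layer only ever sees already-filtered material.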
But the lesson is clear: without some information loss (i.e.,
selection) you cannot have improvement. The best you could do would be
to store all the unfit information in a static state, but that would
quickly swamp you.
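
A minimal sketch of that trade-off (again my own toy illustration, with
arbitrary numbers): run the same mutation process with and without
discarding the unfit, and compare both the quality and the amount that
has to be stored.

    import random

    def fitness(x):
        # Stand-in quality: closeness to an arbitrary target.
        return -abs(x - 42.0)

    def generation(pool, select, mutation=5.0):
        mutants = [x + random.gauss(0, mutation) for x in pool]
        if select:
            # Selection is information loss: the worse half is thrown away.
            return sorted(pool + mutants, key=fitness, reverse=True)[:len(pool)]
        return pool + mutants  # keep every variant ever produced, "statically"

    for select in (True, False):
        pool = [random.uniform(0, 100) for _ in range(20)]
        for _ in range(10):
            pool = generation(pool, select)
        mean = sum(map(fitness, pool)) / len(pool)
        print(f"select={select}: stored={len(pool):6d} items, mean fitness={mean:8.2f}")

With selection the pool stays at 20 items and its mean fitness climbs;
without it the mean barely moves while storage doubles every
generation (20 items become 20480 after ten generations), which is the
swamping effect.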
> In general, I think we can survive AI if
> 1) As thoughts we are actually valuable, whether as ideas, skills,
> branches of alpha-beta trees, entertainments, curiosities, raw
> material for the subconscious, random number generators, or
> examples of autocatalytic sets of a certain size, AND,
> 2) We can convince AIs of this in time AND,
> 3) We can be converted to a form with a good benefit/cost ratio in
> time.
>
> My guess is that step 2 is the hardest.
I like the idea suggested in Brin's _Lungfish_: bring up the AIs as
our own children, and let them assimilate human ideas and values "with
their mother's milk", so to speak. We don't need to do it literally as
in the short story, but if we create an "AI culture", a corpus of
experience and knowledge passed down among AI systems from the first
period of simple thinking towards the SI era, then individual systems
would have the chance to weigh the arguments for humans in the light
of the experiences of the old "child" AIs.
--
-----------------------------------------------------------------------
Anders Sandberg                                      Towards Ascension!
asa@nada.kth.se                            http://www.nada.kth.se/~asa/
GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y