Cycles:
1A. Programmer writes a piece of code to do something.
2A. Code evolves into something small, dense and tight.
1B. Code leaps into new evolutionary space and discovers new algorithm.
2B. Programmer expands algorithm and generalizes it.
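A minimal sketch of that cycle, assuming nothing fancier than hill-climbing
in Python; the bit-string "program" and the stand-in fitness function are my
own toy choices, not anyone's actual EP system:

import random

# Toy EP loop: mutate a candidate "program" (here just a bit string),
# keep the child whenever it scores at least as well as the parent.

def fitness(program):
    return sum(program)                 # stand-in objective: maximize ones

def mutate(program):
    child = program[:]
    child[random.randrange(len(child))] ^= 1   # flip one random bit
    return child

random.seed(0)
program = [0] * 64                      # 1A: the programmer's rough draft
for generation in range(2000):          # 2A: evolution tightens it up
    child = mutate(program)
    if fitness(child) >= fitness(program):
        program = child

print(f"fitness after evolution: {fitness(program)}/64")

Real EP would mutate program trees and score them on real tasks, but the
accept-if-no-worse loop is the whole of step 2A.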
I think that evolutionary programming is mainly useful for three things. At a
low level of power and sophistication, it's good for optimizing pre-existing
programs. At a high level of sophistication, it can invent new algorithms.
And at a very high level, it acts as a symbiont to programmers. A programmer
hacks up an inefficient but general algorithm, and then the EP automatically
uses it as raw material and incorporates it into all existing programs.
To put it another way, imagine that when somebody invented quicksort, all
programs in the world were using it a few moments later. In effect, this
would let the programmer write code on an enormously higher level. As if
the language itself were sentient.
But we don't have that kind of EP yet.
> > Very, very true. A lot of people on this list seem to lack a deep-seated
> > faith in the innate perversity of the universe. I shudder to think what would
> > happen if they went up against a perverse Augmented human. Field mice under a
> > lawn mower.
>
> I don't think the universe is perverse. We just like to think it is,
> since it takes the blame :-)
I believe the universe is perverse - because that belief helps me outwit it.
> > "Who will guard the guardians?" - maybe nanotechnology would give us a perfect
> > lie detector. Nanotechnology in everyone's hands would be just like giving
> > every single human a complete set of "launch" buttons for the world's nuclear
> > weapons. Like it or not, nanotechnology cannot be widely and freely
> > distributed or it will end in holocaust. Nanotechnology will be controlled by
> > a single entity or a small group... just as nuclear weapons are today.
>
> It is this assumption I want to challenge. If it has the tremendous
> destructive potential you assume, it is a fairly logical assumption.
> But can you really back it up with some hard calculations?
How much time would it take for a nanomachine to construct a nuclear weapon?
I think we can assume that nanotech is at least as destructive as a nuke.
> You might be worrying about an imaginary ultra-danger, which will
> suggest a course of action which is less than optimal but sounds
> plausible. Remember that we humans consistently overestimate the risks
> of huge disasters and underestimate small, common disasters, and that
> fear is the best way of making people flock to an "obvious solution",
> especially if it is nicely authoritarian.
Okay, so I'll list my "Top ten ways to destroy all human life using only a few
thousand dollars worth of current technology." Or maybe not.
You'll just have to take my word on this one. We're too damn close to the
brink already. If nanotechnology can build cities, it can destroy them. If
nanotechnology can heal a broken arm, it can cause subtle lesions in the
prefrontal cortex and amygdala. Where nanotech is concerned, the trick isn't
taking over the world - it's doing it so that nobody else notices.
Remember that the Bad Guys aren't operating under the same restrictions as the
Good Guys. The wannabe dictators will use directed evolution on a scale no
Good Guy would dare to contemplate. And evolving mutually competing predators
will go a lot faster than mutually supporting immune systems.
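A toy way to see why (this is my own back-of-envelope model with made-up
numbers, not anything from the literature): an attacker only needs to find
ANY one uncovered weakness, while a defender must cover ALL of them.

import math

# Toy asymmetry model with N distinct potential weaknesses.
N = 1000

# Defender: discovering and patching all N weaknesses by random search
# is the coupon-collector problem, roughly N * ln(N) attempts.
defender_effort = N * math.log(N)

# Attacker: if even 1% of weaknesses are still uncovered, a random
# probe finds a hole in about 1/0.01 = 100 attempts (geometric).
uncovered = 0.01
attacker_effort = 1 / uncovered

print(f"defender: ~{defender_effort:.0f} attempts for full coverage")
print(f"attacker: ~{attacker_effort:.0f} attempts to find one hole")

The defender's job scales with the whole attack surface; the attacker's job
scales with the weakest point.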
Our immune systems are unimaginably more sophisticated than a virus or a
bacterium, using controlled evolution to combat natural evolution. And yet we
still suffer from colds and diseases. The only reason that the viruses
haven't killed us outright is that it isn't good strategy.
I want to repeat this, because it's important. Our immune systems are the
closest analogue to proposed nano-immunities. The mismatch in available power
and sophistication is enormous. Our immune systems learn from experience, use
controlled and directed evolution, have memories... everything but the ability
to consciously design things. And yet the viruses waltz casually through our
bodies, because it's so much easier to destroy than create. Sometimes a virus
which has deliberately evolved not to kill the host will be expelled from the
body after a few weeks. Lethal viruses that have taken the gloves off thrive
for years under continuous assault by our immune systems and the designed
pharmaceuticals of science. And really lethal viruses, the truly destructive
ones, kill us directly.
Imagine a virus as sophisticated as the immune system. No contest. The sight
would be pathetic, or even moving, if the immune system managed to stay on its
feet for a few milliseconds. And, thanks to the relative speeds of evolved
destruction and evolved protection, the viruses will actually be more sophisticated.
It is easier to destroy than create!
> I think you are partially right, nanotech will be dangerous, but we
> have to estimate the threat levels and what countermeasures that can
> be created before we jump to conclusions about future politics. For
> example, if decent immune systems can be created then the dishwasher
> goo scenario is unlikely, and if relatively few have the twisted
> genius and expertise to design Hollywood goo then it is a potential
> danger but with a likelihood of occurring that is low enough for some
> planning to be done (like moving outwards, which ought to be feasible
> at the assumed tech level). We need to get some estimates of these
> factors.
Well... I'm not competent to estimate the percentage of the population with
the genius and expertise to design death goo. The "twisted" part can pretty
much be taken for granted. And I truly don't think death goo would be that
hard to design. If any human is even capable of designing an immune system,
then the average educated person will be capable of breaking it, given time
and effort. Any twisted genius will go through it like tissue paper.
As with nuclear weapons, the issue is really quite simple. This is an
occasion where the destructive power of X is not diminished by spreading it
among more hands. Military and governmental power is diminished by
distribution; hence the U.S. government. Nuclear power is diminished by MAD;
any number of Nuclear Powers greater than two decreases stability. The
situation will be pretty much the same with nanotechnology... except that a
first strike will have a different probability of succeeding. If that
probability is high enough, MAD won't work and nano should be confined to a
*single* group.
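To see why the number of hands matters, here's a back-of-envelope
calculation; the per-power strike probability is a number I made up purely
for illustration:

# Toy stability model: p is an assumed, illustrative per-power chance
# of a first strike in a given year; (1 - p)**n is the chance that
# none of n powers strikes.
p = 0.01
for n in (1, 2, 5, 10, 100):
    print(f"{n:3d} powers -> P(no first strike this year) = {(1 - p) ** n:.3f}")

Stability decays geometrically with the number of independent powers, which
is why "spread it among more hands" doesn't dilute the danger.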
--
sentience@pobox.com    Eliezer S. Yudkowsky
http://tezcat.com/~eliezer/singularity.html
http://tezcat.com/~eliezer/algernon.html
Disclaimer: Unless otherwise specified, I'm not telling you everything I
think I know.