BOOK: _Emergence_

From: J. R. Molloy (jr@shasta.com)
Date: Tue Nov 13 2001 - 05:04:44 MST


Of Slime Mold and Software
http://www.prospect.org/print/V12/20/blume-h.html
Emergence: The Connected Lives of Ants, Brains, Cities, and Software
By Steven Johnson. Scribner, 288 pages, $25.00

It's easy to see why there aren't more books like Steven Johnson's Emergence:
Only Johnson knows how to write them. Johnson was a founder and editor of
Feed, one of the Web's first and best "zines" (now moribund, unfortunately,
thanks to the economic downturn). Feed aimed to show that there was no
contradiction between maintaining high literary standards and creating online
community, and it succeeded, in large part, because Johnson himself is both
media savvy and a skilled writer. Johnson's first book, Interface Culture
(1997), was probably the single most memorable volume to come out of the
Internet explosion of the 1990s. It was an intellectually bold, often
exhilarating read, full of unexpected perspectives on culture and digital
media. The new book is exactly the volume Johnson needed to write next. It
takes up the earlier arguments, expands them in the light of a body of
fascinating material, and winds up being, if anything, still more of a tour de
force.

You get a feeling for what's to come from the book's first page, a diagram of
a human brain positioned above a map of Hamburg. The two shapes are strikingly
similar, which, as Johnson sets out to show, is no accident. Nature and
society overflow with correspondences. To see, from Johnson's viewpoint, how
such correspondences come about and how we have learned to recognize and,
lately, make use of them entails a survey of ant life; Roman settlements;
subways; Manchester, England; the early work of Friedrich Engels; the career of
Alan Turing; ants again; cybernetics; and video games--with a stop in there
for how the "shrieking of the demons" in John Milton's Paradise Lost spurred
one complexity theorist on to a major insight.

Johnson defines emergence as "the movement from low-level rules to high-level
sophistication." Emergent systems are "not top-down. They get their smarts
from below." Like the emergent systems it describes, the book starts simply,
with a discussion of the one-celled creatures known as Dictyostelium
discoideum, or slime mold. Johnson doesn't just begin with slime mold; he
comes back to it again and again. If the book could be said to have a main
character, slime mold would have to be it. To grasp the workings of
Dictyostelium discoideum is, as Johnson takes no small delight in affirming,
to effect a revolution in thinking. It is to finally correct the mental error
that has for so long kept us from understanding why birds flock and liver
cells don't turn into spleen cells. Johnson calls that error the "pacemaker
hypothesis." The hypothesis assumes that complex behavior depends on some form
of centralized authority. In the case of ants, it would be that the colony is
governed by the queen. In the case of slime mold, it would be that the
decision to cohere as a swarm is triggered by the dictates of alpha slime
cells.

Neither assumption could be further from the truth. The ant queen is kept
literally in the dark, way underground, far from the decisions about foraging
and fighting in which she plays no part. Similarly with slime cells: Their
trademark behavior does consist of assembling into a swarm that crawls "across
the garden floor, consuming rotting leaves and wood as it moves about," only
to disintegrate back into individual units when conditions change. Slime mold
manages this trick--what complexity theorists call a "phase change"--without
benefit of a higher authority. "All slime mold cells," as researchers were
puzzled to discover, "are created equal." They also equally obey certain
rules. And they monitor one another's behavior constantly. So when the weather
gets cool and food becomes scarce, each cell senses that the one and only
slimy thing to do is to swarm. What it comes down to in the
molecular fine print is a chemical that the cells emit. The more a cell senses
that chemical, the more heartily it contributes to the chemical chorus. And
what sufficient quantities of that chemical say in terms no slime cell can
resist is: "Swarm."
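
The recipe is simple enough to sketch in a few dozen lines of Python. The toy
model below is my own, not Keller and Segel's; every constant in it is
invented for illustration. Cells on a ring emit a chemical, emit more the more
of it they sense, and drift toward the stronger smell, and that alone pulls
them into clumps.

    import random

    # Toy model of slime-mold aggregation by local chemical signaling.
    # Cells on a one-dimensional ring emit a pheromone-like chemical;
    # the more a cell senses, the more it emits (the positive-feedback
    # "chorus"), and every cell drifts toward the stronger nearby signal.
    # All constants are invented for illustration.
    SIZE, CELLS, STEPS = 100, 40, 300
    BASE_EMIT, GAIN, DECAY, CAP = 0.1, 0.5, 0.8, 50.0

    random.seed(0)
    positions = [random.randrange(SIZE) for _ in range(CELLS)]
    field = [0.0] * SIZE

    for _ in range(STEPS):
        # Emission: a small baseline plus a boost proportional to the
        # chemical already present where the cell sits.
        for p in positions:
            field[p] += BASE_EMIT + GAIN * field[p]
        # The chemical decays and diffuses to neighboring sites
        # (capped so the positive feedback cannot run away).
        field = [min(CAP, DECAY * (0.5 * field[i] + 0.25 * field[i - 1]
                                   + 0.25 * field[(i + 1) % SIZE]))
                 for i in range(SIZE)]
        # Movement: each cell steps toward whichever neighbor smells stronger.
        positions = [(p + 1) % SIZE if field[(p + 1) % SIZE] > field[p - 1]
                     else (p - 1) % SIZE
                     for p in positions]

    print("cells end up on", len(set(positions)), "distinct sites,",
          "having started scattered across the ring")

No cell in this sketch knows anything beyond its own patch of chemical, yet
run it and the population collapses into a handful of clusters.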

Self-organizing behavior is no stranger to scientists today. But in 1969, when
Evelyn Fox Keller and Lee A. Segel put it forward as an explanation for slime
swarming, biologists were mystified. The idea of complexity resulting from
bottom-up interactions had been explored in urban planning by Jane Jacobs and
in mathematics by Alan Turing, among others. In fact, Turing's late musings
about how complex organisms could arise from simple cells jogged Keller into
conceiving the new slime paradigm. But for the most part, biologists remained
prejudiced in favor of cellular authority. For them, complex behavior was an
orchestra led by its conductor rather than, say, a string quartet whose
members listen to one another well. It was no small matter that biologists
were stuck on pacemakers; it meant they really didn't have a handle on how the
governing idea of their discipline--Darwinism--actually worked. In a sense, it
was as if they held back from admitting the full implication of natural
selection: namely, that it requires no outside agitators, no presiding
spirits, no--as the philosopher Daniel Dennett puts it--skyhooks. In fact,
evolution is the best possible illustration of what Johnson means when he
writes about "complex adaptive systems that display emergent behavior."
Evolution is the showcase example of simple things becoming more complex, dull
things becoming smarter under pressure to survive.

It took a while, but biologists caught on. It didn't hurt that computer
scientists soon learned to model adaptive behavior on-screen. One of the more
amazing stories that Johnson tells concerns Danny Hillis's experiment with
genetic algorithms. Unlike traditional computer programs, which are static,
genetic algorithms change over time. The ones best suited to solving a given
problem are allowed to survive and, in effect, to reproduce; the others, in a
digital simulation of extinction, are deleted. Hillis's idea was to set some
undistinguished bits of software loose on the problem of sorting data. Sorting
data--in alphabetic order or by any other index--is a task that takes even
computers a lot of time when the data set is large enough. Not surprisingly,
computer scientists have devoted considerable energy to devising the most
efficient sort, the one that can get the job done with the fewest computer
instructions. In the Hillis experiment, programs that managed the job best
passed their code on to the next generation, after being subjected to a bit of
random mutation (à la cosmic rays) and code-swapping (à la sex) with other
promising miniprograms. Hillis, a founder of Thinking Machines Corporation,
had the company's supercomputers at his disposal and was able to churn through
thousands of evolutionary cycles in mere seconds.
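
Hillis's own code ran on Connection Machines and isn't reproduced in the book,
but the loop itself, mix, mutate, evaluate, repeat, fits in a page of Python.
The sketch below is my own illustration, evolving small comparator networks
that sort six inputs rather than Hillis's sixteen; the population size, rates,
and network length are all arbitrary.

    import itertools, random

    # A toy genetic algorithm in the spirit of Hillis's experiment: it
    # evolves sorting networks (fixed lists of compare-and-swap steps)
    # for six inputs. An illustrative sketch only; Hillis worked with
    # sixteen inputs and vastly larger populations.
    N, POP, GENS, NET_LEN = 6, 100, 150, 20
    random.seed(1)

    # By the zero-one principle, a network that sorts every binary
    # sequence sorts everything, so binary sequences are the test set.
    TESTS = list(itertools.product((0, 1), repeat=N))

    def apply_net(net, seq):
        s = list(seq)
        for i, j in net:
            if s[i] > s[j]:
                s[i], s[j] = s[j], s[i]
        return s

    def fitness(net):
        return sum(apply_net(net, t) == sorted(t) for t in TESTS) / len(TESTS)

    def comparator():
        i, j = sorted(random.sample(range(N), 2))
        return (i, j)

    def mutate(net):
        net = net[:]
        net[random.randrange(NET_LEN)] = comparator()   # "cosmic ray"
        return net

    def crossover(a, b):                                # code-swapping, a la sex
        cut = random.randrange(1, NET_LEN)
        return a[:cut] + b[cut:]

    pop = [[comparator() for _ in range(NET_LEN)] for _ in range(POP)]
    for gen in range(GENS):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) == 1.0:
            break
        parents = pop[:POP // 2]                        # the rest go extinct
        pop = parents + [mutate(crossover(*random.sample(parents, 2)))
                         for _ in range(POP - len(parents))]

    print("generation", gen, "- best network sorts",
          f"{fitness(pop[0]):.0%}", "of all inputs")

A run of this sketch climbs steadily toward a perfect network; the point, in
any case, is the loop, not the product.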

Hillis found that the most efficient routine achieved a 72-step sort, as
compared with the 60 steps required by the fastest human-coded algorithms. Not
satisfied with this gap, however, he introduced predators into the simulation.
Predators didn't allow programs to rest on 72-step laurels; they forced the
sorting routines to risk getting slower on the chance of getting faster. (As
the poet Rainer Maria Rilke has observed, "abiding is nowhere.") With
predators at their heels, some routines drove on to a 62-step sort. The
astonishing thing about those 62 steps was that Danny Hillis couldn't make
heads or tails of them. Hillis had figured out how to link up 64,000 cheap
microprocessors so that running in parallel they were competitive with the
best supercomputers Cray or Fujitsu had to offer. But how a mere 62 lines of
code did their job, even though what they accomplished was plain, remained
mysterious to him. The simulation seemed to have goosed into being an utterly alien approach
to problem solving.
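
The book only gestures at the particulars of Hillis's host-parasite scheme,
but the principle grafts naturally onto the sketch above (reusing its
apply_net and random): let the test cases evolve too. A hypothetical
extension, for illustration only:

    # Predators, grafted onto the sketch above (illustration only; the
    # details of Hillis's scheme differed). Instead of facing every
    # binary input, each network faces a small evolving population of
    # test cases. A test case scores by how many networks it breaks,
    # so easy tests die out and no network can rest on its laurels.
    def net_score(net, tests):
        return sum(apply_net(net, t) == sorted(t) for t in tests) / len(tests)

    def test_score(test, nets):
        return sum(apply_net(n, test) != sorted(test) for n in nets) / len(nets)

    def evolve_tests(tests, nets, keep=8):
        tests.sort(key=lambda t: test_score(t, nets), reverse=True)
        fresh = []
        for _ in range(len(tests) - keep):          # refill by mutation:
            t = list(random.choice(tests[:keep]))   # flip one bit of a
            t[random.randrange(len(t))] ^= 1        # surviving predator
            fresh.append(tuple(t))
        return tests[:keep] + fresh

    # Each generation: rank networks by net_score(net, tests) instead of
    # fitness(net), then run tests = evolve_tests(tests, pop[:20]).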

So much for the old adage that a program can do only what its programmers tell
it to do. From here on, in Johnson's view, programs that are so constrained
will be considered the dullards of software. The really interesting problems
will be treated to genetic algorithms. Programmers will adopt the Hillis
method: "Mix, mutate, evaluate, repeat." And be amazed. In the same bottom-up
spirit, in 1999 the computer scientist Marco Dorigo approached the notoriously
difficult "traveling salesman" problem. What is the shortest route between n
stops that goes through each of them once and once only? When n gets large
enough, as happens on telephone and computer networks, there's no surefire
answer, only educated guesswork. Dorigo found that his ant-colony algorithms,
modeled on the pheromone trails of foraging ants, guess better than humans
can. Johnson reports that France Telecom, British
Telecommunications, and MCI now let emergent software solve their routing
problems.
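
Dorigo's production systems are beyond reach here, but the flavor of
ant-colony optimization is easy to convey. In the toy below (my own, with
made-up cities and parameters), simulated ants build tours stochastically,
biased by a shared pheromone table that short tours reinforce and evaporation
erodes:

    import math, random

    # Toy ant-colony optimization for the traveling-salesman problem,
    # in the spirit of Dorigo's approach. The cities and all parameters
    # are invented for illustration.
    random.seed(2)
    CITIES = [(random.random(), random.random()) for _ in range(15)]
    N = len(CITIES)
    dist = [[math.dist(a, b) for b in CITIES] for a in CITIES]
    pher = [[1.0] * N for _ in range(N)]        # shared pheromone table
    ANTS, ROUNDS, EVAP, ALPHA, BETA = 20, 100, 0.5, 1.0, 2.0

    def tour_length(tour):
        return sum(dist[tour[i]][tour[(i + 1) % N]] for i in range(N))

    def build_tour():
        # Each ant walks city to city, preferring short hops along
        # heavily scented edges.
        tour, unvisited = [0], set(range(1, N))
        while unvisited:
            here, cand = tour[-1], list(unvisited)
            weights = [pher[here][c] ** ALPHA / dist[here][c] ** BETA
                       for c in cand]
            step = random.choices(cand, weights)[0]
            tour.append(step)
            unvisited.remove(step)
        return tour

    best = None
    for _ in range(ROUNDS):
        tours = [build_tour() for _ in range(ANTS)]
        for row in pher:                        # pheromone evaporates...
            for j in range(N):
                row[j] *= EVAP
        for t in tours:                         # ...and short tours lay more
            deposit = 1.0 / tour_length(t)
            for i in range(N):
                a, b = t[i], t[(i + 1) % N]
                pher[a][b] += deposit
                pher[b][a] += deposit
        champ = min(tours, key=tour_length)
        if best is None or tour_length(champ) < tour_length(best):
            best = champ

    print(f"best tour found: {tour_length(best):.3f}")

No single ant holds a map; the colony's answer lives in the pheromone table,
which is as pure a case of getting smarts from below as the book offers.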

Johnson calls such applications examples of artificial emergence and thinks
that we are on the threshold of an age that will be defined by them. We now
not only understand the ways of slime mold, we can simulate them; and beyond
that, we can harvest the results. In Johnson's view, the influence of
artificial emergence is or soon will be making itself felt not only on
software but in politics, mass media, corporate structure, and the shape of
the movement against globalization. In phrases calculated to set off echoes,
he writes: "Up to now, the philosophers of emergence have struggled to
interpret the world. But they are now starting to change it."

Johnson's account of the prehistory of emergence--how it itself has emerged
from urban planning, entomology, information theory, cell biology,
neuroscience, and other disciplines to become a discipline of its own--is a
masterful piece of storytelling, but only a sort of warm-up for him.
Artificial emergence is his real terrain. Feed's experiments with interface
design and online community gave him experience not only in studying emergent
behavior but in shaping it. The author's experience as a maker of the
phenomena he describes
distinguishes Emergence from a book like James Gleick's Chaos: Making a New
Science, to which it in other respects bears comparison.

Johnson's discussion of artificial emergence elaborates on points he made in
Interface Culture. There he had claimed great things for the computer
interface: It was about to become the platform for a twenty-first-century
avant-garde that would filter and interpret reality for us as novelists did
for the nineteenth century. The "Victorians," he wrote, "had writers like
Dickens to ease them through the technological revolutions of the industrial
age." The computer interface would similarly guide us through "the virtual
cities of the twenty-first century."

In the years since Interface Culture, I, for one, have wondered if this
prediction was another of those flights of optimistic fancy, like the notion
that the Dow Jones average would plateau at 36,000, that had best be forgotten
in the light of new realities. After all, despite hits from the Justice
Department, Microsoft Windows still straddles the desktop like an ancien
régime. No sign of emergence there: Microsoft is one of the places the
pacemaker hypothesis holds true. And not even the most fervent supporters of
the open-source alternative to Windows would count the computer interface
among its strengths. Where, then, is the Johnsonian avant-garde?

As Emergence makes abundantly obvious, it is flourishing in the world of
games, such as SimCity and its successor The Sims, and in the articulation of
online communities, such as Slashdot. Lest video games be derided as
inconsequential compared with Windows and the novel, Johnson stresses the
opposite view: Video games are poised to replace pornography itself as the
leading engine of technological development. "Because part of their appeal
lies in their promise of new experiences," he writes, "and because their
audience is willing to scale formidable learning curves in pursuit of those
new experiences, games often showcase cutting-edge technology before the tech
makes its way over to the red-light district. Certainly that has been the case
with emergent software."

In Emergence, Johnson has put some powerful ideas through a literary feedback
loop that will, in all likelihood, accelerate and magnify their effect on our
culture. Does he occasionally display some of the tics and tremors that come
with believing you are hot on the trail of a theory of everything? Absolutely.
He thinks emergent software, and not, say, more public transportation, is the
solution to traffic jams. "Make the traffic lights smart," he writes. "You can
conquer gridlock by making the grid itself smart." This is not the
technorealism Johnson was once known for espousing as a middle road between
technophiles and technophobes. Nor is it necessarily technorealistic to
imagine that Bach-like music composed by emergent code will lack nothing it
needs to seem to us as "sweet" as Bach's own works. But these aberrations (or,
in the second case, serious complications that call for more reflection) in
no way compromise the value of the book.

Interface Culture gave me food for thought for years. I expect the same from
Emergence. I keep coming back to that sorting routine Danny Hillis couldn't
parse. That was only 62 lines of code. What 10-million-line piece of emergent
intelligence now slouches toward a compiler to be born? Will it be a rough
beast at least vaguely familiar to us? In other words, will it illuminate an
aspect of intelligence latent in the human brain? Or will it prove to be as
alien and inscrutable as a Hillis sort?

--- --- --- --- ---

Useless hypotheses, etc.:
 consciousness, phlogiston, philosophy, vitalism, mind, free will, qualia,
analog computing, cultural relativism, GAC, Cyc, Eliza, cryonics, individual
uniqueness, ego, human values, scientific relinquishment, malevolent AI,
non-sensory experience

We move into a better future in proportion as science displaces superstition.


