Re: Professional intuitions

From: Damien R. Sullivan (phoenix@ugcs.caltech.edu)
Date: Fri Sep 25 1998 - 21:08:21 MDT


On Sep 25, 2:51pm, "Eliezer S. Yudkowsky" wrote:

> and he doesn't sound like a crank, he didn't say anything I know is false, so
> I have no choice but to trust his conclusions - to default to Drexler on the
> matter of nanotechnology.
 
Tangent:
I haven't read it yet, but most chemists I know laugh at it. So I default to
a neutral state: he may have some nice equations; they have actual experience
with reaction chemistry.

> Oh, I'm sure you understand AI! Enough to come up with original ideas, for that
> matter. Still, you've got more powerful intuitions in social science. I
> daresay that you may even be leveraging your understanding of AI with your

Tangent: Robin studied physics from 1977 to 1984, and worked in AI from 1984
to 1993. *Then* he switched to economics. This is from his web site. I
don't know where he feels he has more powerful intuitions, but I wouldn't
assume they were in economics, or that your intuitions are better than his,
at least not based on experience.

> superintelligence and the future. It doesn't matter whether we both possess
> invent-level intelligence in the field, because our specialties are different.
 
Specialty? What's Robin's specialty? He's currently focused on econ, but he
spent half your life in AI.

> The Big Bang was instantaneous. Supernovas are very bright. State-vector
> reduction is sudden and discontinuous. A computer crashing loses all the
> memory at once. The thing is, human life can't survive in any of these areas,
> nor in a Singularity, so our intuitions don't deal with them. The Universe is

I'm not sure what analogy you're trying to make here. The one I'm
constructing out of these instances is "Yes, the universe is full of many
sudden phenomena. Big, simple, destructive phenomena. But the destruction
from the Singularity is a side-effect; what it *is* is a sudden increase in
complexity. A creative discontinuity, not a destructive one."

And a sudden jump in complexity is implausible. The closest analogies which
come to mind are crystallization and the probable spread of bacteria through
the early oceans. I'm not sure the former is valid; as for the latter, we
don't know for certain what happened, and there's a unique qualitative change
there: the rise of self-replicating entities on an otherwise dead planet.

Hmm. For judging the suddenness of a new level, the proper vantage is
probably the level just below. No fair judging the Industrial Revolution by dogs or
rocks, or the Singularity by uncontacted New Guinea highlanders. The rise and
spread of bacteria is unique because the previous level -- rocks -- had no
perception. Of course, if we take the previous level to be chemistry at
liquid reaction timescales, the rise probably took a while.

And so I'd disqualify the rise and spread of Homo sapiens; geologically a
blip, but for large mammals not that sudden. Certainly African and Eurasian
animals were able to cope, until we went to the next level or two.

So I'd say the Singularity thesis is that the level of literate and corporate
humans will give rise to a level of self-improving AIs in a manner which will
seem discontinuous to that human level. Robin and I say "Yeah,
right." I don't think supernovae and crashing computers are good analogies to
get us to admit our intuition might be wrong.

> The phenomena I'm talking about would have destroyed life as we know it if
> they had appeared at any point in the past. How, exactly, am I supposed to
> find an analogy? It's like trying to find an analogy for Earth falling into

By finding a phenomenon that destroyed life as it knew it, while increasing
complexity. (Pedants: I know supernovae create elements for the rest of us,
but the complexity boost happens later, far away, not in the explosion.)

-xx- Twirlip of Greymist X-)

Death is for animals; immortality for gods. Technology is the means by
which we move from one state to the other.


