From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Fri Sep 17 1999 - 08:58:41 MDT
Robin Hanson wrote:
>
> The fear is usually more of a small lab or country taking over, instead
> of a few countries. I find it hard to take seriously as a concern,
> but others on this list most clearly do.
I would also expect that the minimum size for a sudden takeover would be
a small country, a very large lab (major corporation), or more likely a
large country. I wouldn't expect a small lab to be capable of it,
unless AI has advanced to the point of semiautomated design by the time
of the assembler breakthrough.
Note that if the assembler breakthrough does occur in a small lab, then
we have a *real* problem, because more than one large power might get
ahold of the technology. "One person can take sips from a tidal wave; a
thousand people, or two people in competition, cannot..."
> >Fortunately, the step-by-step nature of technology advance *does* tend
> >to argue against the likelihood of an easy or thorough global coup.
> >Actually, to be a bit more precise about it, surely the halting,
> >trial-and-error process of technology argues against a global coup done
> >solely or even primarily through tech advance!
>
> Yes, that is my intuition. But the question I was facing was why
> others have such dramatically different intuitions.
I think it's because of how we visualize the technology. For example,
no matter how crude nanotechnology is, once you can infiltrate NBC
security and cause small explosions at precisely defined locations,
that's enough to conquer most countries.
--
Eliezer S. Yudkowsky          sentience@pobox.com
http://pobox.com/~sentience/tmol-faq/meaningoflife.html
Running on BeOS       Typing in Dvorak       Programming with Patterns
Voting for Libertarians       Heading for Singularity       There Is A Better Way