Re: NANO: Institutional Safety

From: David Blenkinsop (blenl@sk.sympatico.ca)
Date: Mon Nov 15 1999 - 22:04:48 MST


Earlier, Robert J. Bradbury wrote:

>
> David Blenkinsop <blenl@sk.sympatico.ca> wrote:
>
> > In the _Diaspora_ novel, the transhuman societies somehow maintain what
> > they think of as a wisely nonexponential or nonexpansive security
> > arrangement, where they leave enormous tracts of natural resources
> > completely untouched.
>
> There is a declining Return on Investment as you get bigger because you
> have increased communications delays and power costs. Until we adopt
> fundamentally different time scales for entertaining "thoughts" (weeks
> or months) it may not make sense to utilize all of the natural resources.
> (Sure you can turn all of the asteroids into VR simulations of "you",
> but what good does that do "you"?!? . . .

OK, check me if I'm wrong, but isn't this a bit like asking why any
group of people or any society would tend to expand based on accessible
resources? For one thing, there is a reproductive (or evolutionary)
selection pressure in favor of those who both want to expand and are
successful at it. This may sound like circular logic, but that's
evolution for you! We generally see the success stories of differential
selection rather than the ones that got left behind in the race.
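
To put rough numbers on that selection argument, here's a minimal toy
sketch in Python (the growth rates are entirely hypothetical, chosen
only to illustrate the point):

    # Toy model of differential selection: two lineages start equal,
    # but one expands faster per generation. All rates are made up.
    conservative, expansionist = 1000.0, 1000.0
    for generation in range(200):
        conservative *= 1.01   # grows 1% per generation
        expansionist *= 1.06   # grows 6% per generation
    share = expansionist / (conservative + expansionist)
    print(f"expansionist share after 200 generations: {share:.5f}")

After a couple hundred generations the expansionists make up essentially
the entire population, which is exactly why the "success stories" are
all we tend to observe.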

>
> > This despite the fact that they could very readily get
> > into colonial competition for settling those resources --
>
> The two main motivating forces I see for colonialism were:
> a) a desire for freedom -- but in a personal VR, you have the
> "ultimate" freedom.
> b) the quest for "rare" natural resources (e.g. gold, silver,
> spices, etc.) -- these aren't "rare" in a nanotech environment.

To this you should add:

c) the desire for power, whether over other people or over the material
world in general

d) the desire to become influential in the sense of adding to the
success of the ideas, thoughts, or lifestyles that one values the most

e) the inadvertent, or preconscious, tendency that humans share with
other living things to create a certain "influence on history" for
those who happen to be evolutionarily successful

Note that "colonial" is often used to refer to the 19th century West's
empire building and global resource mining. IMO, the term could also be
appropriately used for any particularly intense modern competition over
disputed resources, especially if the resources are newly accessed, like
space resources or ocean resources. What occurs to me is that maybe it's
just politically incorrect to refer to current or future
territory/resource races as "colonial". A better term would be --?
Anyway, following my concern that future societies could get into
"colonial competition for settling resources", I also mentioned that new
resources might be employed by one side for "building an overwhelming
force of arms".

Robert J. Bradbury wrote:

>
> In a nanotech environment, the concept of an "overwhelming"
> force of arms is very questionable. You have to guarantee
> that you have disassembled *every* last little bit of nanotech
> in an enclave that can have berserker potential.

Well, by that reasoning we had better watch out that the representative
factions from every past war don't find some opportunity to poison our
breakfast cereal tomorrow morning! Seriously, in the relatively
long-run scenarios we are discussing here, active shield systems,
nanomedical upgrades, normal security arrangements, etc., are all
supposed to help prevent us from being anonymously murdered by some
random little bit of nanotech that someone hid away.

Where I think Robert Bradbury perhaps begins to answer my earlier
concern about exponential arms races is when he says:

> No, you understand it -- the physical stuff increases exponentially
> but that doesn't mean that "efficient" designs do or your ability
> to use it effectively does.

It may well be true that even a fairly advanced, replicable nanofactory
setup would face definite limits on the sheer volume of manufactured
weaponry that anyone would find manageable or maintainable.

Also:

>
> Claim: The Resource Base grows so much faster than the population
> that there is no incentive to fight over the resources until
> uploads or AI arrives.
>

Ah, now we've got the whole exponential growth problem back again! Say
that our nanofactory-building power seekers have their setup programmed
with sophisticated AI? Or say that the nano-system software upgrades
itself so reliably that it might as well be a strong AI system in terms
of the control it gives its builders? This won't happen tomorrow, but
it *may* happen as an outgrowth of true nanofactories, on the other
side of the Nano-Singularity, as it were. The result is the same if the
builders are keen on upload production in the name of pioneering a
transhuman society: such a society might well hit a biosphere-killing
exponential growth curve.
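
To see how steep that curve could be, here's a rough doubling-time
calculation (every number below is an illustrative assumption, not an
engineering estimate):

    # How fast could a self-replicating nanofactory base outgrow the
    # biosphere? Seed mass, biomass, and doubling time are all assumed.
    import math

    seed_mass = 1.0            # kg: one initial replicator (assumed)
    biosphere_mass = 2e15      # kg: rough order-of-magnitude biomass
    doubling_days = 7.0        # assumed replication doubling time

    doublings = math.log2(biosphere_mass / seed_mass)
    print(f"doublings needed: {doublings:.0f}")
    print(f"time to match biosphere: {doublings * doubling_days:.0f} days")

At an assumed one-week doubling time, about fifty doublings -- well
under a year -- would suffice to match the mass of the entire
biosphere. Change the assumptions as you like; the exponent does the
real work.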

In opposition to the concern that anyone would engage in an exponential
arms race, Rik van Riel wrote:

> Why? We've all seen that in the cold war there were 'enough'
> weapons to ensure mutual destruction. With nanotech that point
> will be reached much earlier and the 'enough' will be even
> more extreme. After that there shouldn't be any incentive to
> build more weapons (since there's no need for more).

As I recall, there was a lot of debate about how much was enough and
whether the other side might have a decisive *more*. Even in the absence
of AI, suppose we'd had quite capable nanofactories during the Cold War:
mightn't the mass of weapons have come out worse, maybe even *much*
worse?
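
One way to see why "enough" is unstable is Richardson's classic
arms-race model (my illustration, not something from the original
thread; the coefficients are invented):

    # Richardson arms race: x' = k*y - a*x, y' = k*x - a*y.
    # Each side's stockpile grows in reaction to the rival's (k)
    # and shrinks under maintenance/fatigue costs (a).
    def richardson(k, a, steps=50, dt=0.1):
        x = y = 1.0
        for _ in range(steps):
            x, y = x + dt * (k * y - a * x), y + dt * (k * x - a * y)
        return x, y

    print("cold-war-ish rates:", richardson(k=0.5, a=0.4))
    print("nanofactory rates: ", richardson(k=5.0, a=0.4))

Whenever reaction outruns restraint (k > a), the stockpiles diverge
rather than settling at "enough"; crank up the production coefficient
the way cheap nanofactories would, and the runaway only gets steeper.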

Billy Brown wrote:

> No, you haven't missed anything. Nanotechnology makes it almost impossible
> for multiple sovereign powers to coexist on a single planet, because they
> will tend to find themselves forced into an exponential arms race that
> converts the entire mass of the planet into industry and weaponry. The fact
> that there is no room to deploy effective defenses against many high-tech
> weapons makes the problem even worse.

Optimistically, I tend to think that a combination of "active shield"
defense systems plus a strong, world-level offensive deterrent *should*
be able to dissuade anyone from kicking into "hyper" nanoweapons
production. Notice, though, that this is potentially a real problem:
it doesn't seem particularly easy to decide just when a "difficult"
state should be hit with a deterrent. For instance, do you wait until
there is evidence that State X has produced one whole mountain load of
weapons, or do you wait for two? Is the deterrent one of bombing, or of
disabling their tech somehow? And so forth. Certainly, it looks to me
as though we can't afford to ignore such problems, even if it's true
that the very worst scenarios require particularly advanced
technologies! In sum, I just don't know: how *do* we prevent nano arms
races -- or is it maybe just as well to let such races run their
course, until everyone involved pulls back from the brink, exhausted?

David Blenkinsop <blenl@sk.sympatico.ca>


