Re: fleeing nano, bio, nuke hazards

From: Samantha Atkins (samantha@objectent.com)
Date: Wed Jul 10 2002 - 00:53:14 MDT


spike66 wrote:
> Back a few weeks ago when we were talking about attempting
> to cross interstellar space with current technology, we were
> derailed in debating the possibility of outrunning a singularity. It
> never entered the discussion that there are plenty of dangers
> faced by humanity besides a singularity, which might actually be
> more effective in scaring a dozen Gatesian fortunes into being
> dedicated to the long-shot attempt.
>
> The reason the other dangers might be more effective is that
> there might be a great majority of capital holders who are not
> convinced that the singularity poses a real risk, or that even if
> a singularity is possible, it should not be avoided. Yet we
> can all see that the danger of nuclear, bio and even nanotech
> warfare and terrorism is ever increasing. (The Palestinians
> and Israelis are trying to start the next world war, for instance. They
> have been successful in drawing even peaceful-minded
> extropians and transhumanists into their conflict, as many here
> have chosen one side or the other. Don't give in to that.)
>

You must be joking. Israel has acted in ways that we and the UN
would jump all over if it were any other country (other than
ourselves). But that conflict is incapable of starting a world
war; at the very least, it is highly unlikely to.

I haven't noticed that transhumanists and extropians are
particularly peaceful-minded; they seem to run the gamut. Is it
extropian to bury our heads in the sand about situations in the
world? Why shouldn't extropians have opinions about such
things? If extropian/transhumanist viewpoints have good general
validity and are worthwhile memetic constructs, they will provide
useful points of view and considerations for various world
events, and for how we should move forward in these situations,
or so it seems to me.

> If something *really* scary happens, such as a nanotech terrorist
> attack, then the big money may conclude that it is only a matter
> of time before gray goo devours the planet, and furthermore there
> is every reason to believe a spaceship could outrun that danger.
> The big money may conclude that humanity has not the luxury
> of waiting for uploading, but rather must act right away.

A nanotech terrorist attack before we even have reasonably
working nanotech? If it comes afterward, then there will be a
nano arms race. In the short run, a bio attack or a few small
nukes is about the worst I would expect. I don't think either
one would drive big money off-planet unless it was a very, very
nasty, well-targeted and well-deployed bio-attack. And then I
doubt you could get sufficient ships and materiel built and
off-planet quickly enough.

> Given a dozen Gates-loads of money and present tech, what
> would you estimate are the chances of successfully crossing to
> the next star and surviving long term? spike
>

Almost zero. Assuming we can actually make the trip at all,
which is by no means currently certain, our long-term survival
would depend tremendously on what is at the other end. If we are
going to live in space indefinitely, I don't believe we know
enough at this time, or can carry enough with us, to pull that
off. Nor do I believe we can take enough with us to set up what
would be needed to build longer-term habitats at the other end.
If we were unfortunate enough to arrive at a system with only a
couple of gas giants and hardly any small bodies, we would be
pretty much screwed. It would make more sense to set up camp in
the asteroid belt of our own system until much more capability
and know-how had been developed and perhaps the situation on
Earth had cooled down a bit.

Best of all, in my humble opinion, is to master enough of the
messy human stuff to greatly lower the odds of massive terrorism
or uncontained regional blow-ups in the first place, and to
raise the odds of having a lot of people on your side who act as
buffers and detectors, and who exact swift vengeance if such an
act is committed. I don't think running away is the answer here.

- samantha



This archive was generated by hypermail 2.1.5 : Sat Nov 02 2002 - 09:15:17 MST