From: Anders Sandberg (asa@nada.kth.se)
Date: Tue Jun 18 2002 - 04:01:02 MDT
On Sun, Jun 16, 2002 at 09:56:06PM -0700, Samantha Atkins wrote:
>
> A fast transformation is certainly more quickly dangerous if we
> assume that a slower transformation is viable. A sufficiently
> slow transformation could have personal consequences, like being
> too slow to ensure or make likely our own personal survival.
This is true. It is, of course, part of what drives us to favor fast
transformation in our discussions. But if we are intellectually honest,
we must distinguish between what is desirable and what is true.
Fast transformation scenarios tend to be very inhomogeneous. A small
subset of the world rushes away, and differences increase exponentially.
This produces disparities that are likely sources of aggression. Slower
transformations have the advantage that the continuity of the
economic/social web is not broken, and self-organizing processes such
as technology diffusion, comparative advantage, and the evolution of
legal systems have time to operate. I don't think there is any reason to
strive for strict equality: there is room for diversity and a vast range
of futures as long as frameworks of mutual coexistence have the chance to
evolve.
> If we do think a slower transformation is required to ensure
> reasonable survivability, what do we do if the technology
> ramp-up looks to be moving faster than that? Do we actually
> advocate policies to slow it down?
I think we need policies to enable better fielding of technologies. These
policies don't have to be top-down laws; they could just as well take the
form of insurance. If you have to pay, through insurance premiums, for the
risks you impose on others, then very risky development will be done more
carefully or moved elsewhere, such as into space. In many cases I think we
actually need to help technology advance more freely rather than faster:
we need a broader range of options to test and choose from. This also
gives us more data on which to base later decisions.
> >I wonder if the singularity really ends the window of vulnerability.
> >Maybe it just remains, giving whatever superintelligences are around
> >nervous tics.
>
> If we go into it with the notion that continued Darwinian
> selection of the most aggressive, fastest, smartest is the way it
> should be, then it is almost inescapable that there is a
> continuing window of vulnerability.
Well, the picture of nature red in tooth and claw is popular, but clearly
it isn't the whole picture. There is a lot of coevolution and
ecosystem-building, little understood at this point. And as Drexler
pointed out in one of his agorics papers, human economic interactions are
far more reciprocally altruistic than anything seen in evolution. Maybe
the trend of pre- and post-singularity societies will instead be in this
direction. That doesn't close the window of vulnerability, but it narrows
the focus onto certain kinds of existential risks, like irrational
players, one-shot mistakes, and malign self-organisation.
-- 
-----------------------------------------------------------------------
Anders Sandberg                                      Towards Ascension!
asa@nada.kth.se                            http://www.nada.kth.se/~asa/
GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y