Re: 5,000,000,000 transhumans?

From: Max More (maxmore@primenet.com)
Date: Sun Aug 09 1998 - 13:09:06 MDT


At 01:37 PM 8/9/98 +0200, Den Otter wrote:
>
>Yes, but this is fundamentally different; godhood isn't something
>that one would sell (or give away) like one would do with minor
>technological advances such as phones, TVs, cars, etc. Just like nukes
>were (and are) only for a select few, so will hyperintelligence,
>nanotech, uploading etc. initially be only available to a select
>group, which will most likely use them to become gods. There is
>no rational reason to distribute this kind of power once you have
>it.
>
>Powerful businessmen still need others to make and buy their products,
>and dictators and presidents still need their people to stay in power
>& to keep the country running, but an SI needs NO-ONE; it's
>supremely autonomous. I can't imagine why it would share its
>awesome power with creatures that are horribly primitive from its point
>of view. Would *we* uplift ants/mice/dogs/monkeys to rule the world
>as our equals? I think not.

"No rational reason" is a strong claim. I doubt your claim. First, your
view surely depends on a Singularitarian view that superintelligence will
come all at once, with those achieving it pulling vastly far away from
everyone else. I don't expect things to work out that way. I've explained
some of my thinking in the upcoming Singularity feature that Robin Hanson
is putting together for Extropy Online.

Second, I also doubt that the superintelligence scenario is so radically
different from today's powerful business people. [I don't say "businessmen"
since this promotes an unfortunate assumption about gender and business.]
You could just as well say that today's extremely wealthy and powerful
business people should have no need to benefit poor people. Yet here we have
oil companies building hospitals and providing income in central Africa. I
just don't buy the idea that each single SI will do everything alone.
Specialization and division of labor will still apply. Some SIs will want
to help the poor humans upgrade, because that will mean adding to the pool
of superintelligences with different points of view and different
interests.

Let me put it this way: I'm pretty sure your view is incorrect, because I
expect to be one of the first superintelligences, and I intend to uplift
others. Or are you planning to try to stop me from bringing new members
into the elite club of SIs?

> In any case, we
>should all work hard to be among the first SIs, that's the only
>reasonably sure way to live long and prosper.

No disagreement there. Make money, invest it, and keep on integrating
advances as they happen.

Max
 
--------------------------------------------------------------------------
Max More, Ph.D.
more@extropy.org (soon also: <max@maxmore.com>)

http://www.primenet.com/~maxmore
Consulting services on the impact of advanced technologies
President, Extropy Institute:
exi-info@extropy.org, http://www.extropy.org
--------------------------------------------------------------------------


