Re: White Power Transhumanism?

From: Mike Lorrey (mlorrey@datamann.com)
Date: Wed Apr 10 2002 - 10:53:54 MDT


Rüdiger Koch wrote:
>
> Hmmm. The homepage has a symbol that looks like a variation of the Nazi
> symbol to me.

It's actually a twist on the yin/yang symbol, only with three commas
instead of just two. But I suppose if you stare at the clouds long
enough, one person can see Hitler in the same cloud where others see
Gandhi.

> They also seem to have a very elitist view on things:
> <QUOTE>
> So what's the moral of the story here? Well, make sure that you're one of the
> first Powers, obviously, but more on that later.
> </QUOTE>
>
> But that quote raises an interesting question I had never thought about before:
>
> Could the Singularity happen as a paranoid arms race of augmentation devices,
> where everybody tries to stay on top of the development because all those who
> fall behind will not perform well enough to make enough money to reach the
> next level? And whoever falls behind too much will simply be trampled down?
>
> Hope you have some good points against that!

The only reason to consider the first Powers a problem is if one has an
inherently high level of distrust of one's fellow man and a pessimistic
view of humanity in general. That isn't an extropic POV, but it is
rather common, especially among Europeans. And considering the large
disparity in capability an emergent SI would have compared to a
baseline human, it might not be terribly unreasonable to seek to insure
against inimical personalities attaining such power. Eli's Friendly AI
project is one answer to such a threat.

However, consider the situation logically: is humane behavior
inherently rational, or is it irrational? If it is inherently rational
(and judging by the politics of individuals relative to their
intelligence, it is), then we really have nothing to fear from SI,
unless we are inhumane ourselves and fear being told by an SI to behave
ourselves.

It is for this reason that I consider most luddites who oppose AI and
SI development to have inherently fascist agendas. They wouldn't fear
these technologies if they didn't fear being told to behave themselves,
to stop coercing others.
