Re: Human intelligences' motivation (Was: Superintelligences' motivation)

From: Jim Legg (income@ihug.co.nz)
Date: Wed Jan 29 1997 - 18:27:27 MST


> think is the most dangerous thing about extropy, transhumanism, nanotech,
> replicators etc. And that is human goals and motivations.
> We don't need advanced AI or IA, but just plain simple exponential growth
> to give extremists and minorities access to weapons of mass destruction.

I think most Uploaders would be happy to simply let the competition die off
naturally. The extremists and minorities who you claim want to use massive
weapons would only do so out of hopelessness, because they haven't found a
better way. Educate them, and while you're at it, what's IA?

Best,

Jim Legg http://homepages.ihug.co.nz/~income
Man * Soul / Computer = 12 ^ (I think therefore I surf)
