From: Max More (max@maxmore.com)
Date: Tue May 08 2001 - 11:02:44 MDT
At 02:02 AM 5/8/01, Anders wrote:
>I consider Warwick's speculations combined with his media savvy a quite
>worrying threat to transhumanism. The risk is that he manages to do
>spectacular stunts, drawing attention to his ideas about the imminent
>cybercalypse, helping spread an anti-AI sentiment ("terrestrialism" in the
>terms of Hugo de Garis, who is doing nearly the same thing). The problem here
>is that he has not really given much thought to likely development paths and
>forms of human-AI interaction. This is why the E5 panels about this issue are
>so important: we had better do the analysis and start a more serious debate
>on how to make AI social than Professor Warwick does.
Anders, I agree completely about the disturbing message that Warwick is
pushing (without, as far as I have seen, any hint of alternative possibilities).
The Saturday morning Extro-5 discussions *will* be important in countering
this. Beyond speaking with the press present at the event, we need to put
together a follow-up statement (on this and other issues) presenting
alternative views.
The Extro-5 presenters will not all be in agreement, but any announcement
of our discussions can show alternatives to the nightmare
human-annihilation scenario that Warwick, de Garis, and some others are
presenting as the only picture.
Onward!
Max
_______________________________________________________
Max More, Ph.D.
max@maxmore.com or more@extropy.org
http://www.maxmore.com
President, Extropy Institute. http://www.extropy.org
Senior Content Architect, ManyWorlds Inc.: http://www.manyworlds.com
Chair, Extro-5: Shaping Things to Come, http://www.extropy.org/ex5/extro5.htm
_______________________________________________________