From: Hungry Troll (hungrytroll1234@hotmail.com)
Date: Mon Sep 16 2002 - 02:06:57 MDT
I think the idea behind the "singularity" is that at some point we will
build a machine smart enough to design and build smarter machines, which
will in turn be capable of designing and building smarter machines, and
so on. Armed with a powerful enough manufacturing capability, this
process could indeed be quite rapid.
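To make the "quite rapid" part concrete, here is a toy sketch of my own
(nothing from any particular source): assume "capability" is a single
number and each generation of machine designs a successor some fixed
fraction more capable. The function name and the 10% improvement rate
are just illustrative assumptions, but even that modest compounding
blows past any fixed threshold in a relatively small number of cycles.

    # Toy model of recursive self-improvement (purely illustrative).
    # Assumption: each generation designs a successor that is
    # `improvement` times as capable as itself.
    def generations_until(threshold, capability=1.0, improvement=1.1):
        """Count design cycles needed for capability to exceed threshold."""
        count = 0
        while capability < threshold:
            capability *= improvement
            count += 1
        return count

    if __name__ == "__main__":
        # With a 10% gain per cycle, a millionfold jump takes ~145 cycles.
        print(generations_until(1_000_000))

And if each successive machine also designs faster than the last, the
wall-clock time per cycle shrinks as well, which is what makes the
process look explosive rather than merely exponential.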
In theory, if this does happen, it would almost certainly threaten life,
liberty, happiness, and almost anything else you can think of. Should the
machines described above decide to simply wipe out all human beings,
there would be little standing in their way. It might be best to proceed
with a bit of caution.
---- This message was posted by Hungry Troll to the Extropians 2002 board on ExI BBS. <http://www.extropy.org/bbs/index.php?board=61;action=display;threadid=53176>
This archive was generated by hypermail 2.1.5 : Sat Nov 02 2002 - 09:17:04 MST