From: John Marlow (johnmarrek@yahoo.com)
Date: Sat Jan 06 2001 - 01:00:52 MST
Oh, this is beautiful: how to make nanotechnology even harder! I agree that such precautions could be taken. Will they be taken by all parties? Not likely. Limiting replicators in such ways will drastically reduce their usefulness.
I also agree that nanotechnological weapons MAY be the primary threat. After all--a weapon which is not natural-environment-capable is useless. Weapons will be designed to disassemble ALL matter, and they will be able to operate in ALL environments. This is why I used weapons as an example.
To be truthful, though, I really can't see how AIs will improve things on the weapons front.

john marlow
Eliezer S. Yudkowsky wrote:
The problem of not accidentally releasing a natural-environment-capable replicator can be solved very easily by never designing a natural-environment-capable replicator, and, of course, making very, very sure that replicators don't have the capability to mutate. Build replicators that incorporate yttrium and boron and can only reproduce in high vacuum at eighty degrees Kelvin using broadcast power and broadcast information, and an accidental spill won't make a difference. Encrypting the reproduction information - if you have it on-board, which is itself a mistake - is also easy; what you'd have to watch out for would be prions, structural deformations that result in similar structural deformations in offspring.
It's nanotechnological warfare, not the sheer stupidity required for an accidental error, that imposes the time limit on us AIfolk.
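[Editor's note: the quoted design amounts to gating every replication cycle on conditions a natural environment cannot supply, plus an integrity check on externally broadcast instructions. Below is a minimal Python sketch of that gating logic, purely illustrative: the pressure and temperature thresholds, the function names, and the use of an HMAC over the broadcast instruction stream are all assumptions of mine, not anything specified in the original posts. The HMAC check stands in for the post's "encrypting the reproduction information" and addresses the prion worry by refusing to act on deformed instructions.]

import hashlib
import hmac

# Illustrative thresholds only, loosely following the quoted design:
# replication is permitted solely in high vacuum near eighty kelvin,
# with power and instructions supplied externally ("broadcast").
MAX_PRESSURE_PA = 1e-6            # rough high-vacuum cutoff (assumption)
TARGET_TEMP_K = 80.0              # per the quoted post
TEMP_TOLERANCE_K = 2.0            # arbitrary tolerance for this sketch

BROADCAST_KEY = b"shared-secret"  # placeholder; real key management omitted


def environment_permits_replication(pressure_pa, temp_k, broadcast_power_on):
    # Outside these conditions the replication cycle simply cannot run,
    # so an accidental spill into a natural environment does nothing.
    return (pressure_pa <= MAX_PRESSURE_PA
            and abs(temp_k - TARGET_TEMP_K) <= TEMP_TOLERANCE_K
            and broadcast_power_on)


def instructions_authentic(payload, tag):
    # Reject corrupted or mutated instruction broadcasts -- the analogue
    # of the "prion" worry: any structural change to the payload
    # invalidates its authentication tag, so deformed instructions are
    # never executed or passed on to offspring.
    expected = hmac.new(BROADCAST_KEY, payload, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)


def replication_step(pressure_pa, temp_k, power_on, payload, tag):
    if not environment_permits_replication(pressure_pa, temp_k, power_on):
        return "inert"        # wrong environment: no action at all
    if not instructions_authentic(payload, tag):
        return "halt"         # garbled broadcast: refuse to build
    return "replicate"        # all gates passed

[Note that in this sketch nothing is stored on-board, matching the post's point that on-board reproduction information is itself a mistake: the MAC only guards the broadcast channel, and a device cut off from the broadcast has nothing to mutate.]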