RE: Fluffy bunnies and the FAI

From: Smigrodzki, Rafal (SmigrodzkiR@msx.upmc.edu)
Date: Thu Jun 13 2002 - 08:39:39 MDT


Eugen Leitl [mailto:eugen@leitl.org] wrote:

I don't have time for a proper analysis, but two major factors are
outlawing insecure systems (by making vendors liable) and breaking up
present monopolies.

### Yes and no. A breakup of monopolies is likely to produce the creative
chaos out of which the brightest inventions spring. If anything, you would
need to *enforce* certain monopolies.

-------

 Infrastructure for realtime traffic
analysis and hard network partitioning (sandboxing, watchdog cleansing,
strong cryptographic authentication, separate shutdown circuitry) needs to
be created.

### Agreed. But most AI researchers say they want to develop AI locally,
rather than networked. Measures affecting the net will not slow down SAI
emergence, only its spread (and only minimally), by which time it's too late
anyway.
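
For concreteness, one of the measures Eugen lists - strong cryptographic
authentication - might look roughly like this at the message level. This is
only a minimal sketch using Python's standard hmac module; the key, the
packet contents, and the function names are purely illustrative:

  import hmac, hashlib, os

  # Illustrative only: a shared secret standing in for whatever key
  # infrastructure a real authentication scheme would actually use.
  key = os.urandom(32)

  def sign(message: bytes) -> bytes:
      # Compute an HMAC-SHA256 tag over the message.
      return hmac.new(key, message, hashlib.sha256).digest()

  def verify(message: bytes, tag: bytes) -> bool:
      # Constant-time comparison against a freshly computed tag.
      return hmac.compare_digest(sign(message), tag)

  packet = b"shutdown node"
  tag = sign(packet)
  assert verify(packet, tag)           # authentic packet accepted
  assert not verify(b"tampered", tag)  # forged packet rejected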

-------

 Research in open literature and hardware purchases need to be
tracked. Competent AI researchers need to be tracked.

### I do not underestimate governmental ability to perform such actions.
Information might want to be free, say the digerati, but most of them have
never seen a real badass dictator, except on TV. You seem to imply that a
very intrusive and comprehensive control regime needs to be initiated,
something much more than the silly little cryptography export prohibitions
which didn't work. Usually, such control is instituted at a significant
economic and societal cost. I doubt that the world's governments will act
quickly enough (within 5 years) and decisively enough (formation of a global
government, willingness to militarily destroy dissenting nations, full
surveillance of all sufficiently advanced computer tech). They could, but
they won't, because they are not scared enough. Convincing governments to do
it will only be possible after it's too late, when an SAI announces that they
are not governments anymore. You are even less likely to gain support for the
uploading of humanity before this event.

-------

If you think a loner can pull it off on a chunk of computronium while
leaving no traces, you're being wildly optimistic. The risk is about the
same as that of a truly successful (Gigadeaths) loner bioterrorist.

### Well, for now there is no risk, but in 2020, when the desktop PC equals
the hardware in a human brain and you can download from the net scads of
modules able to perform all kinds of tasks at the low and intermediate levels
of cognition, even a bunch of garden-variety religious nuts might be able to
cobble together a god. By that time you'd need worldwide surveillance of all
PCs and all programmers, something that would require unification with China
and Russia, or a nuclear war with them. I say you need political unification,
or else the competitive forces between polities will ensure that well-funded
military labs build their superhuman warriors long before the small-fry
terrorists get their act together.
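
(For what it's worth, the 2020 figure is the kind of number a crude
Moore's-law extrapolation gives you. A rough sketch, assuming about 10^10
ops/s for a 2002 desktop, a Moravec-style estimate of ~10^14 ops/s for
brain-equivalent hardware, and an 18-month doubling time - all three numbers
are assumptions, not anything claimed above:

  import math

  desktop_2002_ops = 1e10   # assumed ops/s for a high-end 2002 desktop
  brain_ops        = 1e14   # assumed brain-equivalent ops/s (Moravec-style)
  doubling_years   = 1.5    # assumed Moore's-law doubling time

  doublings = math.log2(brain_ops / desktop_2002_ops)  # ~13.3 doublings
  year = 2002 + doublings * doubling_years             # ~2022
  print(round(year))

which lands within a couple of years of the date above.)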

Is this the price you'd like to pay?

Rafal

PS. I wish you were right. I want to upload now, but I am convinced it won't
happen this side of the SAI emergence.


