From: Charlie Stross (charlie@antipope.org)
Date: Fri Feb 11 2000 - 03:00:08 MST
On Thu, Feb 10, 2000 at 10:04:59AM -0800, hal@finney.org wrote:
>
> Following up on the crypto angle, there's also a technology called
> "group" or "threshold" signatures, which requires a group of people
> (or some subset) to work together to create a signature. Years ago I
> suggested the ultimate democracy, Anything Boxes which would only create
> objects which were signed by a threshold signature meaning that (say)
> 80% of the human race approved that design.
That's a good take on it. Or just a signature system that's open to
public verification and relies on enough people agreeing that the source
of some new object is clean.
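For concreteness, a minimal sketch of the k-of-n idea underneath such
schemes -- plain Shamir secret sharing over a prime field, not a real
threshold-signature protocol (those combine partial signatures without
ever reconstructing the key). Every name and parameter below is
illustrative only:

# Toy Shamir secret sharing: any k of n shares recover the secret,
# fewer than k reveal nothing.  A stand-in for the "enough people must
# cooperate" property, not a production scheme.
import random

PRIME = 2**127 - 1          # prime modulus, big enough for a demo

def split_secret(secret, n, k):
    """Split `secret` into n shares; any k of them recover it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    shares = []
    for x in range(1, n + 1):
        y = 0
        for c in reversed(coeffs):      # Horner evaluation of the polynomial
            y = (y * x + c) % PRIME
        shares.append((x, y))
    return shares

def recover_secret(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

if __name__ == "__main__":
    key = 123456789                       # stand-in for a signing key
    shares = split_secret(key, n=5, k=3)
    assert recover_secret(shares[:3]) == key      # any 3 of 5 suffice
    assert recover_secret(shares[2:]) == key
    print("3-of-5 reconstruction works")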
Trouble is, how do you make a system that can cope appropriately with a
group like, say, Aum Shinrikyo (membership in the tens of thousands),
who want to immanentize the eschaton by building grey goo, without also
denying a much smaller group like Alcor, who want to build their
resurrect-o-mat(TM) for all the corpsicles in their freezer?
This is a political issue as much as it is a cryptographic one. Access
to nanotechnology fabricators will be, in the next century, as
fundamental an issue as access to the means of production appeared to
be in the 19th century.
> I think Drexler first proposed the idea of the "limited assembler" back
> in Engines, something which would make consumer and industrial devices
> but wouldn't make super-dangerous things. (Obviously even something as
> simple as a baseball bat can be used to club someone, unless it's a pretty
> damn smart bat.) Along with that he proposed a centimeter-sized isolation
> lab where you could build anything you want, but it couldn't get out. This
> is how new designs would be tested and approved.
And a good idea it is -- as long as there are no back doors in the system
other than fail-safe ones. (A back door that lets anyone fry the contents
of a test lab is fail-safe; it's annoying and may cost you money, or make
you vulnerable to denial of service, but it doesn't risk an outbreak. A
back door that lets you get things _out_ of the cell defeats the whole
purpose. And there will almost certainly be at least one idiot out there
who likes the idea of having access to whatever's in their pocket lab,
and who will _try_ to build or obtain a compromised one. Because people,
in the mass, are dumb.)
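A toy model of that fail-safe/fail-deadly distinction, with made-up
names, just to pin it down: the only privileged operation the cell
exposes is destructive, so the worst a compromised caller can do is
cost you the contents.

class IsolationLab:
    """Toy model: a sealed test cell whose only back door is destructive."""

    def __init__(self):
        self._contents = []

    def build(self, design):
        # Anything at all may be assembled *inside* the cell.
        self._contents.append(design)

    def fry(self):
        # Fail-safe back door: worst case is lost work and denial of
        # service, never an outbreak.
        self._contents.clear()

    def export(self, design):
        # A fail-deadly back door would live here; by design there isn't one.
        raise PermissionError("no export path: the cell is sealed")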
However.
I still believe that a mature nanotechnology that follows a closed-source
model is going to be hacked left, right, and centre. The perceived
benefits of such hacking are too high, and the inflexibility of a system
that can only build designs from one source is going to be an irritant
to precisely the kind of low-budget, imaginative amateurs who have
traditionally provided both the best hackers and the seeds of garage
start-up ventures.
Plus, a top-down security model is vulnerable to the top of the pyramid
being hijacked (for example, by an irresponsible government).
-- Charlie