Re: SECURITY: Kaaza

From: Robert J. Bradbury (bradbury@aeiveos.com)
Date: Wed Apr 03 2002 - 03:50:32 MST


On Wed, 3 Apr 2002, Samantha Atkins (responding to my concerns
about rogue AIs developing on 'private' DC networks) wrote:

> Why "not good" exactly? Getting to an SI is considered a
> general good, but preferably with more checks and balances
> toward Friendliness.

Though I'm sure you are aware I'm not a 'strong' fan of
the Sysop scenario (with nods towards Eliezer, the
Singularity Institute, et al. if I'm munging fine
distinctions between "friendly" AIs and Sysops),
I think all would agree that an amoral AI could
potentially be *very* bad. (This isn't too different
from Fukuyama's objection that genetic engineering
might create an amoral human.)

> Several million computers without a workable plan of how to
> achieve a full AI and with the latencies implied between
> nodes is not particularly scary to me or likely
> to evolve something dangerous much faster than otherwise.

You may be more optimistic than I am (and that's generally
a hard position to take). I expect that the latencies
will go down (I'm moderately certain that there are
bills circulating in the U.S. for fiber-to-home funding).
It becomes a question of whether a "workable plan" can
be implemented to take advantage of the fact that much
of the required capacity is already installed.

To make it clear -- it looks like BDE has 16.2 million shares
outstanding -- according to the EDGAR 10-KSB:
http://www.edgar-online.com/bin/cobrand/finSys_main.asp?formfilename=0001011438-02-000252&x=29&y=11
At the recent Yahoo price of $0.35 / share (http://finance.yahoo.com/q?s=BDE&d=t),
you could buy the company outright for less than 6 million dollars.
Assuming that Kazaa is installed on a base of 1 million machines
worth $1000 each -- buying out BDE gives you access to $1 billion
of computing capacity.
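
To spell out that back-of-envelope arithmetic, here is a rough
sketch in Python. The share count and quote are the figures cited
above; the one-million-machine install base at $1000 per machine is
the assumption stated in the previous sentence, not a measured number.

    # Back-of-envelope figures quoted in this post; illustrative only.
    shares_outstanding = 16_200_000   # BDE shares per the EDGAR 10-KSB
    price_per_share = 0.35            # recent Yahoo quote, USD

    buyout_cost = shares_outstanding * price_per_share
    print("Approximate buyout cost: $%.1f million" % (buyout_cost / 1e6))
    # -> Approximate buyout cost: $5.7 million

    installed_machines = 1_000_000    # assumed Kazaa install base
    value_per_machine = 1000          # assumed hardware value per node, USD

    reachable_hardware = installed_machines * value_per_machine
    print("Hardware value reachable: $%.1f billion" % (reachable_hardware / 1e9))
    # -> Hardware value reachable: $1.0 billion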

I happen to know people to whom $6 million is chump change
and who might look fondly on the idea of a subservient AI
whose primary purpose is to remake the world the way
they desire it to be.

The point would be that once the license agreement
has been "approved" and the software installed
(some /. comments suggested that you could never
"revoke" the public access to your resources under
the terms of the license agreement), you have sealed
your fate. So the resources are out there now and they
seem likely to increase. The bandwidth and computational
capacity seem to be available for whoever develops
a strategy for using those resources effectively.

More importantly -- you cannot rely on the ethics of
the people who have developed such systems not to use
them for purposes such as developing rogue AIs -- you
have to determine whether or not such systems are
indeed "unhackable". Unless Kazaa/BDE are much
smarter than Microsoft, I would argue that that is
a very questionable assumption.

Who was it that said
 "There is no such thing as a free lunch"?

Robert


