From: Dan Clemmensen (dgc@cox.rr.com)
Date: Wed Feb 13 2002 - 18:06:03 MST
Eugene Leitl wrote:
> Dan, did my reply make any sense?
>
> On Mon, 11 Feb 2002, Dan Clemmensen wrote:
>
>
>>I mentioned that the best routing algorithms, when compared to primitive
>>routing algorithms, can at best double the utilization of a large data
>>network. A shy lurker e-mailed me privately and asked if there is an
>>analogy with other algorithms, and in particular with AI algorithms. I
>>think that this is a good question.
>>
>>Basically, to a first approximation, you can get the same performance
>>by over-engineering a data network by a factor of two as you can get
>>with the most clever routing algorithm (in practice, not in theory.)
>>
>>This leads to two questions:
>> 1) Is any part of the AI problem isomorphic to the routing problem?
>> 2) What parts of the AI problem, if any, are amenable to brute-force
>>over-engineering, and what is a good guesstimate of the "cleverness
>>factor" of a clever algorithm versus a brute-force algorithm? The
>>cleverness factor for routing is a simple factor of 2.
>>
>>As an aside, I mentioned to our "shy lurker" that this is exactly the
>>kind of question we like to see posted onto the list. I hope our lurker
>>will begin posting.
>>
It made so much sense that I didn't bother to reply. :-)
If you control the physical network topology, you can improve the
routing dramatically, of course. The Internet is anarchy, and its
routing is designed and implemented chaotically. It still manages
to function fairly well at 50% utilization, so my argument stands.
We all know that we can get from 50% to not more than 90% with clever
routing, so routing is not an interesting accelerator. To become more
efficient, we need to employ tricks like caching and task migration
that minimize the need for bandwidth. Again, this mostly requires the
net to be under more centralized control.
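To put rough numbers on that, here is a toy back-of-envelope sketch in
Python. The 100 Gb/s of installed capacity is a made-up figure; the 50%
and 90% safe-utilization limits are the ones above.

    # Back-of-envelope comparison (made-up capacity figure): how much traffic
    # a network can carry before congestion, under primitive routing, clever
    # routing, and primitive routing on links over-engineered by a factor of two.

    def usable_capacity(installed_gbps, max_safe_utilization):
        # Traffic the network can carry before it starts to melt down.
        return installed_gbps * max_safe_utilization

    naive     = usable_capacity(100, 0.50)   # primitive routing, 100 Gb/s installed
    clever    = usable_capacity(100, 0.90)   # best-case clever routing, same links
    overbuilt = usable_capacity(200, 0.50)   # primitive routing, links doubled

    print("naive:     %5.0f Gb/s usable" % naive)
    print("clever:    %5.0f Gb/s usable (%.1fx gain)" % (clever, clever / naive))
    print("overbuilt: %5.0f Gb/s usable (%.1fx gain)" % (overbuilt, overbuilt / naive))

Under those assumptions, the cleverest routing buys roughly 1.8x over the
naive case, while simply over-building the links buys 2x, which is the
sense in which the cleverness factor for routing is a simple factor of 2.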
I'm attracted to the Internet as an analogy for emergent behavior
precisely because of its decentralized control. The Internet is a
collection of independent networks that use frighteningly informal agreements
about how to route traffic. They base these agreements on poorly
specified protocols that are loosely documented by an informal
organization known as the IETF. Each of the networks is cobbled together
from routers from vendors that don't particularly like each other.
The networks frequently use multi-layer "overlay" techniques such as IP
over ATM wherein the layers react to each other in unforeseen ways.
This entire mess manages to provide good service to most of the users
most of the time while growing at a phenomenal rate. The growth of the
Internet is not constrained by this chaos. It is constrained by the
"last mile." The last mile is provided by entities (mostly telephone
companies) that adhere to a more structured, controlled, and centralized
model.
How does this relate to progress in AI? Well, I hope it demonstrates that
there is no need for central planning. Instead, the Internet is the most
complex, most expensive, biggest, and fastest-growing entity humanity
has ever created, and we did it by accident and without central control.