Re: brains in bahrain

From: Charles Hixson (charleshixsn@earthlink.net)
Date: Mon Oct 14 2002 - 11:09:28 MDT


spike66 wrote:

> ...
> We have been imagining this for years.
>
> The more interesting question may be with regard
> to Eliezer's notions of inherent latency of a network.
> How much does it hurt the virtual supercomputer to need
> to distribute packets of information? Will a meganode
> gigahz virtual supercomputer outperform a traditional
> kilo-processor Deep Blue style supercomputer? Would
> it not be more entertaining to play supercomputer vs
> virtual metasupercomputer?
>
> spike

This is actually pretty straightforward. Whenever it's feasible, you
save over 30% by running on a single processor. As you increase the
number of processors, the overhead goes through the roof, so you
probably want to limit each cluster to 8-16 processors. What you do
to improve things at that point is give up on fine-level forking and
instead chunk your clusters: you have, say, 8-16 clusters chunked
into a node. The clusters only transmit relatively high-level
inquiries and commands between themselves. One of them is dedicated
to extracting pieces large enough to practically parcel out to
another cluster, etc. This simple 8-16-way branching tree is an
oversimplification, of course, but that's the basic approach.
Actually, you will want to construct short-cuts between some
clusters, and some clusters will dedicate themselves to message
forwarding, etc. (If you think about it, it's starting to look more
and more like the way the brain is organized... I wonder why.)
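The branching-tree layout above could be sketched roughly as follows.
This is a minimal illustration only: the `Cluster` class, `MAX_FANOUT`
limit, and `distribute` function are invented names, and the "work" is
just summing a list locally, standing in for whatever coarse-grained
chunks a real system would parcel out.

```python
MAX_FANOUT = 16  # keep each cluster to 8-16 members, per the text

class Cluster:
    """One cluster in the branching tree; leaves do work locally."""
    def __init__(self, name):
        self.name = name
        self.children = []  # sub-clusters reachable from this one

    def add(self, child):
        if len(self.children) >= MAX_FANOUT:
            raise ValueError("cluster full; chunk into a new node instead")
        self.children.append(child)

def distribute(cluster, task, min_chunk=4):
    """Recursively parcel out a task, but only in pieces large enough
    to be worth shipping to another cluster."""
    if not cluster.children or len(task) <= min_chunk:
        return [sum(task)]  # piece too small to forward: do it here
    # split the task into coarse chunks, one per sub-cluster where possible
    n = len(cluster.children)
    step = max(min_chunk, len(task) // n)
    chunks = [task[i:i + step] for i in range(0, len(task), step)]
    results = []
    for i, chunk in enumerate(chunks):
        # round-robin any leftover chunks back onto the sub-clusters
        results.extend(distribute(cluster.children[i % n], chunk, min_chunk))
    return results
```

Only the coarse chunks cross cluster boundaries; the fine-level
forking stays inside each leaf, which is the point of the design.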

The tricky part is the fine-level details, and the software that can
properly chunk and distribute problems. I would bet that such
software would need to be self-organizing: able to reconfigure itself
to handle the various nodes it ended up on, and able to determine the
next goal to be addressed and then act on it. Conscious? Intelligent?
I haven't heard good working definitions of those terms yet, so I
don't know.

Notice that this description is given as if the program were running
on bare hardware. I, personally, don't consider that a likely
scenario. I suspect that it will be running on top of one of the
open-source OSes. Linux is a good choice, but there are others. The
HURD would be very interesting in this connection, if it finally gets
ready. Or it might be designed so that it could run on any POSIX
system; that would let it migrate easily. But the version of the OS
to follow is the embedded version, not the popular distributions.

Anyway, this architecture would allow modules to participate whether
their connections to the rest were quick or slow, and whatever their
level of computational capability. But it might decide to give low
priority to queries from nodes that didn't contribute much. And I'm
not quite sure how it would manifest a "sense of self". It might well
be partially dependent on the quality of sensory capability available
at each node, as cognition doesn't seem to carry a "place" marker.
If so, this program would have a rather dispersed sense of self,
possibly similar to what people feel for their family. The parts
would share most of their code, but many of their experiences would
be different. The distinction here is that it would be possible to
make perfect copies of memories of experience from one node to
another. The question might be: why bother? The experiences of a
baby-minder wouldn't have much relevance to an international
telecommunications hub. So they would grow apart.
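The "low priority for nodes that didn't contribute much" idea could
be as simple as a contribution-weighted query queue. Again a
hypothetical sketch: the class name, the notion of a numeric
contribution score, and the node names are all invented here for
illustration.

```python
import heapq

class QueryQueue:
    """Serve queries from high-contribution nodes first."""
    def __init__(self):
        self._heap = []
        self._seq = 0  # tie-breaker so equal scores stay FIFO

    def submit(self, node, contribution, query):
        # heapq is a min-heap, so negate: higher contribution pops first
        heapq.heappush(self._heap, (-contribution, self._seq, node, query))
        self._seq += 1

    def next_query(self):
        _, _, node, query = heapq.heappop(self._heap)
        return node, query
```

A freeloading node's queries still get answered eventually; they just
wait behind everyone who has been pulling their weight.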

-- 
-- Charles Hixson
Gnu software that is free,
The best is yet to be.


This archive was generated by hypermail 2.1.5 : Sat Nov 02 2002 - 09:17:32 MST