Parallel vs Serial Computing [was Re: Jaron Lanier Got Up My Shnoz on AI]

From: Ken Clements (Ken@Innovation-On-Demand.com)
Date: Wed Jan 16 2002 - 15:13:26 MST


This is mostly a non-issue from a scientific standpoint, but is strongly
influenced by economic and manufacturing technology constraints. Back in the
days when the fastest computers were hand-made by the Ladies of Chippewa Falls,
all computing was expensive and it was generally true that a computer with a
price tag five times higher was more than five times faster. Thus, it was
better to get the faster serial machine than to try to put together groups of
slower machines.
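
To make that economics concrete, here is a toy back-of-the-envelope
calculation in Python. Every number in it is invented for illustration; it
just assumes the super-linear price/performance scaling described above,
plus some overhead for coordinating the slower machines.

    # Toy comparison: one fast serial machine vs. five cheap ones working
    # together. All numbers are invented for illustration.
    cheap_price = 1.0   # arbitrary cost units
    cheap_speed = 1.0   # arbitrary work units per second

    # In that era, five times the price bought MORE than five times the
    # speed.
    fast_price = 5 * cheap_price
    fast_speed = 7 * cheap_speed

    # Five cheap machines never deliver a full 5x because of coordination
    # overhead (assume 80% parallel efficiency here).
    group_speed = 5 * cheap_speed * 0.8

    print("fast serial :", fast_speed / fast_price, "work/sec per cost unit")
    print("5-node group:", group_speed / (5 * cheap_price),
          "work/sec per cost unit")
    # The serial machine wins on both raw speed and speed per dollar,
    # which is why buying the faster serial box made sense back then.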

Things changed when the single chip beat anything you could put together from
logic parts, in both speed and cost. So now, at a price less than most cars,
you can buy the fastest processor that can be made. Having reached this point,
it would be better to have one processor five times faster than to lash
together five of these, but there is no such thing, so to get the performance
advance one has to go to parallel arrays or clusters.
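
As a minimal sketch of what going parallel looks like from the software
side (my example, not anything from the original post), here is Python code
that splits an embarrassingly parallel job across the local cores; the
function heavy_work is a made-up stand-in for any CPU-bound task.

    from multiprocessing import Pool
    import math

    def heavy_work(n: int) -> float:
        # Stand-in for a CPU-bound task; any per-item computation works.
        return sum(math.sqrt(i) for i in range(n))

    if __name__ == "__main__":
        jobs = [2_000_000] * 8  # eight independent chunks of work

        # Serial: one processor grinds through every chunk in turn.
        serial = [heavy_work(n) for n in jobs]

        # Parallel: the same chunks spread across all available cores,
        # the software analogue of a small cluster.
        with Pool() as pool:
            parallel = pool.map(heavy_work, jobs)

        assert serial == parallel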

I usually advise folks working on research to pick a problem that will fit on
today's single processor. Then, after spending a couple of years getting to
understand what they are doing, buy the next computer. IMHO a current P4 box
has the processing power to do significant AI, if we just knew how to write, or
grow, the code. As John C. pointed out, if you want it to be self-aware, be
sure to put in that code module.

Special-purpose hardware for computing has usually been an economic failure.
This is because the exponential increase in general-purpose computing power
lets folks do in software today what special hardware did two years ago.
Over the years I have watched this happen in several areas, but the first time
was back in the early 70's. I was working at a company that did image
processing, and had developed a special parallel logic processor that could do
high-speed operations on pictures; it was called the Binary Image Processor, or
BIP. The BIP was the size of a refrigerator, was made of first-generation
high-power TTL, and cost a great deal of money to develop and build. It was quite
useful in optical character recognition, which is where the company expected to
get a market. (We liked it because you could run 100 x 100 Conway's Life on it
and watch it in fast real time.)

However, to use the BIP you needed a PDP-10 to talk to it, plus the associated
scanners and displays, so it was all very expensive. One day, a guy came over
from marketing and showed us a printout from a potential customer: someone had
written a BIP simulation from our published papers, and in software, their
CDC-7600 could out-BIP the BIP. (sigh)
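
For flavor, here is roughly what such a software simulation of a parallel
image-logic machine boils down to: one generation of Conway's Life on a
100 x 100 grid. This is a sketch in modern Python with NumPy, nothing
period-accurate, and the wrap-around boundary is my own assumption.

    import numpy as np

    def life_step(grid):
        # Count the eight neighbors of every cell by summing shifted
        # copies of the whole grid (wrapping around at the edges).
        neighbors = sum(
            np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
            for dy in (-1, 0, 1) for dx in (-1, 0, 1)
            if (dy, dx) != (0, 0)
        )
        # A live cell survives with 2 or 3 neighbors; a dead cell is
        # born with exactly 3.
        return ((neighbors == 3) |
                ((grid == 1) & (neighbors == 2))).astype(grid.dtype)

    # 100 x 100 random starting grid, like the demos run on the BIP.
    rng = np.random.default_rng(0)
    grid = rng.integers(0, 2, size=(100, 100), dtype=np.uint8)
    for _ in range(100):
        grid = life_step(grid)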

I started out in hardware, and built circuits with some of the first
commercially available transistors, then ICs, then microprocessors, but I have
learned that software is always cleaning up behind you, and if you can't keep
ahead, it will remove your economic reason to be. The only things I know you
can do in hardware that cannot be done algorithmically are generating a truly
random number and using quantum entanglement. But in the future, I expect these
to be common processor functions, so the software will just have to ask for
them.
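
As a minimal sketch of what "just asking" can look like from the software
side, here is Python code that reads from the operating system's entropy
pool, which can be fed by hardware randomness sources; the specific calls
are my choice of illustration.

    import os
    import secrets

    # os.urandom reads from the OS entropy pool; the software just asks
    # for the bytes rather than implementing the source itself.
    raw = os.urandom(16)

    # The secrets module wraps the same source for security-sensitive
    # uses such as tokens and keys.
    token = secrets.token_hex(16)

    print(raw.hex())
    print(token)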

-Ken


