From: Michael Nielsen (mnielsen@tangelo.phys.unm.edu)
Date: Fri Jun 26 1998 - 20:18:50 MDT
On Fri, 26 Jun 1998, Dan Clemmensen wrote:
> We will have to do better than that just to keep up with historical
> trends in algorithm design. Most folks aren't aware of it, but
> algorithm design historically has improved performance as much as
> hardware improvements.
Algorithm advances are far more problem-dependent than hardware. If you're
talking about sorting an unordered list, for example, not much has
changed in a long while -- virtually all of the advances can be attributed
to hardware improvement. On the other hand, the factoring of large
integers has come on by algorithmic leaps and bounds over the past 25
years; hardware improvement has been a comparatively minor factor in
that progress.
There is an amusing inverse to note here. For sufficiently _hard_
problems, Moore's law makes essentially no difference. The obvious way
to see this is that the cost of Sieve-of-Eratosthenes-based factoring
methods grows exponentially in the bit length of the number, so all the
hardware improvement since the 1970s would only let us factor numbers
10 or 15 bits longer than we could back then.
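To put rough numbers on that, here's a quick back-of-the-envelope sketch
in Python. It's a minimal sketch: the 2^n cost model (a sieve run all the
way up to N) and the roughly 2^12.5 hardware speedup since the early
1970s are ballpark assumptions on my part, not measurements.

import math

def extra_bits(speedup, cost_exponent_per_bit=1.0):
    # If factoring an n-bit number costs about 2**(c * n) operations,
    # then solving 2**(c * (n + d)) = speedup * 2**(c * n) for d gives
    # d = log2(speedup) / c extra bits of reach.
    return math.log2(speedup) / cost_exponent_per_bit

speedup = 2 ** 12.5   # assumed: one doubling every two years for 25 years

# Sieve of Eratosthenes run up to N itself: cost ~ 2**n   (c = 1)
print(extra_bits(speedup, 1.0))   # ~12.5 extra bits

# Trial division only up to sqrt(N): cost ~ 2**(n/2)      (c = 0.5)
print(extra_bits(speedup, 0.5))   # ~25 extra bits

Either way the gain is a dozen or two bits -- negligible next to the
400-odd bits in the 130 digit numbers being factored today.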
The inverse is that if we can factor 130 digit numbers now, using much
better (but still super-polynomial) algorithms, then using those same
algorithms on 1970s computers, we would still be able to factor numbers
of roughly the same size.
In other words, algorithmic advances are virtually _all_ that matter
for really hard problems. For easy problems, it's hardware advances
that count, at least as long as Moore's law holds out.
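To make the contrast concrete, here's one more small sketch comparing how
far a fixed hardware speedup goes under a polynomial cost model (sorting
and its relatives) versus an exponential one (sieve-style factoring). The
10^4 speedup factor is again just an assumed ballpark.

import math

SPEEDUP = 10 ** 4   # assumed hardware gain over ~25 years

# "Easy" problem, cost ~ n**k: the feasible input size gets multiplied
# by roughly SPEEDUP**(1/k).
for k in (1, 2, 3):
    print("cost n^%d: feasible size grows %.0fx" % (k, SPEEDUP ** (1.0 / k)))

# "Hard" problem, cost ~ 2**n: the feasible input size only grows by an
# additive log2(SPEEDUP) bits, no matter how large it already is.
print("cost 2^n: feasible size grows by %.0f bits" % math.log2(SPEEDUP))

A polynomial-cost problem turns the speedup into a multiplicative jump in
the size it can handle; an exponential-cost problem turns the same speedup
into a small additive one.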
Michael Nielsen
http://wwwcas.phys.unm.edu/~mnielsen/index.html