And I recall CarlF’s fairly recent comments about problems with the
spiraling costs of silicon circuitry fabrication facilities. However,
the cover of the current issue (971006) of Forbes magazine roars about
"Life at 100,000,000,000 Bits Per Second" and "Photonics is the next
stage of the digital revolution."
Now, I don’t find the concept of technological Singularities very
useful (except in the sense that Anders Sandberg recently defined
them); but Howard Banks’ Forbes article, "The Law of the Photon"
starts off as follows: "Moore’s Law said that chip power would double
every 18 months. That’s plodding. The new law of the photon says that
bandwidth triples every year" and continues: "Making all this possible
is photonics . . . There is no official name for the law that says how
fast this science will carry us into the next century. We could,
however, call it Payne’s Law, in honor of David Payne, a 53-year-old
physicist at Britain’s University of Southampton. Payne is perhaps the
leading scientist behind two key inventions in photonics over the past
decade and a half." These are listed as the optical fiber amplifier
and wavelength-division multiplexing.
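To put the two growth laws side by side, here is a quick
back-of-the-envelope sketch (my own illustrative Python, not anything
from the Forbes piece) comparing capacity that doubles every 18 months
with bandwidth that triples every year:

    # Illustrative comparison of the two growth laws quoted above.
    # Assumption: "chip power doubles every 18 months" and
    # "bandwidth triples every year", compounded smoothly.

    def chip_growth(years):
        """Growth factor if capacity doubles every 18 months."""
        return 2 ** (years / 1.5)

    def bandwidth_growth(years):
        """Growth factor if bandwidth triples every 12 months."""
        return 3 ** years

    for years in (5, 10, 20):
        print(f"{years:2d} yrs: chips x{chip_growth(years):,.0f}, "
              f"bandwidth x{bandwidth_growth(years):,.0f}")

    #  5 yrs: chips x10,     bandwidth x243
    # 10 yrs: chips x102,    bandwidth x59,049
    # 20 yrs: chips x10,321, bandwidth x3,486,784,401

Even over a single decade the gap between the two curves is roughly
three orders of magnitude, which is the point the article is making.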
Hal concluded:
< But I am always suspicious of that 20-year prediction horizon. We
can guess what will happen technologically in the next ten years, but
beyond 20 we really have no idea. "Here there be dragons," and we are
inclined to put our wonders safely in the 20-30 year period. In
practice though things often take much longer than we expect. >
And Brian Atkins replied:
< For anything related to software engineering (AI), I agree with you.
Unless we can come up with ways to reduce the complexity of software
development, things there will get slower and slower. Hardware on the
other hand should continue to grow for at least another 20 years even
with the technologies we have today. The question is: does
super-hardware without AI software bring about a singularity? Or will
the super-hardware make the effort of programming an AI possible? >
I have long felt that communication is just as important as
computation to the development of AI / SI / Uploads / Nanotech. I
don’t expect a capital-S Singularity, but I do expect things to get
VERY interesting at an increasing rate of change. While our
computational capabilities have been expanding according to Moore’s
Law, bandwidth, and the human ability to utilize it, have not been
keeping pace. The Web, after all, is only a few years old. One
promising approach to finally achieving ‘real’ AI is the recent
emphasis on agent-based computing along with even more recent attempts
to create software for the emerging Global Brain (cf.
http://www.cpm.mmu.ac.uk/~majordomo/gbrain ).
Mark Crosby