Small Talk: How small can you get?

From: J. R. Molloy (jr@shasta.com)
Date: Sun Apr 01 2001 - 11:54:43 MDT


Are there really hard limits to small-scale structure? --J. R.

----- Original Message -----
From: "Robert J. Bradbury" <bradbury@aeiveos.com>
To: <extropians@extropy.org>
Sent: Thursday, March 15, 2001 3:48 PM
> Given my dislike of anything that smells like magic physics, of a kind
> for which we have no known pre-existing examples of engineering, I will
> suggest deferring exploration of that realm, as a useful exercise, until
> all the other possible approaches have been exhausted.
> [I will note that Anders usually confines himself to the known realm
> as well, but does from time to time stray into speculations in these
> areas... But I suppose this is to be expected from individuals who
> spend their spare time developing role-playing games...]

PROGRAMMABLE BLACK HOLE COMPUTERS. One usually
thinks of a black hole as an omnivorous object swallowing energy
and spitting some of it back in the form of Hawking evaporation
radiation, consisting of particles created in pairs out of the vacuum
near the edge of the black hole. In principle, a tiny black hole can
be formed in a way that encodes instructions for performing
calculations. Correspondingly, the answers could be read out from
the escaping Hawking radiation. Why use a black hole at all?
Because of the presumed tremendous density of information and
potential processing speed implicit in the extreme black hole
environment. Seth Lloyd of MIT has previously addressed himself
to calculating the conceivable limits on the computing power of
such a black hole computer (Nature, 31 August 2000) and arrives
at a maximum processing speed of about 10^51 operations/sec for
a 1-kg black hole. Now Jack Ng of the University of North
Carolina (yjng@physics.unc.edu, 919-962-7208) extends this study
by asking whether the very foaminess of spacetime, thought to
arise at the level of 10^-35 m, provides an alternative way to limit
theoretical computation. Ng not only finds that it does but that the
foaminess of spacetime leads to an uncertainty in timekeeping (the
more accurate the clock, the shorter its
lifetime) which in turn leads to a bound on information processing
(speed and memory simultaneously) analogous to the Heisenberg
bound on simultaneous measurement of momentum and position.
These limits are so generous that they normally pose little problem
for ordinary physical measurements, but in the case of a black hole
computer the limits would apply immediately. Ng adds, apropos
of detecting gravity waves with LIGO and other interferometric
devices, that in addition to accounting for various forms of noise,
such as seismic disturbances or thermal noise in the detectors, the
faint gurgle of spacetime foam will eventually have to be included
as an additional and unavoidable source of noise in the
measurement of very short displacements (the movement of
mirrors owing to the flexings of spacetime brought about by
passing gravity waves). If Ng is right, the noise sensitivity
achievable by the prospective advanced phase of LIGO will only
need a further hundredfold enhancement in order to detect the
quantum foam, which is to say the very fabric of spacetime. Thus
the Planck scale, so far only a hypothetical extreme regime, might
eventually become a realm that can be approached and measured.
(Physical Review Letters, 2 April 2001.)
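
As a quick sanity check on Lloyd's figure: the ~10^51 operations/sec follows
directly from the Margolus-Levitin theorem applied to the rest energy of one
kilogram of matter. Here is a minimal sketch of that arithmetic (my own
illustrative Python, not from either paper; the 1-kg mass is the only input):

import math

HBAR = 1.0546e-34   # reduced Planck constant, J*s
C = 2.998e8         # speed of light, m/s

def max_ops_per_second(mass_kg):
    # Margolus-Levitin bound: a system of average energy E can perform at
    # most 2*E / (pi * hbar) elementary operations per second.
    energy = mass_kg * C**2          # rest energy, E = m*c^2
    return 2 * energy / (math.pi * HBAR)

print("%.1e ops/sec" % max_ops_per_second(1.0))   # ~5e50, i.e. of order 10^51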

> > No matter how small we imagine units of space can get, they
> > can always get smaller. NT is just the beginning of the universe
> > of the infinitesimal.
>
> Nay, I must be the Knight who says Ni! As Anders documents in
> his paper (see below), the Bekenstein (Bremermann) bounds set
> hard limits on how small you can get. It is however *very* small.
> Refs:
> http://hypertextbook.com/facts/MichaelPhillip-JudyDong.shtml
> http://hypertextbook.com/facts/YelenaMeskina.shtml
> http://www.lesk.com/mlesk/ksg97/ksg.html
> http://www.transhumanist.com/volume5/Brains2.pdf
> http://www.aeiveos.com/~bradbury/Authors/Computing/Moravec-H/index.html
> http://www.aeiveos.com/~bradbury/Authors/Computing/Bekenstein-JD/index.html
> http://www.aeiveos.com/~bradbury/Authors/Computing/Bremermann-HJ/index.html
> http://www.phy.duke.edu/Courses/211/misc/lloyd-ultimate-physical-limits-of-computation.pdf
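
To put a number on the Bekenstein bound Robert cites, here is a
back-of-the-envelope sketch (again illustrative Python of my own; the 1-kg
mass and 10-cm radius are arbitrary example inputs):

import math

HBAR = 1.0546e-34   # reduced Planck constant, J*s
C = 2.998e8         # speed of light, m/s

def bekenstein_bits(mass_kg, radius_m):
    # Bekenstein bound: at most 2*pi*E*R / (hbar*c*ln 2) bits can be stored
    # in a sphere of radius R enclosing total energy E = m*c^2.
    energy = mass_kg * C**2
    return 2 * math.pi * energy * radius_m / (HBAR * C * math.log(2))

print("%.1e bits" % bekenstein_bits(1.0, 0.1))   # ~2.6e42 bits for 1 kg in 10 cm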

Hard limits on how small you can get?
http://www.hp.com/cgi-bin/ghp/go.pl?http://nano.xerox.com/nanotech/feynman.html
http://slashdot.org/articles/99/09/03/0845221.shtml
http://www.zyvex.com/nanotech/feynman.html
http://www.ul.ie/~childsp/CinA/Issue38/editorial38.htm#howsmall
Most of us are familiar with the SI units kilo (k), milli (m), micro (µ) etc.
But some of the prefixes for smaller quantities are unfamiliar. As analytical
techniques have become more sensitive, we can measure less and less in more
and more. The latest prefixes to be assigned are zepto (z) for 10^-21 and yocto
(y) for 10^-24. "Professor Dovichi, a leader in ultra-microanalytical
techniques, jokingly suggested that these units have been named in honour of
the lesser-known Marx brothers. Indeed, the baptismal habits of the
International Committee of Weights and Measures are mysterious, as the
bastardized Scandinavian femto- (f) and atto- (a) had already proved. Joke
aside, the need for units as small as 10^-21 and 10^-24 is a serious
and dramatic development. After all, a yoctomole is less than a single
molecule. It makes sense only as a probability in a quantum mechanical
context."
Gabor B. Levy
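
The yoctomole remark is easy to check with Avogadro's number (illustrative
Python of my own):

AVOGADRO = 6.022e23   # molecules per mole
print(1e-24 * AVOGADRO)   # ~0.6 -- a yoctomole is a fraction of one molecule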


