The Last Computer

From: Spudboy100@aol.com
Date: Wed Aug 30 2000 - 16:57:37 MDT


www.Newscientist.com

Here is an article that focuses on theorist Seth Lloyd. A goodie for
Omega Pointers and Singularity mavens.

                               The Last Computer
One day you'll be sitting in front of a searing-hot nuclear fireball instead
of that inert grey box, says Marcus Chown

SETH LLOYD has seen the future of computing, and it's bright. Blindingly
bright. For, according to Lloyd, the ultimate computer will be nothing like
an IBM ThinkPad and everything like a "billion-degree piece of the big bang".

Before you dismiss the idea, just consider the awesome power you would have
at your disposal. According to Lloyd's calculations, the ultimate laptop
could solve in less than a nanosecond a calculation that would take any
state-of-the-art computer the age of the Universe to complete.

Admittedly, it might be a bit inconvenient putting a nuclear fireball on your
desk. But that is only the most ordinary, conventional kind of ultimate
computer--the alternative could be something stranger still.

The train of thought that led to these bizarre physical extremes started from
a simple observation. Gordon Moore, a co-founder of the computer chip maker
Intel, noticed in 1965 that the number of transistors per square inch on
chips had doubled every 18 months or so since the integrated circuit was
invented. This trend has continued. Experts are divided over how much longer
the law will hold, but Lloyd for one is fed up with people constantly writing
it off. "People have been claiming the law is about to break down every
decade since it was formulated," says Lloyd, a physicist based at MIT. "But
they've all been wrong. I thought, let's see where Moore's law has to stop
and can go no further. Let's find the limits that no amount of human
ingenuity will ever be able to get around."

To begin with, he wasn't too concerned with the details of how the ultimate
computer might work--those can be sorted out by the engineers of the future.
Instead, he stuck to considering basic physical quantities such as energy,
volume and temperature (Nature, vol 406, p 1047).

The speed of a computer, Lloyd realised, is limited by the total energy
available to it. The argument for this is rather subtle. A computer performs
a logical operation by flipping a "0" to a "1" or vice versa. But there is a
limit to how fast this can be done because of the need to change a physical
state representing a "0" to a state representing a "1". In the quantum world
any object, including a computer, is simply a packet of waves of various
frequencies all superimposed. Frequency is linked to energy by Planck's
constant, so if the wave packet has a wide range of energies, it is made up
of a large range of different frequencies. As these waves interfere with one
another, the overall amplitude can change very fast. On the other hand, a
small energy spread means a narrow range of frequencies, and much slower
changes in state.

Because a computer can't contain negative energies, the spread in energy of a
bit cannot be greater than its total energy. In 1998, Norman Margolus and Lev
Levitin of MIT calculated that the minimum time for a bit to flip is Planck's
constant divided by four times the energy.

Lloyd has built on Margolus's work by considering a hypothetical 1-kilogram
laptop. Then the maximum energy available is a quantity famously given by the
formula E = mc^2. "If this mass-energy were turned into a form such as radiant
energy, you'd have 10^17 joules in photons," says Lloyd. "And, if you put all
this energy in a single bit, it could flip in 10^-51 seconds." But quantum
physicists believe the shortest possible time for any event to occur is the
"Planck time" of 10^-43 seconds. "Something is screwy," says Lloyd.

In practice, computers do not have a single bit of memory but lots of bits.
If the energy of the 1-kilogram laptop were spread among more than a billion
bits, each would flip more slowly than the Planck time. "Although each bit
would flip more slowly, there would be more of them," says Lloyd, "so the
total number of bit-flips per second would be the same."

So the ultimate laptop, one that has converted all its mass-energy to
radiation, would be able to carry out a mind-boggling 10^51 operations per
second. Compare this with today's standard laptop, which has a clock speed of
about 500 megahertz and carries out up to 1000 parallel operations each
cycle--a total of about 10^12 operations per second. The ultimate laptop would
be 10^39 times faster. If even that is too slow for you, you can add more
mass--a 1000-kilogram computer would be a thousand times faster, for example.
What's more, says Lloyd, the ultimate laptop would be a quantum computer,
able to exploit an unimaginable number of superimposed states, solving
certain kinds of problem (such as factorising large numbers) far faster than
a classical computer.
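Plugging in the numbers makes the comparison concrete. This sketch assumes, as the article does, that the total flip rate 4E/h is independent of how the energy is divided among bits:

```python
h = 6.626e-34   # Planck's constant, J s
c = 2.998e8     # speed of light, m/s

E = 1.0 * c**2                 # mass-energy of the 1-kilogram laptop, J
ops_ultimate = 4 * E / h       # total bit-flips per second (Margolus-Levitin)
ops_today = 500e6 * 1000       # 500 MHz clock times ~1000 parallel operations

print(f"ultimate: {ops_ultimate:.1e} ops/s")        # ~5e50, of order 10^51
print(f"speed-up: {ops_ultimate / ops_today:.1e}")  # ~1e39
```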

  
[Photograph: Ian Jackson]
 
Why then are today's laptops so damned slow? The simple answer, says Lloyd,
is they use only the electromagnetic energy of electrons moving through
transistors, and this energy is dwarfed by the energy locked away in the mass
of the computer, which provides nothing more than the scaffolding to keep a
computer stable. The ultimate laptop would have all of its available energy
in processing interactions and none of its energy in dumb mass.

With that sort of computing power, astrophysicists could simulate the whole
Universe on the scale of stars, and physicists could simulate a large lump of
matter on the scale of individual atoms--and just think of the possibilities
for realistic computer games.

So much for speed. What limits memory? The short answer is entropy. This is
the degree of disorder, or randomness, in a system. Entropy is intimately
connected to information, because information needs disorder: a smooth,
ordered system has almost no information content.

State limits

Entropy is linked to the number of distinguishable states a system can have
by the equation inscribed on Boltzmann's headstone S = k ln W. Entropy (S) is
the natural logarithm of the number of states (W) multiplied by Boltzmann's
constant (k). Equally, to store a lot of information, you need a lot of
distinguishable states. To register one bit of information, you need two
states, one representing "on", the other "off". Similarly, 2 bits require 4
states, 3 bits 8 states, and so on. In short, both the entropy of a system
and its information content are proportional to the logarithm of the number
of states. "The answer to the question--what is the maximum memory of a
computer?--can actually be found on Boltzmann's headstone," says Lloyd.
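The correspondence between states, entropy and bits can be written out directly. This is just Boltzmann's formula plus the base-2 logarithm, nothing specific to Lloyd's analysis:

```python
import math

k = 1.381e-23   # Boltzmann's constant, J/K

def entropy(W):
    """S = k ln W -- the equation on Boltzmann's headstone."""
    return k * math.log(W)

def bits(W):
    """n bits of memory need W = 2**n distinguishable states."""
    return math.log2(W)

print(bits(2), bits(4), bits(8))   # 1.0 2.0 3.0
# Entropy and information differ only by the constant factor k ln 2:
print(entropy(1024) / (k * math.log(2)))   # 10 bits
```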

So how much entropy does one of Lloyd's computers have? It depends on the
volume of the computer, as well as its energy. Broadly speaking, the more
volume, the more possible positions of particles in the computer, so the more
available states. For his ultimate laptop, Lloyd picked a convenient 1-litre
size.

The exact calculation also depends on how many different kinds of particle
are knocking around inside the computer. "If all the mass-energy of the
computer is used, we're talking about converting it into light," he says. "So
what we need to calculate is the number of distinguishable states available
to a box of light--this is a calculation carried out by Max Planck for a
so-called black body a century ago."

It turns out that a litre of light could store about 10^31 bits, 10^20 times
as much as a modern 10-gigabyte hard drive. Today's laptops store so little
information, says Lloyd, because they store it in an extremely redundant
fashion. A single bit on your hard drive is stored by a "magnetic domain"
which may contain millions of atoms.
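Planck's black-body formulas let anyone redo this estimate. The sketch below assumes, as Lloyd does, that the full mass-energy of a 1-kilogram laptop becomes a photon gas in a 1-litre box, and uses the standard photon-gas relations u = aT^4 and s = (4/3)aT^3 (the constants are textbook values, not from the article):

```python
import math

k = 1.381e-23   # Boltzmann constant, J/K
a = 7.566e-16   # radiation constant, J m^-3 K^-4
c = 2.998e8     # speed of light, m/s

E = 1.0 * c**2  # mass-energy of 1 kg, ~9e16 J
V = 1e-3        # 1 litre, in m^3

T = (E / (a * V)) ** 0.25       # from energy density u = a T^4
S = (4 / 3) * a * V * T**3      # photon-gas entropy
n_bits = S / (k * math.log(2))  # convert entropy to information

print(f"T    = {T:.1e} K")       # ~6e8 K: around a billion degrees
print(f"bits = {n_bits:.1e}")    # ~2e31: of order 10^31 bits
```

Both of the article's headline numbers, the billion-degree temperature and the ~10^31-bit memory, drop out of this one calculation.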

All in all, the ultimate laptop would look pretty weird. With all that
radiant energy squeezed into such a small space, it would be fantastically
hot, around a billion degrees. The "light" would actually be high-energy
gamma-ray photons. "Controlling all that energy--that's the challenge," says
Lloyd.

Assuming we could contain this sizzling soup, it might work something like
this. Information would be stored in the positions and trajectories of
gamma-ray photons, and processed by collisions between these photons and the
few electrons and positrons also floating around.

 
[Photograph: Ian Jackson]
 

Readout would be easy. "You simply open up a hole in the side of the box,"
says Lloyd. "The photons come out at the speed of light, and you record the
sequence of clicks on a gamma-ray detector." Input would require some sort of
controlled gamma-ray generator. Of course, all these accessories would take
useful mass-energy away from the central processor, but Lloyd assumes it will
eventually be possible to make them very small and light.

But whatever cunning technology is used for input and output, this version of
the ultimate laptop has a serious design flaw. Information can't be moved in
and out of the computer faster than light, so assuming that the 1-litre
laptop is a cube with 10-centimetre sides, all of its 10^31 or so bits of
memory could be dumped in the time taken for light to travel 10
centimetres--about 3×10^-10 seconds.

That gives a data transfer rate of nearly 10^41 bits per second. But
potentially, the computer can perform a total of 10^51 operations in that
second. The same goes for any substantial chunk of the computer--it can do
far more calculations than it can communicate to other parts of the computer.
So each subsection would have to work independently. "This is a highly
parallel machine," says Lloyd.

And this information bottleneck has serious implications for error
correction. Error-correcting codes check computer calculations to find out
whether something's gone wrong. But in the ultimate laptop, any erroneous
bits would have to be physically taken out of the computer, radiated away and
replaced by new bits. This version of the ultimate laptop can discard no more
than 10^41 errors while it makes 10^51 calculations, so it can tolerate only
one error for every ten billion operations. If that accuracy can't be
achieved, the 1-litre laptop would have to operate at below the ultimate
speed limit.
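The bandwidth and error-budget figures follow from the same round numbers. A quick sketch using the article's order-of-magnitude values rather than exact constants:

```python
c = 3e8                      # speed of light, m/s
L = 0.1                      # 10-centimetre box
t_cross = L / c              # light-crossing time, ~3e-10 s

n_bits = 1e31                # memory of the 1-litre photon computer
io_rate = n_bits / t_cross   # bits per second the box can dump

ops_per_s = 1e51             # total operations per second
print(f"I/O rate: {io_rate:.0e} bits/s")  # ~3e40, "nearly 10^41"
# Error budget: discardable errors per operation, a few times 10^-11,
# i.e. roughly one correctable error per ten billion operations.
print(f"error budget: {io_rate / ops_per_s:.0e}")
```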

So is there any way to increase the input/output rate? "Yes," says Lloyd.
"Make the laptop smaller." If the size is reduced it takes less time for
information to move around, and there is less memory to be moved, so the
computer becomes more serial. In general, serial calculations are more
versatile, because highly parallel computers only work if the input and
output are brief, containing far less information than the total amount
processed in the course of the calculation.

The 1-litre ultimate laptop is already at about a billion degrees. But as it
is compressed the temperature rises and other, more exotic, particles can be
conjured into existence. "Computers of the future may be high-power
relativistic devices similar to particle accelerators," says Walter Simmons
of the University of Hawaii at Manoa. Simmons and his colleagues Sandip
Pakvasa and Xerxes Tata have explored the far future of so-called
relativistic computing, involving interactions between known physical
particles. But they have not pursued that future to the giddy limits
envisaged by Lloyd. "As the temperature rises and ever-more exotic particles
can be created, our knowledge of the physics gets shakier and shakier," he
says. Fortunately, though, there comes a time when the physics becomes simple
again.

If you keep squashing the computer, eventually it will turn into a black
hole. The whole mass of the 1-kilogram laptop would then be squeezed into a
volume little more than 10^-27 metres across. How can this still be a
computer?

Stephen Hawking of the University of Cambridge theorised in 1974 that a black
hole should evaporate, emitting light and elementary particles from its
horizon, the surface that marks the point of no return for objects falling
in. Thermodynamics says that any radiating body has entropy, and in 1972
Jacob Bekenstein had calculated how much a black hole must have. If we equate
entropy with information, this means that a 1-kilogram black hole can store
around 10^16 bits.
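This estimate can be reproduced from the standard Bekenstein-Hawking entropy, S = kAc^3/(4Għ), where A is the horizon area. A rough sketch with textbook constants (the formula is standard physics, not something spelled out in the article):

```python
import math

G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8       # speed of light, m/s
hbar = 1.055e-34  # reduced Planck constant, J s
k = 1.381e-23     # Boltzmann constant, J/K

M = 1.0                              # 1-kilogram black hole
r = 2 * G * M / c**2                 # Schwarzschild radius
A = 4 * math.pi * r**2               # horizon area
S = k * A * c**3 / (4 * G * hbar)    # Bekenstein-Hawking entropy
n_bits = S / (k * math.log(2))

print(f"diameter = {2 * r:.1e} m")   # ~3e-27 m: "little more than 10^-27"
print(f"bits     = {n_bits:.1e}")    # ~4e16: around 10^16 bits
```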

According to conventional physics, as espoused by Hawking among others, any
information that goes into a black hole is lost to the rest of the Universe.
That would rule out using a black hole as a computer. But string theorists
think otherwise. "Hawking raised an important question," says Gordon Kane of
the University of Michigan, Ann Arbor. "But there is evidence that string
theory will show that something happens to preserve the information." Lloyd
believes that information about how a black hole was formed may be written on
the horizon, perhaps in the form of impressed strings, like flattened
spaghetti.

Because of this, Lloyd thinks it could be possible to use a black hole as the
ultimate computer. At this black hole limit, it turns out that the time
required to communicate around the hole is exactly the same as the time
needed to flip each bit. "In other words, the black-hole computer is the
ultimate serial computer," says Lloyd. He believes that this apparent
coincidence hints at another deep link between physics and information.
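That coincidence can be checked to order of magnitude. In this sketch the black hole's energy is shared equally among its ~10^16 bits, and the resulting Margolus-Levitin flip time is compared with the light-crossing time of the hole; both land near 10^-35 seconds (a rough estimate, not Lloyd's exact derivation):

```python
h = 6.626e-34   # Planck's constant, J s
G = 6.674e-11   # gravitational constant
c = 2.998e8     # speed of light, m/s

M = 1.0
E = M * c**2
n_bits = 4e16                    # memory from the Bekenstein bound
r = 2 * G * M / c**2             # Schwarzschild radius

t_flip = h / (4 * (E / n_bits))  # flip time, energy shared per bit
t_comm = 2 * r / c               # light-crossing time of the hole

# Both are within an order of magnitude of 1e-35 s: communication keeps
# pace with computation, so the machine is effectively serial.
print(f"t_flip = {t_flip:.0e} s, t_comm = {t_comm:.0e} s")
```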

Ideas for how this black-hole computer would process information are even
vaguer than for the box of gamma rays. The input would be the initial state
of the material, the program would be how that material is forced to collapse
into a black hole. The output would be somehow encoded in the Hawking
radiation, emitted in a rapid blast as the hole evaporates. This is a one-off
computer, exploding with the answer to its calculation.

So here is where Moore's law must end, with a billion-degree laptop or an
exploding submicroscopic black hole. "The truth is we have no notion of how
to attain these ultimate limits," admits Lloyd. But don't despair--put your
faith in human ingenuity. If the rate of progress doesn't slow, we'll reach
these ultimate physical limits in just two hundred years' time.
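The two-hundred-year figure follows from Moore's law itself. Assuming performance keeps doubling every 18 months, going from today's ~10^12 operations per second to the ultimate ~10^51:

```python
import math

speedup = 1e51 / 1e12              # gap between today and the limit
doublings = math.log2(speedup)     # ~130 doublings needed
years = doublings * 1.5            # one doubling per 18 months

print(f"{doublings:.0f} doublings, ~{years:.0f} years")  # ~194 years
```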
 

 

From New Scientist magazine, 2 September 2000.

------------------------------------------------------------------------------

© Copyright New Scientist, RBI Limited 2000
 
 


This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 15:30:40 MST