> >resource ...
>
> Wei then asked:
> >Would you please explain?
>
> John Clark also responded:
> >I don't know what you mean by the "cost" of entropy. Entropy is free,
> >energy is not, because energy is conserved, entropy is not. I don't
> >want to stop the growth of entropy; it's possible that entropy will
> >keep on increasing forever, and I certainly hope it does, because the
> >only alternative is the heat death of the universe. Free energy is
> >related to entropy, but if the universe is open then it's energy any
> >intelligence must be very stingy with if it wants to survive for long.
> >...
> >With reversible computing, ... Landauer, Bennett and Merkle have shown
> >that the amount of energy needed to make a calculation can be made
> >arbitrarily small by slowing down the calculation a little.
>
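One standard way to see the "slow down the calculation" result is adiabatic switching: driving a capacitive node through a resistance with a voltage ramp much longer than the RC time dissipates roughly (RC/t) C V^2 per operation, which can be pushed arbitrarily low by stretching the ramp. A minimal Python sketch of that scaling, with illustrative component values (the numbers and the framing are assumptions, not taken from the post):

    def abrupt_dissipation(C, V):
        # Conventional, irreversible charging of a node: about (1/2) C V^2 lost per switch.
        return 0.5 * C * V**2

    def adiabatic_dissipation(C, V, R, t_ramp):
        # Adiabatic (quasi-reversible) charging with a slow ramp, t_ramp >> R*C:
        # dissipation falls roughly as (R*C / t_ramp) * C * V^2, vanishing as t_ramp grows.
        return (R * C / t_ramp) * C * V**2

    C, V, R = 1e-15, 1.0, 1e3   # illustrative: 1 fF node, 1 V swing, 1 kOhm driver
    for t_ramp in (1e-9, 1e-6, 1e-3):
        print(f"ramp {t_ramp:.0e} s -> {adiabatic_dissipation(C, V, R, t_ramp):.2e} J per op")
    print(f"abrupt switching     -> {abrupt_dissipation(C, V):.2e} J per op")

Each thousandfold slowdown buys a thousandfold drop in dissipation per operation, which is the sense in which the energy cost "can be made arbitrarily small."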
> Wei posed the problem of cooling to get rid of excess local heat, by
> making contact with distant sources of negentropy (= max sys entropy -
> actual entropy) such as the cosmic background. For this problem, the
> conservation of energy constraint is much less important than the
> constraint that total entropy cannot decrease. In making my contribution
> to reversible computing (http://hanson.berkeley.edu/reverse.html) I
> learned enough to say with great confidence that there is no particular
> advantage to erasing bits at lower temperatures. If it were otherwise,
> you could make a perpetual motion machine: erase bits (= replace unknown
> bits with known bits) at low temps and then reverse the operation
> (replace known bits with unknown bits) at high temps. By the "costs less
> energy" intuition this cycle would create available energy.
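To put rough numbers on that cycle: under the naive "k T ln 2 of energy per erased bit" reading of Landauer's bound, erasing at a cold temperature and un-erasing at a hot one would appear to net k (T_hot - T_cold) ln 2 of available energy per bit from nothing, which is the perpetual-motion absurdity above. A quick sketch with illustrative temperatures (the specific numbers are assumptions, not from the post):

    import math

    k = 1.380649e-23  # Boltzmann constant, J/K

    def kT_ln2(T):
        # Energy scale of one bit at temperature T under the naive "costs less energy" reading.
        return k * T * math.log(2)

    T_cold, T_hot = 3.0, 300.0      # illustrative: cosmic-background-like vs. room temperature
    erase_cost   = kT_ln2(T_cold)   # pay this to erase (unknown -> known) at low temperature
    unerase_gain = kT_ln2(T_hot)    # naive reading: recover this by un-erasing (known -> unknown) at high temperature
    print(f"apparent free lunch per bit: {unerase_gain - erase_cost:.3e} J")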
>
>
> Robin Hanson
> hanson@econ.berkeley.edu http://hanson.berkeley.edu/
> RWJF Health Policy Scholar, Sch. of Public Health 510-643-1884
> 140 Warren Hall, UC Berkeley, CA 94720-7360 FAX: 510-643-8614