From: Robert J. Bradbury (bradbury@aeiveos.com)
Date: Wed Dec 05 2001 - 18:59:20 MST
On Tue, 4 Dec 2001, Smigrodzki, Rafal wrote:
> ### I had some problems with Folding@home - the machine started warning
> about low virtual memory and crashed. I am not authorized to increase the
> paging file (now it's at 70MB), so I had to uninstall it. I still have it
> running on a couple of other computers (it's the kaufneuro7 team member) but
> I wonder if the program could put too much of a load on the LAN. Any advice?
Rafal -- you should seriously talk to whoever the system managers are
about increasing the paging file size. I assume you are running NT.
If so, that is way too small a size for any real work. It's probably
too small even for '95 or '98.
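For what it's worth, here's a quick Python sketch to eyeball the
numbers yourself without admin rights -- it just calls the Win32
GlobalMemoryStatusEx routine, so it's NT-family only:

    import ctypes

    class MEMORYSTATUSEX(ctypes.Structure):
        # Layout of the Win32 MEMORYSTATUSEX structure.
        _fields_ = [
            ("dwLength", ctypes.c_uint32),
            ("dwMemoryLoad", ctypes.c_uint32),
            ("ullTotalPhys", ctypes.c_uint64),
            ("ullAvailPhys", ctypes.c_uint64),
            ("ullTotalPageFile", ctypes.c_uint64),
            ("ullAvailPageFile", ctypes.c_uint64),
            ("ullTotalVirtual", ctypes.c_uint64),
            ("ullAvailVirtual", ctypes.c_uint64),
            ("ullAvailExtendedVirtual", ctypes.c_uint64),
        ]

    status = MEMORYSTATUSEX()
    status.dwLength = ctypes.sizeof(status)   # must be set before the call
    ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(status))
    print("memory load: %d%%" % status.dwMemoryLoad)
    print("physical: %d MB free of %d MB" %
          (status.ullAvailPhys >> 20, status.ullTotalPhys >> 20))
    print("page file: %d MB free of %d MB" %
          (status.ullAvailPageFile >> 20, status.ullTotalPageFile >> 20))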
I can get my Netscape process size under NT up to 50-60 MB on the
task manager (if I run it for a week or so and work with 40-60
windows). IE will push up into the 10-30 MB range, Acrobat can
be a pig at times as well. Add in all the other processes and
shared libraries and you need to be up around 128-256 MB of paging
space for "real work" (yea, I remember dose days when weal men
could do weal work in 64KB but da people writing code now-a-days
are *soooo* slappy). I bumped my machines to 256MB of main memory
and 512MB of paging space long ago when I kept hitting a limit
that prevented new windows from being opened. It turned out to be
a stupid hardwired limit in NT that Microsoft probably won't
fix for a year or two -- but I digress.
It looks to me like under NT it's taking 4-5 MB of memory while
on Linux it's more like 6-8 MB, perhaps due to a lack of accounting
for dynamically loaded libraries under NT (I've been told the NT
task manager sizes do not accurately reflect the real process size).
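On the Linux side you can see exactly what gets counted by reading
/proc/<pid>/status -- VmSize includes all the mapped shared libraries,
which would explain why the Linux number looks bigger. A quick sketch:

    import os

    def proc_mem(pid):
        # Pull the virtual (VmSize) and resident (VmRSS) sizes from
        # /proc; VmSize counts mapped shared libraries, which the NT
        # task manager apparently does not.
        sizes = {}
        for line in open("/proc/%d/status" % pid):
            if line.startswith(("VmSize", "VmRSS")):
                key, value = line.split(":", 1)
                sizes[key] = value.strip()
        return sizes

    print(proc_mem(os.getpid()))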
I can watch my LAN activity and it looks like that isn't an issue.
I had assumed that doing this would require a lot of net traffic
but apparently that isn't the way it works. They seem to
download a work unit (perhaps it's a small fragment of the protein),
do a lot of computing on it, checkpointing as they go. Perhaps
this is just modeling the folding of that fragment. Then when
the unit is done (after 1-6 days), they load the results back up
to Stanford and fetch the next Work Unit. Very little net overhead
at all. I think they periodically retry net access to download or
upload Work Units, so it still works even when a machine is off the
net for a while. It may be they have a larger "overlord" program
at Stanford that stitches individual Work Units back together to
get a complete picture.
[At least that is my impression so far.]
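If that impression is right, the client loop would look something
like the Python sketch below. Every name in it (fetch_work_unit,
fold_step and so on) is made up for illustration -- this is not the
actual Folding@home client code:

    import os, pickle, random, time

    CHECKPOINT = "checkpoint.dat"
    STEPS_PER_UNIT = 100   # stand-in for the days a real unit takes

    def fetch_work_unit():
        # Stand-in for downloading the next fragment from the server.
        return {"unit_id": random.randint(1, 10**6), "step": 0}

    def fold_step(state):
        # Stand-in for one slice of the folding computation.
        state["step"] += 1
        return state

    def upload_results(state):
        # Stand-in for shipping the finished unit back to Stanford.
        print("uploaded unit %(unit_id)d" % state)

    def retry(action, delay=600):
        # Keep retrying a network action; the machine may be offline
        # for a while, so just sleep and try again.
        while True:
            try:
                return action()
            except IOError:
                time.sleep(delay)

    def load_checkpoint():
        # Resume a half-finished unit after a crash or reboot.
        if os.path.exists(CHECKPOINT):
            with open(CHECKPOINT, "rb") as f:
                return pickle.load(f)
        return None

    state = load_checkpoint() or retry(fetch_work_unit)
    while state["step"] < STEPS_PER_UNIT:
        state = fold_step(state)
        with open(CHECKPOINT, "wb") as f:   # checkpoint as we go
            pickle.dump(state, f)
    retry(lambda: upload_results(state))    # results up, next unit down
    os.remove(CHECKPOINT)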
Robert