Diane Trout wrote: ...
> Other possibilities:
>  * Is there an easy way in a condor_submit script to limit the number of simultaneous jobs?
>  * Or can I query the load average of a different machine in my START expression?
> Or am I missing some better way of dealing with multi-gigabyte file format conversions?
The easiest approach is to submit your jobs as a DAG with -maxjobs and -maxidle, as Ian suggested.
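A minimal sketch of what that looks like, assuming a DAG of independent conversion jobs (the file names convert.dag and convert.sub, and the limits 4 and 10, are made up for illustration):

```shell
# A trivial DAG: three independent jobs, no PARENT/CHILD edges --
# we only use DAGMan here to throttle how many run at once.
# "convert.sub" is a hypothetical condor_submit file.
cat > convert.dag <<'EOF'
JOB convert0 convert.sub
JOB convert1 convert.sub
JOB convert2 convert.sub
EOF

# -maxjobs caps jobs running at once; -maxidle caps jobs sitting idle
# in the queue. (Commented out: requires a Condor installation.)
# condor_submit_dag -maxjobs 4 -maxidle 10 convert.dag
```

DAGMan then meters job submission for you, so no START-expression tricks on the execute machines are needed.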
An uneasy way: run an rsync daemon on the file server and put "max connections = some_number" in rsyncd.conf. Use rsync to copy the input file from the server to the execute node in a PRE script, in a loop: sleep for a second until rsync succeeds or the whole thing times out.
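A sketch of that retry loop (the server name, module name, and timeout are assumptions; "max connections" is the real rsyncd.conf parameter that makes extra clients fail and thus wait their turn):

```shell
#!/bin/sh
# On the file server, rsyncd.conf would contain something like
# (module name "data" is made up):
#   [data]
#   path = /export/data
#   max connections = 4

# retry CMD...: rerun CMD every second until it succeeds or
# TIMEOUT seconds (default 3600) have elapsed.
retry() {
    deadline=$(( $(date +%s) + ${TIMEOUT:-3600} ))
    until "$@"; do
        [ "$(date +%s)" -ge "$deadline" ] && return 1
        sleep 1
    done
}

# In the PRE script one would call, e.g. (hypothetical host/file):
# retry rsync rsync://fileserver/data/input.dat .
```

The POST script is the mirror image: retry an rsync push of the output file back to the server.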
Do the same for copying the output file back in the POST script. Experiment with values of "some_number".

Dima
--
Dimitri Maziuk
Programmer/sysadmin
BioMagResBank, UW-Madison -- http://www.bmrb.wisc.edu