
Re: [Condor-users] Does Stork avoid retransferring data?



On Tue, 11 Jan 2011, Rowe, Thomas wrote:

I have to run a simulation about a thousand times with different seeds. The simulation executable and data total about 100MB. This sounds like a job for DAGMan & Stork, because this 100MB collection of files needs to get copied around reliably, and some large output files need to be transferred back to the originating machine reliably.

My question: Do Stork and/or DAGMan do anything intelligent to avoid recopying files? The input files are identical for all thousand runs; only the seed varies. But I would like to have Condor manage each run individually. So do all the data and the executable get copied around a thousand times and cleaned up after each run? If the thousand reps are children of the Stork job that transfers files into place, does everything just work with no extraneous recopying of input data?

DAGMan itself doesn't do anything special to handle the data transfers -- if you have a DAG with 1000 jobs, it's basically as if you manually submitted the 1000 jobs in the right sequence.
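To make that concrete, a DAG of 1000 runs is just 1000 ordinary node entries, each pointing at a normal Condor submit file; a minimal sketch (file and node names are hypothetical) might look like:

```
# sim.dag -- each node is an ordinary Condor job; DAGMan adds no
# special data-transfer handling beyond what each submit file requests
JOB run1 sim.sub
VARS run1 seed="1"
JOB run2 sim.sub
VARS run2 seed="2"
# ... repeated up to run1000; every node's submit description
# transfers its own copy of the input files, so the 100MB of input
# is not shared or cached between nodes automatically
```

In other words, any dedup or caching of the shared input would have to come from the file-transfer mechanism the jobs use, not from DAGMan.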

Stork is no longer supported by the Condor team. If nobody on this list is able to give you an answer, you might want to contact the Stork people at LSU:

  http://stork.cct.lsu.edu/

Kent Wenger
Condor Team