
[Condor-users] Can a job send a trigger to let other jobs start?



Hi all,

I've run into a problem that I don't know how to tackle, nor what to advise
the user to do:

The jobs (usually 2000-4000) are started via dagman and initially read a lot
of data (about 2-3 GByte per job). After that they crunch through the loaded
data for a couple of hours. This initial start-up phase puts quite a lot of
load on the central data server, so we would like to have some handle to
limit it.

Dagman's maxjobs feature could address this; however, it only starts
additional jobs after jobs from the first batch have finished completely.
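Just for reference, this is the kind of throttling I mean (a rough sketch;
the DAG file name, the node name and the limit of 50 are made up):

    condor_submit_dag -maxjobs 50 analysis.dag

or, per category, inside the DAG file:

    # throttle all nodes assigned to the "dataload" category to 50 at a time
    CATEGORY NodeA dataload
    MAXJOBS dataload 50

Either way, DAGMan only submits a further job once one of the running jobs
has left the queue entirely, not once it is merely done loading its data.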

So my question is: is there a way to limit the number of jobs submitted
initially, and to send a "trigger" to dagman to start more jobs once the
running jobs are done loading their data sets?

Cheers

Carsten