[Condor-users] Submission of large numbers of jobs

If one has a large number of jobs to submit - say 100,000 jobs - what is the
recommended way of doing this?  Can simply submitting that number of jobs
cause problems?  These jobs will take their input from a shared filesystem.
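
For concreteness, I mean a single submit description along these lines (the
executable and paths here are just placeholders):

    universe   = vanilla
    executable = /shared/bin/myjob
    arguments  = $(Process)
    input      = /shared/input/job_$(Process).dat
    output     = /shared/output/job_$(Process).out
    error      = /shared/output/job_$(Process).err
    log        = jobs.log
    queue 100000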

From what I've read, each job takes roughly 10 KB of RAM in the schedd, so 100K
jobs would need about 1 GB of RAM just for the job queue.  If I can afford
that, is there anything else to worry about?

I note that DAGMan can throttle how many jobs it submits or keeps idle at a
time (maxjobs, maxidle):
http://research.cs.wisc.edu/condor/manual/v7.8/2_10DAGMan_Applications.html#SECTION003107400000000000000

Presumably it does this by drip-feeding the jobs in its DAG into the main
condor queue.  So would it be better to create a DAG with 100K nodes (and no
parent-child relationships) and submit the jobs that way?
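
Something like this flat DAG, perhaps (the node names, the submit file name,
and the throttle values below are placeholders I made up):

    # jobs.dag -- 100,000 independent nodes, no PARENT/CHILD lines
    JOB node0 myjob.submit
    VARS node0 index="0"
    JOB node1 myjob.submit
    VARS node1 index="1"
    ...
    JOB node99999 myjob.submit
    VARS node99999 index="99999"

generated by a small loop rather than by hand:

    for i in $(seq 0 99999); do
        echo "JOB node$i myjob.submit"
        echo "VARS node$i index=\"$i\""
    done > jobs.dag

and submitted with the throttles given on the command line:

    condor_submit_dag -maxidle 1000 -maxjobs 5000 jobs.dag

(myjob.submit would then use $(index) in place of $(Process) to pick each
node's input file.)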

Thanks,

Brian.