[Condor-users] Re: split processors among multiple jobs
- Date: Tue, 15 Mar 2005 19:29:28 +0000
- From: Matt Hope <matthew.hope@xxxxxxxxx>
- Subject: [Condor-users] Re: split processors among multiple jobs
On Tue, 15 Mar 2005 07:48:10 -0800, Oliver Hotz <oliver@xxxxxxxxxxxx> wrote:
> is there any way, that if i submit 2 jobs at the same time (each with 30
> queues), so 60 jobs total, and i have 30 processors, that instead of
> processing job 1, wait till it finishes and then processes job 2, i want
> it to split the 30 processors equally and give 15 processors to job 1,
> and 15 processors to job 2 right from the start ?
> any way to do this ?
Ian's answers are all valid, and most are better long-term solutions; however, in the spirit of quick and dirty...
Another option, fast and easy to achieve (if a pain to do every time), is:
1) submit all jobs from cluster A and cluster B on hold, with priority 0
2) edit the job priorities of both clusters so that even- or odd-
numbered jobs have a higher priority
3) release your jobs
This should cause the two clusters to be interleaved.
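As a rough sketch of the steps above (the cluster IDs 123/124 and the submit-file names jobA.sub/jobB.sub are placeholders; substitute the cluster numbers condor_submit actually reports):

```shell
# 1) Submit both clusters on hold with priority 0, using condor_submit's
#    -append option rather than editing the submit files:
condor_submit -append "hold = true" -append "priority = 0" jobA.sub
condor_submit -append "hold = true" -append "priority = 0" jobB.sub

# 2) Raise the priority of, say, the even-numbered procs in each cluster
#    (123 and 124 stand in for the real cluster IDs):
condor_qedit -constraint 'ClusterId == 123 && ProcId % 2 == 0' JobPrio 10
condor_qedit -constraint 'ClusterId == 124 && ProcId % 2 == 0' JobPrio 10

# 3) Release everything; the scheduler should now interleave the clusters:
condor_release 123
condor_release 124
```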
This will not work well with other jobs in the queue (you would have
to apply the above instructions recursively and would eventually run
out of priority divisions, since job priorities currently range only
from -20 to +20).
That said, this will get the simple case working immediately.