Re: [Condor-users] stopping a dag??

On Wed, Dec 15, 2004 at 04:58:31PM -0600, Alan De Smet wrote:
> Michael Remijan <remijan@xxxxxxxxxxxxx> wrote:
> > If I have some of the jobs in a dag running, is there any way to tell 
> > condor to finish those jobs but not start any new jobs from the dag?
> Not explicitly.  There are probably better workarounds, but I'm
> confident this will work: Replace all of your individual job
> submit files with a minimal one that will finish almost
> instantly.  Something like the following should work:
> executable=/bin/true
> 	# (Or false if you want DAGMan to consider the jobs to have failed)
> universe=scheduler
> queue
> When DAGMan tries to submit the new jobs it will actually submit
> the replaced versions which will return almost immediately.

Or, in 6.7.1 or later, you can add

noop_job = true

to your submit file and Condor will do everything but actually run
your job when you submit it (the submit and terminate events are still
written to the log).
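As a sketch of what that replacement submit file might look like
(assuming 6.7.1 or later; the executable line is still required by
condor_submit even though nothing is actually run):

executable = /bin/true
universe   = scheduler
noop_job   = true
queue

DAGMan then sees each node "complete" immediately without any real
work being done.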

You could also send a SIGSTOP to the DAGMan process running your DAG,
which would prevent it from submitting any new jobs. One problem you'd
have is that the schedd will eventually kill your DAGMan when it stops
reporting that it's alive. (You can hack around this.)
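The stop/resume mechanics are just standard Unix job control. A small
sketch, using a sleep process as a stand-in for condor_dagman (in
practice you would first find DAGMan's PID, e.g. with ps or condor_q):

```shell
# Start a stand-in for the DAGMan process.
sleep 60 &
pid=$!

# SIGSTOP pauses the process; a stopped DAGMan submits no new jobs.
kill -STOP "$pid"
state=$(ps -o stat= -p "$pid" | tr -d ' ')
echo "state after STOP: $state"   # a leading 'T' means stopped

# SIGCONT resumes it, and DAGMan would carry on submitting.
kill -CONT "$pid"
kill "$pid"
```

The same two signals (kill -STOP, then kill -CONT when you want the
DAG to continue) are what you would aim at the real condor_dagman PID.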

Otherwise, there's really no way to control the flow through the DAG after
DAGMan has started.