
Re: [HTCondor-users] Specifying log files from python



I am just getting started with Condor. I do not have any submit files or any legacy system. I am finding it very difficult to get going. I have been working at it for two weeks and have yet to successfully submit a job.



On Wed, Dec 27, 2017 at 1:26 PM Todd Tannenbaum <tannenba@xxxxxxxxxxx> wrote:
If you are using HTCondor v8.6.6 or above, I recommend using the htcondor.Submit() class to submit jobs instead of the Schedd() class. With the Submit class you do not have to convert your submit file into a ClassAd; the Submit class does that for you. So if you have a condor_submit file that works, you are likely good to go. In my opinion this is much easier than the more primitive submit method in the Schedd class. Details on the Submit class are in the manual, and there is an example of submitting a job with the Submit class in this ticket:
https://htcondor-wiki.cs.wisc.edu/index.cgi/tktview?tn=6420
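[Editorially, the Submit-class approach described above can be sketched roughly as follows. This is a minimal illustration, not code from the thread: the file names are hypothetical, and the submission calls are shown as comments because they require a running schedd. It assumes the htcondor Python bindings shipped with v8.6.6+.]

```python
# Minimal sketch of the htcondor.Submit() approach (HTCondor >= 8.6.6).
# The submit description uses the same keywords as a condor_submit file,
# so a working submit file translates almost line-for-line.
submit_description = {
    "executable": "test.sh",           # hypothetical script
    "arguments": "foo bar",
    "log": "test.log",
    "output": "test.out.$(Process)",   # per-process output file
    "error": "test.err",
}

# With the bindings installed and a schedd running, submission would be:
#   import htcondor
#   sub = htcondor.Submit(submit_description)
#   with htcondor.Schedd().transaction() as txn:
#       sub.queue(txn, count=1)
print(submit_description["output"])
```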

Hope this helps
Todd

Sent from my iPhone

On Dec 27, 2017, at 11:59 AM, Larry Martell <larry.martell@xxxxxxxxx> wrote:

On this site: http://osgtech.blogspot.com/2014/03/submitting-jobs-to-htcondor-using-python.html
I see this:

For example, consider the following submit file:

executable = test.sh
arguments = foo bar
log = test.log
output = test.out.$(Process)
error = test.err
transfer_output_files = output
should_transfer_files = yes
queue 1

The equivalent submit ClassAd is:

[
   Cmd = "test.sh";
   Arguments = "foo bar";
   UserLog = "test.log";
   Out = strcat("test.out", ProcId);
   Err = "test.err";
   TransferOutput = "output";
   ShouldTransferFiles = "YES";
]

So in my Python code I specify the logging in my ClassAd like this:

[
   Err = "strcat(\"/Staging/Repos/CAPbase/cluster/logs/compute_radiology.err\", ProcId)";
   Out = "strcat(\"/Staging/Repos/CAPbase/cluster/logs/compute_radiology.out\", ProcId)";
   UserLog = "strcat(\"/Staging/Repos/CAPbase/cluster/logs/compute_radiology.log\", ProcId)";
]

But I see this in the SchedLog:

12/27/17 12:34:15 (pid:3755290) WriteUserLog::initialize:
safe_open_wrapper("/opt/capcompute/util/strcat("/Staging/Repos/CAPbase/cluster/logs/compute_radiology.log",
ProcId)") failed - errno 2 (No such file or directory)

/opt/capcompute/util/ is the directory the Python script that submits
the job runs from.

What am I doing wrong here? How do I properly specify the path and
file name for the logs?
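[Editorial note on the error above: the SchedLog shows the literal text strcat("...", ProcId) being treated as a filename, which points at the quoting. In the ClassAd, strcat(...) is wrapped in double quotes, so it is stored as a string literal rather than as a ClassAd expression to be evaluated. A small sketch of the distinction, with a hypothetical short path; the classad.ExprTree call is shown as a comment since it requires the bindings:]

```python
# The failing ClassAd stores the strcat call as a quoted string, so
# HTCondor never evaluates it and tries to open the literal text as a
# file.  An expression must be stored unquoted.  With the Python
# bindings one way to do that (assumption, not from the thread) is:
#
#   import classad
#   ad["Out"] = classad.ExprTree('strcat("compute_radiology.out", ProcId)')
#
# Plain-string illustration of the two forms:
as_string = '"strcat(\\"test.out\\", ProcId)"'  # quoted: literal filename
as_expression = 'strcat("test.out", ProcId)'    # unquoted: evaluated per job
print(as_expression)
```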