
RE: [Condor-users] running a perl/script app using condor...

Using the submit description file I could have something like:
universe = vanilla
executable = foo.pl
output = <<<<<<<< (not sure that I even need this, as the perl
                  scripts generate their own output...)

The output statement names a file into which Condor redirects the stdout of your perl script. If your scripts never print to stdout but only create files, then you don't need it.
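For example, a complete submit description file along those lines might look like the sketch below (foo.pl comes from the example above; the .out/.err/.log file names are placeholders you can change):

```
universe   = vanilla
executable = foo.pl
# capture anything the script prints to stdout/stderr
output     = foo.out
error      = foo.err
# Condor's own log of the job's lifecycle
log        = foo.log
queue
```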

I'm trying to work out how to set this up: each perl script I have generates
its own output file, but I don't know the name of the output file before the
script runs. I'd like to be able to capture all the output files in the same
location, separate from the input files.

So, my question is: how can I set this kind of environment up?

Your question is a bit unclear, but...

As I understand the docs, I can set up an NFS share to hold my input/output
files. Using condor_submit, I can then direct the appropriate input file to
be read from the NFS 'input' share, while at the same time directing Condor
to place any output files in the NFS 'output' share.

  'nfs input share' - /nfs/input
  'nfs output share' - /nfs/output

The way Condor runs your job in an NFS environment is like this:

1) You cd to /blah/blim, which is in NFS.
2) You create a submit file, and any input files.
3) You submit the job.
4) The job runs in the same directory on the other computer.
5) You find your output files in the same directory.

If you don't have a shared file system, Condor can optionally transfer your files back and forth, but to you it will look the same: the output files will end up back in the directory you submitted from.
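To turn file transfer on, you add a few lines to the submit file. A sketch (input.dat is a hypothetical input file name; list several files comma-separated if needed):

```
universe                = vanilla
executable              = foo.pl
should_transfer_files   = YES
when_to_transfer_output = ON_EXIT
transfer_input_files    = input.dat
queue
```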

If you want your files to end up elsewhere, there are a couple of ways to do it.

1) Change your perl scripts so that they can take an argument specifying a directory in which to place their output. Pass this directory to the script by using the "Arguments = " option in the submit file.
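For instance, if the script reads its output directory from @ARGV (as $ARGV[0]) and prefixes its output file names with it, the submit file only needs one extra line; /nfs/output is just the share name from the question above:

```
universe   = vanilla
executable = foo.pl
# passed to foo.pl as $ARGV[0]
arguments  = /nfs/output
queue
```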

2) Use DAGMan to run two jobs in order:
  a) Job A: Run the perl script.
  b) Job B: Run another quick script in the scheduler universe that will
     move the files. The scheduler universe isn't well documented, but
     just specify "universe = scheduler", and your job will run immediately,
     will not use matchmaking, and will run on the computer from which
     you submitted your job.
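As a sketch of option 2: the DAG file would contain three lines (`JOB A run_perl.sub`, `JOB B move_files.sub`, `PARENT A CHILD B`), where move_files.sub says "universe = scheduler" and runs a small mover script like the one below. The results_*.txt glob and the directory names are assumptions for illustration; match them to whatever your perl scripts actually write.

```shell
#!/bin/sh
# move_outputs: hypothetical "job B" logic. Moves a perl script's
# output files (assumed to match results_*.txt) from the submit
# directory into a separate output directory.
move_outputs() {
    src=$1
    dest=$2
    mkdir -p "$dest"
    mv "$src"/results_*.txt "$dest"/
}

# tiny demonstration: fake one output file, then move it
mkdir -p demo_src
echo "demo" > demo_src/results_1.txt
move_outputs demo_src demo_out
```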