
[Condor-users] Standard universe question

I'm trying to run a standard universe job using a startup script (i.e. allow_startup_script = True in the submit file) together with the local_files option for this universe, but I'm not getting much joy. The startup script's task is to pull over a large compressed directory of input data and uncompress it before exec'ing the main job, and I can see it doing this successfully in the scratch space on the execute host. However, the main application then fails to find any files in the freshly uncompressed subdirectory, even though I'm fairly sure I've followed the syntax given in the 7.2 manual. Are these two options mutually incompatible?

If it's any help, here are the salient entries from the submit file:

  Universe             = standard
  Executable           = wrapper.sh
  allow_startup_script = True
  local_files          = ./weatherData/*
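
For context, the submit file in full looks roughly like this (the Output/Error/Log names and the Queue statement are ordinary boilerplate reconstructed here, not copied verbatim):

```
Universe             = standard
Executable           = wrapper.sh
allow_startup_script = True
local_files          = ./weatherData/*
Output               = job.out
Error                = job.err
Log                  = job.log
Queue
```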

and here's the wrapper script:


#!/bin/sh

# Pull over the compressed input data and executable
wget http://my.web.server/input_data.tgz
tar zxf input_data.tgz

# Make the unpacked data directory world-readable
chmod a+rx ./weather
chmod a+r ./weather/*
rm input_data.tgz

# Hand over to the real standard universe binary
exec ./condor_executable ${1+"$@"}
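For what it's worth, the unpack-and-chmod steps can be exercised in isolation like this (the stand-in tarball, the /tmp/unpack_test directory and the file data.txt are made up for the test; the real data comes from the wget above):

```shell
#!/bin/sh
# Standalone sketch: reproduce the wrapper's unpack steps locally to
# confirm that tar zxf yields a readable ./weather tree.
rm -rf /tmp/unpack_test && mkdir -p /tmp/unpack_test && cd /tmp/unpack_test

# Build a stand-in for input_data.tgz (made-up contents)
mkdir weather && echo "42" > weather/data.txt
tar zcf input_data.tgz weather && rm -r weather

# The wrapper's actual steps
tar zxf input_data.tgz
chmod a+rx ./weather
chmod a+r ./weather/*
rm input_data.tgz

# List the unpacked files, as the application would need to see them
ls ./weather
```

When something like this succeeds standalone but the job still can't read ./weather, it suggests the working directory seen by condor_executable differs from the one the wrapper ran in.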


The bundle input_data.tgz contains both the real executable, "condor_executable", and the tarred directory "weather" (which holds all the data files). condor_executable starts up just fine and can access files on the submit machine like a well-behaved standard universe job, but it's the files in the local scratch space that it can't find: the application reports that it can't read ./weather. If I instead leave that directory on the submit host and don't nominate it via local_files, it works fine. I'm using 7.2.5 under Debian stable.

Any help would be appreciated, chaps.


The Cavendish Laboratory, University of Cambridge,
J J Thomson Avenue, Cambridge, CB3 0HE, UK
Tel. (+44/0) 1223 746627