
Re: [Condor-users] 6.8.0 and NFS Problem



Hi,

Can I ask why this is, by the way? "File locking" implies that multiple programs are reading from or writing to the same file and that some access control is needed. What if I have one condor_dagman job instance per DAG log file? I'm specifically spawning jobs (DAG or not) that don't write to the same log files (i.e., each submit file has only one "Queue" statement), and I'm telling my users to do the same.
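To illustrate the pattern I mean (file names and paths here are hypothetical), a minimal submit description with a single Queue statement whose user log lives on local disk rather than NFS might look like:

```
# Hypothetical submit file: one Queue statement, per-job log file
executable = my_program
universe   = vanilla
output     = my_program.out
error      = my_program.err
# Keep the user log off NFS, e.g. on local scratch space
log        = /local/scratch/my_program.log
queue
```

Since each such submit file queues exactly one job with its own log, no two writers ever contend for the same log file.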

I second the request for a feature that disables the warning message. While using Condor and DAGs, I've submitted hundreds of jobs involving several hierarchical DAGs and tens of thousands of sub-job instances, and I've never had a problem caused by my log files being on NFS. And some of my colleagues don't really like seeing new warning messages all over the place. :)

 - Armen

R. Kent Wenger wrote:
Be sure not to put log files for any DAG node jobs on NFS, though.

Kent Wenger
Condor Team

------------------------------------------------------------------------
--
Armen Babikyan
MIT Lincoln Laboratory
armenb@xxxxxxxxxx . 781-981-1796