
Re: [Condor-users] Using transfer_output_files to return very large amounts of files.



I don't want Condor to automatically generate process_$(process).out for every job
from stdout. I want it to ignore stdout.
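If I understand the submit file documentation correctly, just leaving the output command out should do that, since stdout then goes to /dev/null instead of being captured; something along these lines, where the names are only placeholders:

    universe   = vanilla
    executable = my_program        # placeholder for the real executable
    # no "output =" line: stdout is discarded rather than written to
    # process_$(Process).out and transferred back
    log        = jobs.log
    queue 100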

Chris
----- Original Message ----- From: "Matt Hope" <matthew.hope@xxxxxxxxx>
To: "Condor-Users Mail List" <condor-users@xxxxxxxxxxx>
Sent: Friday, December 09, 2005 9:08 AM
Subject: Re: [Condor-users] Using transfer_output_files to return very large amounts of files.


On 12/8/05, Chris Miles <chrismiles@xxxxxxxxxxxxxxxx> wrote:
Each of the jobs I'm submitting produces 100 or so output files at a time,
so with 100 jobs you can see that being 1000 output files.

Is there any way to specify a wild card or something similar for the returned files?

The reason I am doing this is that my jobs run very quickly - a hundred jobs
can process in 20 seconds on my powerful cluster machines - and Condor does
not appear to like this vast number of jobs that run quickly.

Just allow Condor to work it out itself: place any files you *don't* want
returned in subdirectories and the ones you do want in the root of your
original working directory.
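
Something like this submit description is what I have in mind (the names are placeholders). With transfer_output_files left unset, Condor only brings back the new files it finds in the top level of the job's directory, so anything the job writes into a subdirectory stays behind:

    universe                = vanilla
    executable              = my_program          # placeholder
    should_transfer_files   = YES
    when_to_transfer_output = ON_EXIT
    # transfer_output_files deliberately left unset: only new top-level files
    # come back, so have the job write unwanted output into a subdirectory
    queue 100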

If you are using Globus this isn't an option, so you should probably
consider the alternative plan below.

Alternatively, consider making your job tar (or equivalent) all the files
together and compress them (this may make processing at the other end a bit
harder, though, so you may not want to spend the effort doing this).
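
As a rough sketch of that, the job's executable could be a small wrapper that runs the real program and then packs its outputs, so only one archive comes back per job (the program name and the *.out pattern are just placeholders):

    #!/usr/bin/env python
    # Hypothetical wrapper: run the real program, then bundle its output
    # files into a single compressed archive so only one file is returned.
    import glob
    import subprocess
    import tarfile

    # Run the actual job (placeholder name; add real arguments as needed).
    subprocess.check_call(["./my_real_program"])

    # Pack the per-job output files (placeholder pattern) into one tarball.
    with tarfile.open("results.tar.gz", "w:gz") as archive:
        for path in glob.glob("*.out"):
            archive.add(path)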

Matt

_______________________________________________
Condor-users mailing list
Condor-users@xxxxxxxxxxx
https://lists.cs.wisc.edu/mailman/listinfo/condor-users