Perhaps someone can guide me to a solution for the following scenario:
I have a task that I would like to convert into a Condor job.
The task does the following:
1. Using my own C++/Matlab executable, I generate a binary file, named A1.
2. The binary file A1 is then processed and compared against 16,000 files I created previously. (This is where I would like to take advantage of my cluster.)
3. So file A1 is compared with files foo, bar, zoo, tee, etc.
4. There are no dependencies in the compare & analysis step, so if I had 16,000 slots I could run it all in one cycle.
5. Once file A1 has been checked and analysed against all 16,000 files, a new parameter file (param.xml) is created, along with a new binary file named A2.
6. The process then starts again, this time with file A2 and the new parameter file, until a new param.xml and a new binary file named A3 are created.
7. This step repeats over and over until I have file A5.
8. Once A5 is created, I convert it with a tool I have developed, and I call the resulting file B1.
9. I do the same with B1 as I did with A1, until I have B5.
10. B5 is then converted to C1.
11. C1 goes through the same process: C2...C5, then D1, D2...D5, then E1, E2... until E5.
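To make the structure clearer: each single iteration (e.g. A1 -> A2) is a fan-out/fan-in pattern. In DAGMan terms I imagine it looking roughly like this (the submit-file and node names are just placeholders I made up, and only 3 of the 16,000 compare jobs are shown):

```
# one_iteration.dag -- hypothetical sketch of the A1 -> A2 step
JOB  generate  generate.sub     # produce binary A1
JOB  cmp0      compare.sub      # compare A1 against file foo
JOB  cmp1      compare.sub      # compare A1 against file bar
JOB  cmp2      compare.sub      # ... 16,000 of these in reality
JOB  merge     merge.sub        # write param.xml and binary A2
PARENT generate CHILD cmp0 cmp1 cmp2
PARENT cmp0 cmp1 cmp2 CHILD merge
```

The question is how to repeat this pattern 25 times (A1..A5 through E1..E5) when each round depends on the output of the previous one.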
I was wondering how I can implement this via Condor. Does DAG support some kind of loop construct?
I also saw the tool Makeflow (= Make + Workflow), but I'm not sure it will help me.
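One workaround I'm considering, assuming DAGMan really has no native loop construct: since the iteration count is fixed (5 files x 5 letters), the whole workflow could be unrolled into one static DAG by a generator script. This is just a sketch with made-up node and submit-file names (generate.sub, compare.sub, merge.sub), not a working setup:

```python
#!/usr/bin/env python3
# Sketch: unroll the fixed A1..E5 loop into one static DAGMan file.
# All node names and submit files (generate.sub etc.) are hypothetical.

N_COMPARE = 16000   # compare jobs per iteration (16,000 in my case)
LETTERS = "ABCDE"   # A-series .. E-series
ITERS = 5           # A1..A5, B1..B5, ...

def make_dag(n_compare=N_COMPARE):
    """Return the text of a DAG that chains all 25 iterations."""
    lines = []
    prev_merge = None
    for letter in LETTERS:
        for i in range(1, ITERS + 1):
            tag = f"{letter}{i}"
            gen, merge = f"gen_{tag}", f"merge_{tag}"
            # one generate node per iteration (also covers the A5->B1
            # style conversion, which I'd fold into this step)
            lines.append(f"JOB {gen} generate.sub")
            lines.append(f'VARS {gen} input="{tag}"')
            # fan-out: independent compare jobs
            compares = []
            for j in range(n_compare):
                node = f"cmp_{tag}_{j}"
                compares.append(node)
                lines.append(f"JOB {node} compare.sub")
                lines.append(f'VARS {node} binary="{tag}" target="{j}"')
            # fan-in: merge results, write param.xml and the next binary
            lines.append(f"JOB {merge} merge.sub")
            lines.append(f"PARENT {gen} CHILD " + " ".join(compares))
            lines.append("PARENT " + " ".join(compares) + f" CHILD {merge}")
            # chain this iteration after the previous one
            if prev_merge:
                lines.append(f"PARENT {prev_merge} CHILD {gen}")
            prev_merge = merge
    return "\n".join(lines) + "\n"

if __name__ == "__main__":
    with open("workflow.dag", "w") as f:
        f.write(make_dag())
```

The resulting workflow.dag would then be submitted once with condor_submit_dag, letting DAGMan handle the sequencing while Condor spreads each 16,000-job fan-out over the available slots.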