At the moment I'm submitting jobs to a remote SGE cluster this way:
I have a Condor job file that specifies the grid universe, the remote
cluster user and host, etc.
The executable specified in this Condor file is a bash script that
starts a small workflow.
The problem is that all the SGE parameters (the #$ directives) inside
my bash script are being ignored, and I need a different SGE
configuration for each workflow (a different queue, slot count,
maximum memory, etc.).
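For reference, the submit file looks roughly like this (the hostname, username, script name, and log file names below are placeholders, not the real values):

```
# Condor submit file, grid universe targeting a remote SGE cluster
# (Bosco-style "batch sge user@host" grid resource)
universe                = grid
grid_resource           = batch sge mastablasta@remote.cluster.example
executable              = aln_workflow.sh
output                  = aln.out
error                   = aln.err
log                     = aln.log
should_transfer_files   = YES
when_to_transfer_output = ON_EXIT
queue
```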
Here's my bash script called by Condor job file:
#!/bin/bash
## export environment variables
#$ -V
## job name
#$ -N aln_bosco
## run in current working directory
#$ -cwd
## merge stdout and stderr
#$ -j y
## select all.q queue
#$ -q all.q
bwa aln /home/mastablasta/ref/hg19.fa /home/mastablasta/input/HapMap_2.fastq -t 8 > /home/mastablasta/output/tmp/HapMap.right.sai
If the custom attribute scripts work by echoing extra directives, I
don't understand why this isn't happening for my jobs.
Are custom submit properties simple variables? I don't understand the
concept from the example in the manual:
+remote_cerequirements = NumJobs == 100
Can I pass my script a custom submit property like this for SGE?
+remote_cerequirements = Queue = all.q
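From my reading of the Bosco manual, each name in +remote_cerequirements is exported as an environment variable to a sge_local_submit_attributes.sh script on the remote cluster, and whatever that script echoes is inserted as #$ directives into the generated SGE job. This is a sketch of what I think such a script would look like; the attribute names (Queue, NumSlots), the parallel environment name (smp), and the demo values are my assumptions, not anything from my current setup:

```shell
#!/bin/sh
# Demo: simulate what Bosco would do for a submit file containing
#   +remote_cerequirements = Queue == "all.q" && NumSlots == 8
# (Bosco exports each attribute as an environment variable; here we
# set them by hand so the script can run standalone.)
Queue="all.q"
NumSlots=8

# Body of a hypothetical sge_local_submit_attributes.sh: echo one SGE
# directive per attribute that was passed through.
[ -n "$Queue" ]    && echo "#\$ -q $Queue"
# "smp" is an assumed parallel environment name; adjust to the cluster.
[ -n "$NumSlots" ] && echo "#\$ -pe smp $NumSlots"
```

Run standalone this prints the two directives that would be spliced into the remote SGE submit script.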