
Re: [Condor-users] [Globus-discuss] Problem in Job Submission to Condor through GT4



Sentil,

Here are a couple things to try to debug this...

What is the globusrun-ws command used to submit the job? Maybe try a single-job submission instead of a multi-job submission. Perhaps there is a bug with multi-jobs; it could be in globusrun-ws, or in the ManagedMultiJobService (MMJS), ...

Does a similar multi-job (including getting the output back) work when targeting Fork instead of Condor?
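
For example, a single-job submission would look roughly like the sketch below (the factory URL and the job description file name are placeholders, not taken from your setup); swapping -Ft Condor for -Ft Fork gives the Fork comparison:

# Hypothetical single-job test (single_condor_test.xml is a made-up name)
globusrun-ws -submit \
    -F https://hostname:8443/wsrf/services/ManagedJobFactoryService \
    -Ft Condor -f single_condor_test.xml

# Same job against the Fork factory, for comparison
globusrun-ws -submit \
    -F https://hostname:8443/wsrf/services/ManagedJobFactoryService \
    -Ft Fork -f single_condor_test.xml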

Here is a link to debugging script executions:
http://www-unix.globus.org/toolkit/docs/4.0/execution/wsgram/developer-index.html#id2852800
With that you should be able to get the Condor job submission script that is created by the MEJS. Compare the GRAM-created job submission script to the one used when submitting the job directly to Condor, i.e. without Globus/GRAM involved.


-Stu

On Oct 12, 2005, at 9:20 AM, Natarajan, Senthil wrote:

Peter,
I ran ./setup-globus-job-manager-condor, and it updated condor.pm.
But the output file is still not copied back to the submitting machine.

I was wondering whether this problem happens only with GT4, or whether the
older versions of Globus have it as well. How are people actually
submitting jobs through Globus to run on a Condor pool?
Am I the only person having this problem?
Is there any way we can get this done? All I am trying to do is run a
simple shell script that sleeps and prints a counter value to an output
file. I want that output file copied back from the executing machine to
the machine the job was submitted from. (Condor does this if I submit
directly to the condor pool, but not through Globus.)
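
(For reference, a minimal sketch of such a sleep-and-count test job, as an illustration only and not the actual attached sh_loop script, might be:)

#!/bin/sh
# Illustrative stand-in for the test job described above: run for the
# number of seconds given as $1, appending a counter to an output file.
OUT=counter.out
i=0
while [ "$i" -lt "${1:-60}" ]; do
    echo "counter: $i" >> "$OUT"
    i=`expr $i + 1`
    sleep 1
done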


Here I am attaching:
1) condor.pm
2) the shell script (my test job)
3) the XML job description
4) the logfile dump
5) the condor job description file (analogous to the Globus XML job
description)
6) the output file obtained when I ran the same job by submitting directly
to the condor pool.


Could anyone please help me with this?
Thanks,
Senthil


-----Original Message-----
From: Peter G Lane [mailto:lane@xxxxxxxxxxx]
Sent: Tuesday, October 11, 2005 6:13 PM
To: Natarajan, Senthil
Cc: discuss@xxxxxxxxxx
Subject: RE: [Globus-discuss] Problem in Job Submission to Condor through GT4

On Tue, 2005-10-11 at 17:18 -0400, Natarajan, Senthil wrote:

Peter,
Anyway, I rebuilt all 4 packages with the force option but am still
having the same problem. I noticed (from the update time) that
$GLOBUS_LOCATION/lib/perl/Globus/GRAM/JobManager/condor.pm was not
modified by any of these package rebuilds; it is still showing the old
date and time.


I looked over the post-install instructions that I gave you and noticed
I block-copied the wrong setup script from my directory listing; sorry
about that. It should have been setup-globus-job-manager-condor.
Here's a dump of what I just did to verify that condor.pm was being
updated:


% ls -l $GLOBUS_LOCATION/lib/perl/Globus/GRAM/JobManager/condor.pm
-rw-r--r--  1 lane  lane  17K Oct 10 12:05
/usr/local/globus/globus-4.0.1/lib/perl/Globus/GRAM/JobManager/condor.pm
% cd $GLOBUS_LOCATION/setup/globus/
% ./setup-globus-job-manager-condor
checking for condor_submit... /opt/condor-6.7.6/bin/condor_submit
checking for condor_rm... /opt/condor-6.7.6/bin/condor_rm
find-condor-tools: creating ./config.status
config.status: creating condor.pm
config.status: creating globus-condor-print-config
% ls -l $GLOBUS_LOCATION/lib/perl/Globus/GRAM/JobManager/condor.pm
-rw-r--r--  1 lane  lane  17K Oct 11 16:06
/usr/local/globus/globus-4.0.1/lib/perl/Globus/GRAM/JobManager/condor.pm


Peter



Please find attached the condor.pm (after a forced rebuild of all 4 packages) and the logfile dump.

Here is the package build output:

[globus@gis14 globus-4.0.1]$ gpt-build --force
globus_wsrf_gram_service_java-0.78.tar.gz gcc32dbg
gpt-build ====> CHECKING BUILD DEPENDENCIES FOR
globus_wsrf_gram_service_java
gpt-build ====> Changing to
/usr/local/GridComputing/globus-4.0.1/BUILD/globus_wsrf_gram_service_java-0.78.0/
gpt-build ====> BUILDING globus_wsrf_gram_service_java
gpt-build ====> Changing to
/usr/local/GridComputing/globus-4.0.1/BUILD
gpt-build ====> REMOVING empty package
globus_wsrf_gram_service_java-noflavor-dev
gpt-build ====> REMOVING empty package
globus_wsrf_gram_service_java-noflavor-rtl
[globus@gis14 globus-4.0.1]$ gpt-build --force
globus_scheduler_event_generator-1.0.tar.gz gcc32dbg
gpt-build ====> CHECKING BUILD DEPENDENCIES FOR
globus_scheduler_event_generator
gpt-build ====> Changing to
/usr/local/GridComputing/globus-4.0.1/BUILD/globus_scheduler_event_generator-1.0/
gpt-build ====> BUILDING FLAVOR gcc32dbg
gpt-build ====> Changing to
/usr/local/GridComputing/globus-4.0.1/BUILD
gpt-build ====> REMOVING empty package
globus_scheduler_event_generator-gcc32dbg-pgm_static
gpt-build ====> REMOVING empty package
globus_scheduler_event_generator-noflavor-data
gpt-build ====> REMOVING empty package
globus_scheduler_event_generator-noflavor-doc
[globus@gis14 globus-4.0.1]$ gpt-build --force
globus_gram_job_manager-7.7.tar.gz gcc32dbg
gpt-build ====> CHECKING BUILD DEPENDENCIES FOR
globus_gram_job_manager
gpt-build ====> Changing to
/usr/local/GridComputing/globus-4.0.1/BUILD/globus_gram_job_manager-7.7/
gpt-build ====> BUILDING FLAVOR gcc32dbg
gpt-build ====> Changing to
/usr/local/GridComputing/globus-4.0.1/BUILD
gpt-build ====> REMOVING empty package
globus_gram_job_manager-gcc32dbg-dev
gpt-build ====> REMOVING empty package
globus_gram_job_manager-gcc32dbg-pgm_static
gpt-build ====> REMOVING empty package
globus_gram_job_manager-gcc32dbg-rtl
[globus@gis14 globus-4.0.1]$ gpt-build --force
globus_gram_job_manager_setup_condor-2.9.tar.gz gcc32dbg
gpt-build ====> CHECKING BUILD DEPENDENCIES FOR
globus_gram_job_manager_setup_condor
gpt-build ====> Changing to
/usr/local/GridComputing/globus-4.0.1/BUILD/globus_gram_job_manager_setup_condor-2.9/
gpt-build ====> BUILDING globus_gram_job_manager_setup_condor
gpt-build ====> Changing to
/usr/local/GridComputing/globus-4.0.1/BUILD
gpt-build ====> REMOVING empty package
globus_gram_job_manager_setup_condor-noflavor-data
gpt-build ====> REMOVING empty package
globus_gram_job_manager_setup_condor-noflavor-dev
gpt-build ====> REMOVING empty package
globus_gram_job_manager_setup_condor-noflavor-doc
gpt-build ====> REMOVING empty package
globus_gram_job_manager_setup_condor-noflavor-pgm_static
gpt-build ====> REMOVING empty package
globus_gram_job_manager_setup_condor-noflavor-rtl





-----Original Message-----
From: Peter G Lane [mailto:lane@xxxxxxxxxxx]
Sent: Tuesday, October 11, 2005 3:34 PM
To: Natarajan, Senthil
Cc: discuss@xxxxxxxxxx
Subject: RE: [Globus-discuss] Problem in Job Submission to Condor
through GT4

I think the other ones are good.  The extension handler seems to be working
appropriately.  All you need is the
globus_gram_job_manager_setup_condor-2.9 package to interpret the values
that the extension handler is setting up for you.

Peter

On Tue, 2005-10-11 at 14:39 -0400, Natarajan, Senthil wrote:

Peter,
You mean to install only those 4 packages, right, with the --force option?
Thanks,
Senthil

-----Original Message-----
From: Peter G Lane [mailto:lane@xxxxxxxxxxx]
Sent: Tuesday, October 11, 2005 2:36 PM
To: Natarajan, Senthil
Cc: discuss@xxxxxxxxxx
Subject: RE: [Globus-discuss] Problem in Job Submission to Condor
through GT4

Yeah, as I expected, the code to put those directives in the Condor job
description is missing.  I also see your custom additions, so you
probably just need to force a rebuild.  Try gpt-build again with the
--force option, followed by the instructions I gave you last time for
redoing the post-install steps.  If that still doesn't work, capture
your session trying to build it and send it back to the list.  For some
reason that condor update package isn't being installed.
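
Concretely, that sequence looks roughly like this (the package file name and the gcc32dbg flavor are the ones used elsewhere in this thread and may differ on your system):

# Force-rebuild the condor setup package, then re-run its setup script
gpt-build --force globus_gram_job_manager_setup_condor-2.9.tar.gz gcc32dbg
cd $GLOBUS_LOCATION/setup/globus
./setup-globus-job-manager-condor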

Peter

On Tue, 2005-10-11 at 09:34 -0400, Natarajan, Senthil wrote:

Peter,
I have attached the condor.pm.
I just followed your instructions to install those 4 packages
(using gpt-build). Please let me know whether it has been properly
installed or what else needs to be done.
Thanks,
Senthil


-----Original Message-----
From: Peter G Lane [mailto:lane@xxxxxxxxxxx]
Sent: Monday, October 10, 2005 4:22 PM
To: Natarajan, Senthil
Cc: discuss@xxxxxxxxxx
Subject: RE: [Globus-discuss] Problem in Job Submission to Condor through GT4

On Mon, 2005-10-10 at 15:26 -0400, Natarajan, Senthil wrote:

Peter,
I ran ./setup-globus-scheduler-provider-condor.
Then I corrected the typo WhenToTransferFiles to WhenToTransferOutput.
Then I submitted the job, but it is still not copying the output file
back to the submitting machine. I guess that since Condor is running the
job in some temporary execute directory on some other machine, Globus
doesn't know about that machine and directory (some kind of communication
is missing between Condor and Globus), so it is not copying the file back.

When you tried it, did your job run on the machine you submitted from, or
on a different machine?


I only have a local, single-machine installation of Condor that I test
with.  All I do is make sure the correct directives are in the Condor
job description.  If I can verify that the directives are being appended
and it doesn't work, then it's a Condor issue.  I know the directives
are being written because Condor complains if I submit your job
description with the typo (i.e. it complains that it has
should_transfer_files but not WhenToTransferOutput; fixing the typo
prevents Condor from complaining).



Without running setup-globus-scheduler-provider-condor and without
changing WhenToTransferFiles to WhenToTransferOutput, I ran the job on
Friday, and it copied the file back as long as the job ran on the same
machine it was submitted from.

I was curious whether you tested on the same machine (job submission and
execution on the same machine), or whether you submitted the job on one
machine and it ran on a different machine?

Please let me know.
Here I am again attaching the logfile dump and the XML job description.

Thanks,
Senthil


Note: In order to run the condor job, I added these lines in
$GLOBUS_LOCATION/lib/perl/Globus/GRAM/JobManager/condor.pm (at line 280):

print SCRIPT_FILE "should_transfer_files = IF_NEEDED\n";
print SCRIPT_FILE "WhenToTransferOutput = ON_EXIT\n";
print SCRIPT_FILE "transfer_files = ONEXIT\n";


Like I said before, this is exactly what I add if the updated condor
package is installed properly.  Can you send me the following file:

$GLOBUS_LOCATION/lib/perl/Globus/GRAM/JobManager/condor.pm

I really don't think the updated condor.pm got installed.  Everything
else looks fine.  I'm just not seeing output from the condor.pm module
that indicates that it is putting these lines in the Condor job
description.  You should be seeing something like this:

Mon Oct 10 12:18:22 2005 JM_SCRIPT: Using jm supplied job
dir: /home/lane/.globus/3cf6d2c0-39ba-11da-99c3-9071bceca9ea
Mon Oct 10 12:18:22 2005 JM_SCRIPT: Adding "should_transfer_files = YES"
Mon Oct 10 12:18:22 2005 JM_SCRIPT: Adding "WhenToTransferOutput = ON_EXIT"
Mon Oct 10 12:18:22 2005 JM_SCRIPT: Adding "transfer_output_file = stdout.j3, stderr.j3"
Mon Oct 10 12:18:22 2005 JM_SCRIPT: About to submit condor job

That's straight out of my logfile after I installed the same package
that is on the web site and ran your corrected job description.

Peter



If I remove these lines, the jobs still do not run (i.e. the
HasFileTransfer attribute is not found in the requirements expression).




-----Original Message-----
From: Peter G Lane [mailto:lane@xxxxxxxxxxx]
Sent: Monday, October 10, 2005 2:21 PM
To: Natarajan, Senthil
Cc: discuss@xxxxxxxxxx
Subject: RE: [Globus-discuss] Problem in Job Submission to Condor through GT4

I think the problem is that I neglected to tell you to re-run
gpt-postinstall with the --force option.  You should also be able to cd
to $GLOBUS_LOCATION/setup/globus and run
'./setup-globus-scheduler-provider-condor'.  The latter is much quicker.

Also, WhenToTransferFiles should be WhenToTransferOutput.  Fixing this
typo in your job description and changing it for my system ran just fine
for me.

Peter

On Mon, 2005-10-10 at 13:08 -0400, Natarajan, Senthil wrote:

Peter,
I have attached the logfile dump and my XML job description.
It is still not copying the output file back to the submitting machine if
the job runs on a different machine. But if the job runs on the same
machine, it copies the output file.
Please let me know.

Sorry I couldn't get back to you. I was out for more than a couple of
days to attend the Grid Conference in Boston; I did meet your colleagues.
Thanks,
Senthil



-----Original Message-----
From: Peter G Lane [mailto:lane@xxxxxxxxxxx]
Sent: Thursday, September 29, 2005 3:53 PM
To: Natarajan, Senthil
Cc: discuss@xxxxxxxxxx
Subject: RE: [Globus-discuss] Problem in Job Submission to Condor through GT4

On Thu, 2005-09-29 at 14:26 -0400, Natarajan, Senthil wrote:

Peter,
I changed the $GLOBUS_LOCATION/lib/perl/Globus/GRAM/ExtensionsHandler.pm
file according to your suggestion and I didn't get that parser error. I
submitted the job; the job executed successfully but it didn't copy the
output file back to the submitting machine (i.e. our main problem, which
we are still working to solve).

I am pretty sure that whatever we are specifying under the extensions
tag, globus is not recognizing it or is not doing anything with it.

Because....

i) Initially I had a problem executing the job itself: as soon as I
submitted the job to globus, globus in turn submitted the job to the
condor pool, and the jobs stayed idle and did not run because of the job
requirements:
Requirements = (OpSys == "LINUX" && Arch == "INTEL") && (Disk >=
DiskUsage) && ((Memory * 1024) >= ImageSize) &&
(TARGET.FileSystemDomain == MY.FileSystemDomain)
The problem here is that there is no HasFileTransfer option in the
requirements expression. In order to fix this I added the following two
lines in the file
$GLOBUS_LOCATION/lib/perl/Globus/GRAM/JobManager/condor.pm
print SCRIPT_FILE "should_transfer_files = IF_NEEDED\n";
print SCRIPT_FILE "WhenToTransferOutput = ON_EXIT\n";
This fixed the job execution problem, but not copying the file back to
the submitting machine.


This is exactly what I added to the Condor job description based on the
extensions.  What you can do is add an extension element before all the
other extension elements to write a log file which dumps some useful
debugging info:

<extensions>
    <logfile>/home/me/logfile</logfile>
    . . .
</extensions>

If the packages are installed properly, you should see lines like the
following in this log file:

Wed Sep 28 13:52:53 2005 JM_SCRIPT: Adding "should_transfer_files = ALWAYS"
Wed Sep 28 13:52:53 2005 JM_SCRIPT: Adding "WhenToTransferOutput = ON_EXIT"

Also, I don't see you specifying any transfer_input_file or
transfer_output_file extension elements.  You need those to specify
which files to transfer to or from the compute node.  GRAM will not set
those up for you automatically.
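
As an illustration of what that extensions block might then look like, here is a sketch pieced together from the elements that appear in the job description quoted later in this thread (the values and ${GLOBUS_USER_HOME} paths are just the ones from that example, not a recommendation):

<extensions>
    <condor:should_transfer_files>YES</condor:should_transfer_files>
    <condor:WhenToTransferOutput>ON_EXIT</condor:WhenToTransferOutput>
    <condor:transfer_output_file>${GLOBUS_USER_HOME}/stdout.j3</condor:transfer_output_file>
    <condor:transfer_output_file>${GLOBUS_USER_HOME}/stderr.j3</condor:transfer_output_file>
</extensions>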

Peter



ii) I submitted the problem; from here we are working to solve this
problem.

iii) Now I have commented out the above two lines which I added in
condor.pm.

I submitted the XML job description that has the extensions tag with the
options condor:should_transfer_files = YES and
condor:WhenToTransferOutput = ON_EXIT. Now I am back to stage one, that
is, the jobs are idle and not running because of the requirements
problem: the requirements expression doesn't have the file transfer
option even though we specified it in the job description.

In conclusion, something is overriding the extensions tag, or the
scripts like condor.pm are not understanding the extensions part.

So after updating all these 4 packages, we are having the same problem.


Please help me with this. I am at the University of Pittsburgh; you can
reach me at 412-624-6578.

Thanks,
Senthil



-----Original Message-----
From: Peter G Lane [mailto:lane@xxxxxxxxxxx]
Sent: Wednesday, September 28, 2005 2:48 PM
To: Natarajan, Senthil
Cc: discuss@xxxxxxxxxx
Subject: RE: [Globus-discuss] Problem in Job Submission to Condor through GT4

On Wed, 2005-09-28 at 14:04 -0400, Natarajan, Senthil wrote:

Peter,
Thanks a lot, and I appreciate your help.
I successfully installed those packages.


Excellent!


I am trying to execute the XML job description with the new extension
tags (condor:should_transfer_files, condor:transfer_output_file, etc.).
Now the globus parser is not recognizing these new tags; it is giving a
parser error.

Looks like we need to include the namespace where these new tags are
defined. Did you include any additional namespace while testing?


Interesting.  My parser seems to be more lenient.  This should be a
simple matter to fix, though (famous last words, I know).  If you edit
the $GLOBUS_LOCATION/lib/perl/Globus/GRAM/ExtensionsHandler.pm file and
search for "new XML::Parser", you should be able to add a hash parameter
to turn off namespace processing.  Change the following:

my $extParser = new XML::Parser(Handlers => {
            Start       => sub { $self->StartTag(@_); },
            End         => sub { $self->EndTag(@_); },
            Char        => sub { $self->Char(@_); } });

to this:

my $extParser = new XML::Parser(
    Namespaces => 0,
    Handlers => {
            Start       => sub { $self->StartTag(@_); },
            End         => sub { $self->EndTag(@_); },
            Char        => sub { $self->Char(@_); } });

Either that or just put a bogus xmlns attribute in the extensions start
tag:

<extensions xmlns:condor="bogus_condor_namespace">
    <condor:should_transfer_files>...
    . . .
</extensions>

Let me know how it goes.

Peter



Here I am pasting the error and the XML job description.

[senthil@gis14 GridTest]$ globusrun-ws -submit -F hostname -f
multi_condor_test1.xml
globusrun-ws: Error loading rsl
globus_soap_message_module: Deserialization of
{http://www.w3.org/2001/XMLSchema}string failed.
globus_soap_message_module: XML parser failed to get the next node in
the message
globus_soap_message_module: Parser Failed Namespace prefix condor on
should_transfer_files is not defined




<?xml version="1.0" encoding="UTF-8"?>
<multiJob
    xmlns:gram="http://www.globus.org/namespaces/2004/10/gram/job"
    xmlns:wsa="http://schemas.xmlsoap.org/ws/2004/03/addressing">

    <factoryEndpoint>
        <wsa:Address>
            https://hostname:8443/wsrf/services/ManagedJobFactoryService
        </wsa:Address>
        <wsa:ReferenceProperties>
            <gram:ResourceID>Multi</gram:ResourceID>
        </wsa:ReferenceProperties>
    </factoryEndpoint>
    <directory>${GLOBUS_LOCATION}</directory>
    <count>1</count>

    <job>
        <factoryEndpoint>
            <wsa:Address>
                https://hostname:8443/wsrf/services/ManagedJobFactoryService
            </wsa:Address>
            <wsa:ReferenceProperties>
                <gram:ResourceID>Condor</gram:ResourceID>
            </wsa:ReferenceProperties>
        </factoryEndpoint>
        <executable>/home/senthil/GridTest/sh_loop</executable>
        <argument>60</argument>
        <stdout>${GLOBUS_USER_HOME}/stdout.j3</stdout>
        <stderr>${GLOBUS_USER_HOME}/stderr.j3</stderr>
        <count>1</count>
        <extensions>
            <condor:should_transfer_files>IF_NEEDED</condor:should_transfer_files>
            <condor:WhenToTransferFiles>ON_EXIT</condor:WhenToTransferFiles>
            <condor:transfer_output_file>${GLOBUS_USER_HOME}/stdout.j3</condor:transfer_output_file>
            <condor:transfer_output_file>${GLOBUS_USER_HOME}/stderr.j3</condor:transfer_output_file>
        </extensions>
    </job>

    <job>
        <factoryEndpoint>
            <wsa:Address>
                https://hostname:8443/wsrf/services/ManagedJobFactoryService
            </wsa:Address>
            <wsa:ReferenceProperties>
                <gram:ResourceID>Condor</gram:ResourceID>
            </wsa:ReferenceProperties>
        </factoryEndpoint>
        <executable>/home/senthil/GridTest/sh_loop</executable>
        <argument>120</argument>
        <stdout>${GLOBUS_USER_HOME}/stdout.j4</stdout>
        <stderr>${GLOBUS_USER_HOME}/stderr.j4</stderr>
        <count>1</count>
    </job>

</multiJob>








-----Original Message-----
From: Peter G Lane [mailto:lane@xxxxxxxxxxx]
Sent: Wednesday, September 28, 2005 12:05 PM
To: Natarajan, Senthil
Cc: discuss@xxxxxxxxxx
Subject: RE: [Globus-discuss] Problem in Job Submission to Condor through GT4

On Wed, 2005-09-28 at 11:36 -0400, Natarajan, Senthil wrote:

Peter,
I tried with different combinations; the package
globus_wsrf_gram_service_java-0.78.0 went through.

But the other packages are still giving a problem; the problem is that
it couldn't find the file Doxyfile.in under the doxygen directory.
(Physically, I couldn't find the file in that directory either.) Could
you please help me with this? I am kind of stuck.


Yeah, this is my fault.  I wasn't creating source packages correctly.  I
talked to the GPT gurus and created a new set of source packages that
should work.  I refreshed the downloads page so you should see the
section in question dated September 28.  Try those packages.  Again,
sorry for the inconvenience.  It's a learning process for me too.  :)


Peter


Thanks,
Senthil

Here is the error msg.

[globus@gis14 globus-4.0.1]$ gpt-build
globus_scheduler_event_generator-1.0.1-src.tar.gz gcc32dbg
gpt-build ====> CHECKING BUILD DEPENDENCIES FOR
globus_scheduler_event_generator
gpt-build ====> Changing to
/usr/local/GridComputing/globus-4.0.1/BUILD/globus_scheduler_event_generator-1.0.1/
gpt-build ====> BUILDING FLAVOR gcc32dbg
GLOBUS_LOCATION=/usr/local/GridComputing/globus-4.0.1; export
GLOBUS_LOCATION;  CPP='/usr/bin/gcc -E'; export CPP; CPPFLAGS='
-I/usr/local/GridComputing/globus-4.0.1/include
-I/usr/local/GridComputing/globus-4.0.1/include/gcc32dbg'; export
CPPFLAGS; CFLAGS='-g   -Wall'; export CFLAGS; LDFLAGS='
-L/usr/local/GridComputing/globus-4.0.1/lib'; export LDFLAGS;
CXX='/usr/bin/g++'; export CXX; CXXCPP='/usr/bin/g++ -E'; export CXXCPP;
CXXFLAGS='-g  '; export CXXFLAGS; F77='/usr/bin/g77'; export F77;
AR='/usr/bin/ar'; export AR; ARFLAGS='ruv'; export ARFLAGS;
RANLIB='/usr/bin/ranlib'; export RANLIB; NM='/usr/bin/nm -B'; export NM;
CC='/usr/bin/gcc'; export CC;
/usr/local/GridComputing/globus-4.0.1/BUILD/globus_scheduler_event_generator-1.0.1//configure  --with-flavor=gcc32dbg
checking whether to enable maintainer-specific portions of Makefiles... no
Dependencies Complete
checking for a BSD-compatible install... /usr/bin/install -c
checking whether build environment is sane... yes
checking for gawk... gawk
checking whether make sets $(MAKE)... yes
configure: creating ./config.status
config.status: creating Makefile
config.status: creating pkgdata/Makefile
config.status: creating pkgdata/pkg_data_src.gpt
config.status: creating doxygen/Makefile
config.status: creating doxygen/Doxyfile
config.status: error: cannot find input file: doxygen/Doxyfile.in

[globus@gis14 globus-4.0.1]$ gpt-build
globus_gram_job_manager-7.6.2-src.tar.gz gcc32dbg
gpt-build ====> CHECKING BUILD DEPENDENCIES FOR globus_gram_job_manager
ERROR: The following packages are missing
Package globus_gram_job_manager-ANY-src is missing
pgm_link-globus_scheduler_event_generator-ANY-dev
Package globus_gram_job_manager-ANY-src is missing
compile-globus_scheduler_event_generator-ANY-dev

Died at /usr/local/GridComputing/software/GPT/sbin/gpt-build line 362,
<FILE> line 104566.

[globus@gis14 globus-4.0.1]$ gpt-build
globus_gram_job_manager_setup_condor-2.8.1-src.tar.gz gcc32dbg
gpt-build ====> CHECKING BUILD DEPENDENCIES FOR
globus_gram_job_manager_setup_condor
gpt-build ====> Changing to
/usr/local/GridComputing/globus-4.0.1/BUILD/globus_gram_job_manager_setup_condor-2.8.1/
gpt-build ====> BUILDING globus_gram_job_manager_setup_condor
GLOBUS_LOCATION=/usr/local/GridComputing/globus-4.0.1; export GLOBUS_LOCATION;
/usr/local/GridComputing/globus-4.0.1/BUILD/globus_gram_job_manager_setup_condor-2.8.1//configure  --with-flavor=
checking whether to enable maintainer-specific portions of Makefiles... no
Warning: package doesn't build with flavors  ignored
Warning:  ignored
Dependencies Complete
checking for a BSD-compatible install... /usr/bin/install -c
checking whether build environment is sane... yes
checking for gawk... gawk
checking whether make sets $(MAKE)... yes
configure: creating ./config.status
config.status: creating Makefile
config.status: creating pkgdata/Makefile
config.status: creating pkgdata/pkg_data_src.gpt
config.status: creating doxygen/Doxyfile
config.status: error: cannot find input file: doxygen/Doxyfile.in






-----Original Message-----
From: Peter G Lane [mailto:lane@xxxxxxxxxxx]
Sent: Tuesday, September 27, 2005 6:38 PM
To: Natarajan, Senthil
Cc: discuss@xxxxxxxxxx
Subject: RE: [Globus-discuss] Problem in Job Submission to Condor through GT4

I believe I fixed the globus_wsrf_gram_service_java-0.78.0 package
(replaced the package).  I'm still trying to work with the GPT experts
to figure out why the scheduler_event_generator package isn't
relocatable.

Peter

On Tue, 2005-09-27 at 16:01 -0400, Natarajan, Senthil wrote:

Peter,
I am trying to install the packages in the following order:
globus_wsrf_gram_service_java-0.78.0
globus_scheduler_event_generator-1.0.1
before going for the main packages. But I am getting the following
error. Please let me know whether I am doing this correctly.

[globus@gis14 globus-4.0.1]$ gpt-build
globus_wsrf_gram_service_java-0.78.0-src.tar.gz gcc32dbg
gpt-build ====> CHECKING BUILD DEPENDENCIES FOR
globus_wsrf_gram_service_java
gpt-build ====> Changing to
/usr/local/GridComputing/globus-4.0.1/BUILD/globus_wsrf_gram_service_java-0.78.0/
gpt-build ====> BUILDING globus_wsrf_gram_service_java
 ant -Denv.GLOBUS_LOCATION=/usr/local/GridComputing/globus-4.0.1
Buildfile: build.xml
  [taskdef] Could not load definitions from resource clovertasks. It
could not be found.

initClover:

init:
    [mkdir] Created dir:
/usr/local/GridComputing/globus-4.0.1/BUILD/globus_wsrf_gram_service_java-0.78.0/build
    [mkdir] Created dir:
/usr/local/GridComputing/globus-4.0.1/BUILD/globus_wsrf_gram_service_java-0.78.0/build/classes
    [mkdir] Created dir:
/usr/local/GridComputing/globus-4.0.1/BUILD/globus_wsrf_gram_service_java-0.78.0/build/lib

compile:
    [javac] Compiling 36 source files to
/usr/local/GridComputing/globus-4.0.1/BUILD/globus_wsrf_gram_service_java-0.78.0/build/classes
    [javac] /usr/local/GridComputing/globus-4.0.1/BUILD/globus_wsrf_gram_service_java-0.78.0/src/org/globus/exec/service/job/ManagedJobResourceImpl.java:259: warning: non-varargs call of varargs method with inexact argument type for last parameter;
    [javac] cast to java.lang.Object for a varargs call
    [javac] cast to java.lang.Object[] for a non-varargs call and to suppress this warning
    [javac]                 value = readMethod.invoke(resourceData, null);
    [javac]                                           ^
    [javac] /usr/local/GridComputing/globus-4.0.1/BUILD/globus_wsrf_gram_service_java-0.78.0/src/org/globus/exec/service/multi/ManagedMultiJobResource.java:385: warning: non-varargs call of varargs method with inexact argument type for last parameter;
    [javac] cast to java.lang.Object for a varargs call
    [javac] cast to java.lang.Object[] for a non-varargs call and to suppress this warning
    [javac]                 subJobDescriptions[index], null);
    [javac]                                            ^
    [javac] /usr/local/GridComputing/globus-4.0.1/BUILD/globus_wsrf_gram_service_java-0.78.0/src/org/globus/exec/service/multi/ManagedMultiJobResource.java:697: cannot find symbol
    [javac] symbol  : variable SUBJOB_UNSUBSCRIBE_ERROR
    [javac] location: class org.globus.exec.utils.Resources
    [javac]                 Resources.SUBJOB_UNSUBSCRIBE_ERROR);
    [javac]                                  ^
    [javac] /usr/local/GridComputing/globus-4.0.1/BUILD/globus_wsrf_gram_service_java-0.78.0/src/org/globus/exec/service/multi/ManagedMultiJobResource.java:857: cannot find symbol
    [javac] symbol  : variable SUBJOB_STUB_SECURITY_ERROR
    [javac] location: class org.globus.exec.utils.Resources
    [javac]                 Resources.SUBJOB_STUB_SECURITY_ERROR);
    [javac]                                          ^
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
    [javac] 2 errors
    [javac] 2 warnings

BUILD FAILED
/usr/local/GridComputing/globus-4.0.1/BUILD/globus_wsrf_gram_service_java-0.78.0/build.xml:39: Compile failed; see the compiler error output for details.

Total time: 15 seconds

ERROR: Build has failed


[globus@gis14 globus-4.0.1]$ gpt-build
globus_scheduler_event_generator-1.0.1-src.tar.gz gcc32dbg
gpt-build ====> CHECKING BUILD DEPENDENCIES FOR
globus_scheduler_event_generator
gpt-build ====> Changing to
/usr/local/GridComputing/globus-4.0.1/BUILD/globus_scheduler_event_generator-1.0.1/
gpt-build ====> BUILDING FLAVOR gcc32dbg
GLOBUS_LOCATION=/usr/local/GridComputing/globus-4.0.1; export
GLOBUS_LOCATION;  CPP='/usr/bin/gcc -E'; export CPP; CPPFLAGS='
-I/usr/local/GridComputing/globus-4.0.1/include
-I/usr/local/GridComputing/globus-4.0.1/include/gcc32dbg'; export
CPPFLAGS; CFLAGS='-g   -Wall'; export CFLAGS; LDFLAGS='
-L/usr/local/GridComputing/globus-4.0.1/lib'; export LDFLAGS;
CXX='/usr/bin/g++'; export CXX; CXXCPP='/usr/bin/g++ -E'; export CXXCPP;
CXXFLAGS='-g  '; export CXXFLAGS; F77='/usr/bin/g77'; export F77;
AR='/usr/bin/ar'; export AR; ARFLAGS='ruv'; export ARFLAGS;
RANLIB='/usr/bin/ranlib'; export RANLIB; NM='/usr/bin/nm -B'; export NM;
CC='/usr/bin/gcc'; export CC;
/usr/local/GridComputing/globus-4.0.1/BUILD/globus_scheduler_event_generator-1.0.1//configure  --with-flavor=gcc32dbg
checking whether to enable maintainer-specific portions of Makefiles... no
Dependencies Complete
checking for a BSD-compatible install... /usr/bin/install -c
checking whether build environment is sane... yes
checking for gawk... gawk
checking whether make sets $(MAKE)... yes
configure: creating ./config.status
config.status: creating Makefile
config.status: creating pkgdata/Makefile
config.status: creating pkgdata/pkg_data_src.gpt
config.status: creating doxygen/Makefile
config.status: creating doxygen/Doxyfile
config.status: error: cannot find input file: doxygen/Doxyfile.in

ERROR: Build has failed






-----Original Message-----
From: Peter G Lane [mailto:lane@xxxxxxxxxxx]
Sent: Tuesday, September 27, 2005 3:23 PM
To: Natarajan, Senthil
Cc: discuss@xxxxxxxxxx
Subject: RE: [Globus-discuss] Problem in Job Submission to Condor through GT4

Hmm, I just downloaded the condor setup package I have up there now to
check it myself and installed it without a problem.  Try downloading
fresh copies of all the packages you need.  Just for sanity, here's the
MD5 checksum:

a37800228817290ccea05399ac986062  globus_gram_job_manager_setup_condor-2.8.1-src.tar.gz

Peter

On Tue, 2005-09-27 at 13:50 -0400, Natarajan, Senthil wrote:

Peter,
I downloaded your new package (after the bootstrap); when I tried to
install it, it gave me the following error.

[globus@gis14 globus-4.0.1]$ gpt-build
globus_gram_job_manager_setup_condor-2.8.1-src.tar.gz gcc32dbg
gpt-build ====> CHECKING BUILD DEPENDENCIES FOR
globus_gram_job_manager_setup_condor
gpt-build ====> Changing to
/usr/local/GridComputing/globus-4.0.1/BUILD/globus_gram_job_manager_setup_condor-2.8.1/
gpt-build ====> BUILDING globus_gram_job_manager_setup_condor
GLOBUS_LOCATION=/usr/local/GridComputing/globus-4.0.1; export GLOBUS_LOCATION;
/usr/local/GridComputing/globus-4.0.1/BUILD/globus_gram_job_manager_setup_condor-2.8.1//configure  --with-flavor=
checking whether to enable maintainer-specific portions of Makefiles... no
Warning: package doesn't build with flavors  ignored
Warning:  ignored
Dependencies Complete
checking for a BSD-compatible install... /usr/bin/install -c
checking whether build environment is sane... yes
checking for gawk... gawk
checking whether make sets $(MAKE)... yes
configure: creating ./config.status
config.status: creating Makefile
config.status: creating pkgdata/Makefile
config.status: creating pkgdata/pkg_data_src.gpt
config.status: creating doxygen/Doxyfile
config.status: error: cannot find input file: doxygen/Doxyfile.in



I thought of doing ./bootstrap in the source directory; it gave the
following error.

[globus@gis14 globus_gram_job_manager_setup_condor-2.8.1]$ ./bootstrap
installing globus_automake_pre link
installing globus_automake_post link
installing globus_automake_pre_top link
installing globus_automake_post_top link
running aclocal  -I
/usr/local/GridComputing/globus-4.0.1/share/globus_aclocal -I
/usr/local/GridComputing/software/GPT/share/gpt/aclocal
running libtoolize --copy --force
running automake --copy -add-missing --foreign
configure.in: installing `./mkinstalldirs'
running gpt_create_automake_rules --excludes=doxygen
running autoconf
Can't locate object method "path" via package "Autom4te::Request" at
/usr/bin/autom4te line 81.

ERROR: bootstrap failed!


-----Original Message-----
From: Peter G Lane [mailto:lane@xxxxxxxxxxx]
Sent: Tuesday, September 27, 2005 1:18 PM
To: Natarajan, Senthil
Cc: discuss@xxxxxxxxxx
Subject: RE: [Globus-discuss] Problem in Job Submission to Condor through GT4

This one's my fault.  I forgot to bootstrap the source package before I
put it up on the web site.  Either download the package again (I
replaced the package file) or untar the package and execute ./bootstrap
in the source package directory.  You can then type gpt-build <flavor>
in the package directory.
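
Put together as a sketch (the tar file and directory names are the ones from this thread; the steps just restate the instructions above):

# Unpack the refreshed source package, bootstrap it, then build in place
tar xzf globus_gram_job_manager_setup_condor-2.8.1-src.tar.gz
cd globus_gram_job_manager_setup_condor-2.8.1
./bootstrap
gpt-build gcc32dbg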

I'm looking into the other package version problem.  The major version
number for the scheduler_event_generator package in the trunk got bumped
and I'm not sure why.  It might be as simple as lowering the major
version requirement in the job_manager package dependency, but I want to
make sure.

Peter

On Tue, 2005-09-27 at 11:08 -0400, Natarajan, Senthil wrote:

Peter,
I tried to install the package
globus_gram_job_manager_setup_condor-2.8.1; it also gave the following
error.

[globus@gis14 globus-4.0.1]$ gpt-build
globus_gram_job_manager_setup_condor-2.8.1-src.tar.gz gcc32dbg
gpt-build ====> CHECKING BUILD DEPENDENCIES FOR
globus_gram_job_manager_setup_condor
gpt-build ====> Changing to
/usr/local/GridComputing/globus-4.0.1/BUILD/globus_gram_job_manager_setup_condor-2.8.1/
gpt-build ====> BUILDING globus_gram_job_manager_setup_condor
GLOBUS_LOCATION=/usr/local/GridComputing/globus-4.0.1; export GLOBUS_LOCATION;
/usr/local/GridComputing/globus-4.0.1/BUILD/globus_gram_job_manager_setup_condor-2.8.1//configure  --with-flavor=
sh: line 1:
/usr/local/GridComputing/globus-4.0.1/BUILD/globus_gram_job_manager_setup_condor-2.8.1//configure: No such file or directory

ERROR: Build has failed

Please let me know whether the package is missing something or I am not
updating it correctly.
Thanks,
Senthil


-----Original Message-----
From: Natarajan, Senthil
Sent: Tuesday, September 27, 2005 10:22 AM
To: 'Peter G Lane'
Cc: discuss@xxxxxxxxxx
Subject: RE: [Globus-discuss] Problem in Job Submission to Condor through GT4

Peter,
As you mentioned, here I am trying to build the package from Sep 23rd
(globus_gram_job_manager-7.6.2) first and then the package
globus_gram_job_manager_setup_condor-2.8.1.

But I am getting this error in the Sep 23rd package itself; am I missing
something? I have this gpt (gpt-3.2autotools2004) and Globus 4.0.1
version.

[globus@gis14 globus-4.0.1]$ gpt-build globus_gram_job_manager-7.6.2-src.tar.gz gcc32dbg
gpt-build ====> Changing to
/usr/local/GridComputing/globus-4.0.1/BUILD/globus_core-4.26/
gpt-build ====> BUILDING FLAVOR gcc32dbg
gpt-build ====> Changing to
/usr/local/GridComputing/globus-4.0.1/BUILD
gpt-build ====> REMOVING empty package globus_core-gcc32dbg-pgm_static
gpt-build ====> REMOVING empty package globus_core-noflavor-doc
gpt-build ====> CHECKING BUILD DEPENDENCIES FOR globus_gram_job_manager
ERROR: The following packages are missing
Package pgm_link-globus_scheduler_event_generator-ANY-dev version 0.3 is
incompatible with globus_gram_job_manager-ANY-src
Package compile-globus_scheduler_event_generator-ANY-dev version 0.3 is
incompatible with globus_gram_job_manager-ANY-src

Died at /usr/local/GridComputing/software/GPT/sbin/gpt-build line 362,
<FILE> line 105362.




-----Original Message-----
From: Peter G Lane [mailto:lane@xxxxxxxxxxx]
Sent: Monday, September 26, 2005 5:54 PM
To: Natarajan, Senthil
Cc: discuss@xxxxxxxxxx
Subject: RE: [Globus-discuss] Problem in Job Submission to Condor through GT4

On Mon, 2005-09-26 at 15:32 -0400, Natarajan, Senthil wrote:

Hi Peter,
I am seeing the extension for "transfer_input_file"; similarly, does it
also have a "transfer_output_file" extension, in order to transfer the
output back?


No, but I just put up a new update package
(globus_gram_job_manager_setup_condor-2.8.1) that has support for that
one.



To install the update package, do I need to compile using the Makefile
and install, and will this update only the condor part?


All you should have to do is the following:

gpt-build <package file name> <flavor>

For example:

gpt-build globus_gram_job_manager_setup_condor-2.8.1-src.tar.gz gcc32dbg


Note that on the download page it says that it requires one of the
packages from September 23rd (globus_gram_job_manager-7.6.2).  Make sure
to get that one too.

Peter



Actually, I did add these two lines

print SCRIPT_FILE "should_transfer_files = IF_NEEDED\n";
print SCRIPT_FILE "WhenToTransferOutput = ON_EXIT\n\n";

in this file:
./lib/perl/Globus/GRAM/JobManager/condor.pm
in order to get the job to execute in the condor pool (previously I had a
problem executing the job).
Please let me know.
Thanks,
Senthil




-----Original Message-----
From: Peter G Lane [mailto:lane@xxxxxxxxxxx]
Sent: Monday, September 26, 2005 2:46 PM
To: Natarajan, Senthil
Cc: discuss@xxxxxxxxxx
Subject: Re: [Globus-discuss] Problem in Job Submission to Condor through GT4

The Condor adapter doesn't automatically do this.  We've added support
in update packages for extensions to the job description that will allow
you to set these parameters.  See the Bugzilla entry about this here:

http://bugzilla.globus.org/bugzilla/show_bug.cgi?id=3773


Peter

On Mon, 2005-09-26 at 13:12 -0400, Natarajan, Senthil wrote:

Hi,

I am trying to submit an XML job description to Globus, and Globus in
turn submits the job to Condor.

Condor executes the job but it is not copying back the output file.

But if I submit the job directly to the condor pool, it copies the
output file back to the specified directory. Could anyone please help
me with this: how do I get the output file back in the specified
directory?

Thanks,

Senthil



Here is the XML Job description.



<?xml version="1.0" encoding="UTF-8"?>
<multiJob
    xmlns:gram="http://www.globus.org/namespaces/2004/10/gram/job"
    xmlns:wsa="http://schemas.xmlsoap.org/ws/2004/03/addressing">

    <factoryEndpoint>
        <wsa:Address>
            https://gis14.exp.sis.pitt.edu:8443/wsrf/services/ManagedJobFactoryService
        </wsa:Address>
        <wsa:ReferenceProperties>
            <gram:ResourceID>Multi</gram:ResourceID>
        </wsa:ReferenceProperties>
    </factoryEndpoint>
    <directory>${GLOBUS_LOCATION}</directory>
    <count>1</count>

    <job>
        <factoryEndpoint>
            <wsa:Address>
                https://hostname:8443/wsrf/services/ManagedJobFactoryService
            </wsa:Address>
            <wsa:ReferenceProperties>
                <gram:ResourceID>Condor</gram:ResourceID>
            </wsa:ReferenceProperties>
        </factoryEndpoint>
        <executable>/home/senthil/GridTest/sh_loop</executable>
        <argument>60</argument>
        <stdout>${GLOBUS_USER_HOME}/stdout.j3</stdout>
        <stderr>${GLOBUS_USER_HOME}/stderr.j3</stderr>
        <count>1</count>
    </job>

    <job>
        <factoryEndpoint>
            <wsa:Address>
                https://hostname:8443/wsrf/services/ManagedJobFactoryService
            </wsa:Address>
            <wsa:ReferenceProperties>
                <gram:ResourceID>Condor</gram:ResourceID>
            </wsa:ReferenceProperties>
        </factoryEndpoint>
        <executable>/home/senthil/GridTest/sh_loop</executable>
        <argument>120</argument>
        <stdout>${GLOBUS_USER_HOME}/stdout.j4</stdout>
        <stderr>${GLOBUS_USER_HOME}/stderr.j4</stderr>
        <count>1</count>
    </job>

</multiJob>






- To Unsubscribe: send mail to majordomo@xxxxxxxxxx with "unsubscribe discuss" in the body of the message

















<condor.pm>
<sh_loop.txt>
<job.xml>
<logfile.txt>
<condor_job.txt>
<condor_job.out.txt>