
[HTCondor-users] HTCondor claims more slots than necessary



Dear all,

I submitted a job that contains only five tasks, but HTCondor claims six slots to run it. After CLAIM_WORKLIFE expires, the five slots that were actually used are released, but the sixth stays claimed and is only released after UNUSED_CLAIM_TIMEOUT. Does anyone have an idea why HTCondor claims more slots than necessary? Logs are attached below.
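
For reference, the two claim-lifetime knobs I mention are the standard startd settings; a minimal sketch with illustrative values (not my actual configuration) would be:

  # condor_config on the execute nodes -- values here are illustrative only
  # a claim stops accepting additional jobs after this many seconds
  CLAIM_WORKLIFE = 1200
  # a claimed slot with no job running is released after this many seconds
  UNUSED_CLAIM_TIMEOUT = 600

While the extra claim is outstanding, it shows up as a Claimed/Idle slot, which can be seen with something like:

  condor_status -constraint 'State == "Claimed" && Activity == "Idle"' -af:h Name State Activity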

Thanks in advance,
Dmitry.
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-2nx4x/htcondorexecute] 2021-06-15T16:58:51.797399562Z condor_startd[123]: slot1_1: Changing state: Unclaimed -> Delete
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-2nx4x/htcondorexecute] 2021-06-15T16:58:51.797871979Z condor_startd[123]: slot1_1: Resource no longer needed, deleting
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-2nx4x/htcondorexecute] 2021-06-15T16:58:51.799848959Z condor_procd[121]: PROC_FAMILY_UNREGISTER_FAMILY
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-2nx4x/htcondorexecute] 2021-06-15T16:58:51.799855451Z condor_procd[121]: unregistering family with root pid 193
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-2nx4x/htcondorexecute] 2021-06-15T16:58:51.799858582Z condor_startd[123]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-2nx4x/htcondorexecute] 2021-06-15T16:58:51.799862038Z condor_startd[123]: Can't read ClaimId
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-2nx4x/htcondorexecute] 2021-06-15T16:58:51.799865782Z condor_startd[123]: Error: problem finding resource for 403 (DEACTIVATE_CLAIM)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-2nx4x/htcondorexecute] 2021-06-15T16:58:55.799843234Z condor_procd[121]: taking a snapshot...
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-2nx4x/htcondorexecute] 2021-06-15T16:58:55.799862916Z condor_procd[121]: ProcAPI: read 9 pid entries out of 72 total entries in /proc
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-2nx4x/htcondorexecute] 2021-06-15T16:58:55.800320134Z condor_procd[121]: ...snapshot complete
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:58:58.687272523Z condor_negotiator[51]: ---------- Finished Negotiation Cycle ----------
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:58:55.259024903Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:58:59.416995721Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:58:59.418000528Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:58:59.419726001Z condor_collector[297]: (Sending 3 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:58:59.419894716Z condor_collector[50]: QueryWorker: forked new worker with id 297 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:58:59.420657228Z condor_collector[297]: Query info: matched=3; skipped=0; query_time=0.000843; send_time=0.000748; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:40827>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:58:59.437836213Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:58:59.438401494Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:58:55.259894261Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:58:59.438420527Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:58:59.438510693Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000055; send_time=0.000077; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:40365>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:58:56.281586668Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:58:56.282135823Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:58:57.332387939Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:58:57.333211462Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:58:58.388669663Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:58:58.390099897Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:58:59.440614962Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:58:59.441294923Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:58:49.843050044Z condor_startd[122]: slot1_1: Resource no longer needed, deleting
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:58:49.843053197Z condor_procd[120]: PROC_FAMILY_UNREGISTER_FAMILY
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:58:49.843056262Z condor_procd[120]: unregistering family with root pid 193
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:58:51.788330441Z condor_startd[122]: Error: can't find resource with ClaimId (<10.42.0.117:38967?addrs=10.42.0.117-38967&alias=pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2.pseven-htcondor&noUDP&sock=startd_86_e1a3>#1623776068#3#...) -- perhaps this claim was already removed?
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:58:51.788356043Z condor_startd[122]: Error: problem finding resource for 404 (DEACTIVATE_CLAIM_FORCIBLY)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:58:51.791377105Z condor_startd[122]: Error: can't find resource with ClaimId (<10.42.0.117:38967?addrs=10.42.0.117-38967&alias=pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2.pseven-htcondor&noUDP&sock=startd_86_e1a3>#1623776068#1#...) -- perhaps this claim was already removed?
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:58:51.791401312Z condor_startd[122]: Error: problem finding resource for 404 (DEACTIVATE_CLAIM_FORCIBLY)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:58:55.847431479Z condor_procd[120]: taking a snapshot...
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:58:55.847454623Z condor_procd[120]: ProcAPI: read 8 pid entries out of 71 total entries in /proc
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:58:55.847756480Z condor_procd[120]: ...snapshot complete
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:58:49.843407458Z condor_startd[122]: slot1_2: Resource no longer needed, deleting
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:58:49.843723974Z condor_procd[120]: PROC_FAMILY_UNREGISTER_FAMILY
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:58:49.843756707Z condor_procd[120]: unregistering family with root pid 192
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:58:51.789136149Z condor_startd[122]: Error: can't find resource with ClaimId (<10.42.0.119:43207?addrs=10.42.0.119-43207&alias=pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2.pseven-htcondor&noUDP&sock=startd_86_e1a3>#1623776069#3#...) -- perhaps this claim was already removed?
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:58:51.789543005Z condor_startd[122]: Error: problem finding resource for 404 (DEACTIVATE_CLAIM_FORCIBLY)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:58:51.791412629Z condor_startd[122]: Error: can't find resource with ClaimId (<10.42.0.119:43207?addrs=10.42.0.119-43207&alias=pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2.pseven-htcondor&noUDP&sock=startd_86_e1a3>#1623776069#1#...) -- perhaps this claim was already removed?
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:58:51.791424024Z condor_startd[122]: Error: problem finding resource for 404 (DEACTIVATE_CLAIM_FORCIBLY)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:58:55.847431436Z condor_procd[120]: taking a snapshot...
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:58:55.847450077Z condor_procd[120]: ProcAPI: read 8 pid entries out of 71 total entries in /proc
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:58:55.847745929Z condor_procd[120]: ...snapshot complete
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:00.449676456Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:00.449900799Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:00.450581956Z condor_collector[50]: QueryWorker: forked new worker with id 298 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:00.450598881Z condor_collector[298]: (Sending 3 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:00.450874679Z condor_collector[298]: Query info: matched=3; skipped=0; query_time=0.000319; send_time=0.000229; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:38499>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:00.459687188Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:00.461167353Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:00.459920534Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:00.461118597Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:00.461134787Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000072; send_time=0.000047; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:39747>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:00.461992802Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:01.471375220Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:01.471617785Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:01.472168486Z condor_collector[299]: (Sending 3 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:01.472181825Z condor_collector[50]: QueryWorker: forked new worker with id 299 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:01.472393615Z condor_collector[299]: Query info: matched=3; skipped=0; query_time=0.000266; send_time=0.000213; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:36067>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:01.479687811Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:01.481248652Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:01.480045785Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:01.480603118Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:01.480614378Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000034; send_time=0.000037; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:41689>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:01.481752502Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.135179089Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.136245027Z condor_collector[50]: QueryWorker: forked new worker with id 300 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.136315186Z condor_collector[300]: (Sending 3 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.136873200Z condor_collector[300]: Query info: matched=3; skipped=0; query_time=0.000354; send_time=0.000804; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.120:35011>; projection={}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.491553993Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.491755544Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.492475482Z condor_collector[301]: (Sending 3 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.492488208Z condor_collector[50]: QueryWorker: forked new worker with id 301 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.492704941Z condor_collector[301]: Query info: matched=3; skipped=0; query_time=0.000318; send_time=0.000221; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:34539>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.499684674Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:02.501379497Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:02.501804879Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.500291409Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.500305794Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.500309084Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000029; send_time=0.000050; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:42633>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.851755439Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.851774629Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.852041297Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000030; send_time=0.000057; type=Scheduler; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.120:45549>; projection={MyAddress AddressV1 CondorVersion CondorPlatform Name Machine}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.889838724Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.890070226Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000040; send_time=0.000130; type=Negotiator; requirements={true}; locate=0; limit=0; from=SCHEDD; peer=<10.42.0.102:34281>; projection={}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:02.891316100Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:02.891348407Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:02.891353982Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:02.891511965Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:02.891525774Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.892744929Z condor_negotiator[51]: ---------- Started Negotiation Cycle ----------
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.892772486Z condor_negotiator[51]: Phase 1:  Obtaining ads from collector ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.892777783Z condor_negotiator[51]:   Getting startd private ads ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.892780993Z condor_collector[50]: Got QUERY_STARTD_PVT_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.892784180Z condor_collector[50]: QueryWorker: forked new high priority worker with id 302 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.892787445Z condor_collector[302]: (Sending 3 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.893908019Z condor_collector[302]: Query info: matched=3; skipped=0; query_time=0.000284; send_time=0.000178; type=MachinePrivate; requirements={true}; locate=0; limit=0; from=COLLECTOR; peer=<10.42.0.115:44925>; projection={}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.893928398Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.893932755Z condor_negotiator[51]:   Getting Scheduler, Submitter and Machine ads ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.894128109Z condor_collector[50]: QueryWorker: forked new worker with id 303 ( max 4 active 2 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.894139637Z condor_collector[303]: (Sending 3 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.895969800Z condor_collector[50]: QueryWorker: forked new high priority worker with id 304 ( max 4 active 3 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.895997025Z condor_collector[303]: Query info: matched=3; skipped=0; query_time=0.000262; send_time=0.000769; type=Machine; requirements={((DedicatedScheduler == "DedicatedScheduler@parallel_schedd@submit.pseven-htcondor"))}; locate=0; limit=0; from=SCHEDD; peer=<10.42.0.102:44025>; projection={}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.896003426Z condor_collector[304]: (Sending 5 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.896006997Z condor_collector[304]: Query info: matched=5; skipped=10; query_time=0.000252; send_time=0.000663; type=Any; requirements={(((MyType == "Submitter")) || ((MyType == "Machine")))}; locate=0; limit=0; from=COLLECTOR; peer=<10.42.0.115:36507>; projection={}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.896591654Z condor_negotiator[51]:   Sorting 5 ads ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.897128374Z condor_negotiator[51]: Got ads: 5 public and 3 private
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.897139384Z condor_negotiator[51]: Public ads include 0 submitter, 3 startd
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.897144009Z condor_negotiator[51]: Phase 2:  Performing accounting ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.897690396Z condor_negotiator[51]: Phase 3:  Sorting submitter ads by priority ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.897701761Z condor_negotiator[51]: Starting prefetch round; 0 potential prefetches to do.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.898134745Z condor_negotiator[51]: Prefetch summary: 0 attempted, 0 successful.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.898145102Z condor_negotiator[51]: Phase 4.1:  Negotiating with schedds ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.898149023Z condor_negotiator[51]:  negotiateWithGroup resources used submitterAds length 0 
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:02.898152731Z condor_negotiator[51]: ---------- Finished Negotiation Cycle ----------
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:02.897085837Z condor_schedd[128]: Found 3 potential dedicated resources in 0 seconds
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:02.897201450Z condor_schedd[128]: Adding submitter DedicatedScheduler@parallel_schedd@submit.pseven-htcondor to the submitter map for default pool.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:03.511134362Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:03.511547082Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:03.512093339Z condor_collector[50]: QueryWorker: forked new worker with id 305 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:03.512108789Z condor_collector[305]: (Sending 3 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:03.512322259Z condor_collector[305]: Query info: matched=3; skipped=0; query_time=0.000297; send_time=0.000226; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:33207>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:03.525071177Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:03.526538897Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:03.525455345Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:03.525637547Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:03.525647544Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000035; send_time=0.000044; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:44069>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:03.529489536Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:04.533733952Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:04.533962888Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000048; send_time=0.000175; type=Negotiator; requirements={true}; locate=0; limit=0; from=SCHEDD; peer=<10.42.0.102:35975>; projection={}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:04.535620211Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:04.536085882Z condor_collector[50]: QueryWorker: forked new worker with id 306 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:04.536162550Z condor_collector[306]: (Sending 3 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:04.536722232Z condor_collector[306]: Query info: matched=3; skipped=0; query_time=0.000293; send_time=0.000524; type=Machine; requirements={((DedicatedScheduler == "DedicatedScheduler@parallel_schedd@submit.pseven-htcondor"))}; locate=0; limit=0; from=SCHEDD; peer=<10.42.0.102:38655>; projection={}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:04.534929919Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:04.534957989Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:04.534963812Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:04.534967814Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:04.534972903Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:04.537199140Z condor_schedd[128]: Found 3 potential dedicated resources in 0 seconds
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:04.537305970Z condor_schedd[128]: Adding submitter DedicatedScheduler@parallel_schedd@submit.pseven-htcondor to the submitter map for default pool.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:04.540130098Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:04.540723978Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:04.541132546Z condor_collector[307]: (Sending 3 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:04.541142717Z condor_collector[50]: QueryWorker: forked new worker with id 307 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:04.541324267Z condor_collector[307]: Query info: matched=3; skipped=0; query_time=0.000314; send_time=0.000163; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:42341>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:04.548125810Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:04.549696948Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:04.548425453Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:04.548439892Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:04.548516875Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000037; send_time=0.000035; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:37077>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:04.550739682Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:05.559665753Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:05.559921078Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:05.560586714Z condor_collector[50]: QueryWorker: forked new worker with id 308 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:05.560599055Z condor_collector[308]: (Sending 3 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:05.560848477Z condor_collector[308]: Query info: matched=3; skipped=0; query_time=0.000317; send_time=0.000194; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:41241>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:05.569111255Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:05.570493724Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:05.569388613Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:05.569421748Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:05.569569752Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000027; send_time=0.000051; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:38997>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:05.571253995Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.535781610Z condor_negotiator[51]: ---------- Started Negotiation Cycle ----------
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.535846176Z condor_negotiator[51]: Phase 1:  Obtaining ads from collector ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.535868751Z condor_negotiator[51]:   Getting startd private ads ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.537582133Z condor_collector[50]: Got QUERY_STARTD_PVT_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.539540933Z condor_collector[50]: QueryWorker: forked new high priority worker with id 309 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.539587775Z condor_collector[309]: (Sending 3 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.540316447Z condor_collector[309]: Query info: matched=3; skipped=0; query_time=0.001020; send_time=0.000624; type=MachinePrivate; requirements={true}; locate=0; limit=0; from=COLLECTOR; peer=<10.42.0.115:35847>; projection={}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.540777782Z condor_negotiator[51]:   Getting Scheduler, Submitter and Machine ads ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.544324850Z condor_collector[310]: (Sending 5 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.544376984Z condor_collector[50]: QueryWorker: forked new high priority worker with id 310 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.547357905Z condor_collector[310]: Query info: matched=5; skipped=10; query_time=0.000947; send_time=0.002866; type=Any; requirements={(((MyType == "Submitter")) || ((MyType == "Machine")))}; locate=0; limit=0; from=COLLECTOR; peer=<10.42.0.115:45647>; projection={}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.549587546Z condor_negotiator[51]:   Sorting 5 ads ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.549876257Z condor_negotiator[51]: Got ads: 5 public and 3 private
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.549922428Z condor_negotiator[51]: Public ads include 1 submitter, 3 startd
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.549999961Z condor_negotiator[51]: Phase 2:  Performing accounting ...
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:06.559178579Z condor_schedd[128]: Activity on stashed negotiator socket: <10.42.0.115:38813>
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:06.559486895Z condor_schedd[128]: Using negotiation protocol: NEGOTIATE
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.558689222Z condor_negotiator[51]: Phase 3:  Sorting submitter ads by priority ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.558751655Z condor_negotiator[51]: Starting prefetch round; 1 potential prefetches to do.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.558766914Z condor_negotiator[51]: Starting prefetch negotiation for DedicatedScheduler@parallel_schedd@submit.pseven-htcondor.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.563571430Z condor_negotiator[51]:     Got NO_MORE_JOBS;  schedd has no more requests
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.563771478Z condor_negotiator[51]: Prefetch summary: 1 attempted, 1 successful.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.563829133Z condor_negotiator[51]: Phase 4.1:  Negotiating with schedds ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.563842533Z condor_negotiator[51]:   Negotiating with DedicatedScheduler@parallel_schedd@submit.pseven-htcondor at <10.42.0.102:34961?addrs=10.42.0.102-34961&alias=submit.pseven-htcondor&noUDP&sock=schedd_86_e1a3>
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.563856396Z condor_negotiator[51]: 0 seconds so far for this submitter
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.563866914Z condor_negotiator[51]: 0 seconds so far for this schedd
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:06.559664885Z condor_schedd[128]: Negotiating for owner: DedicatedScheduler@parallel_schedd@submit.pseven-htcondor
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.564121959Z condor_negotiator[51]:     Request 00002.00000: autocluster -1 (request count 1 of 0)
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:06.567296210Z condor_schedd[128]: Finished sending rrls to negotiator
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:06.567340184Z condor_schedd[128]: Finished negotiating for DedicatedScheduler@parallel_schedd in local pool: 0 matched, 0 rejected
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:06.567420677Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:06.567568974Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:06.567700129Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:06.567883066Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:06.567894614Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:06.572995374Z condor_schedd[128]: Found 3 potential dedicated resources in 0 seconds
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:06.573041244Z condor_schedd[128]: Adding submitter DedicatedScheduler@parallel_schedd@submit.pseven-htcondor to the submitter map for default pool.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:06.573459532Z condor_schedd[128]: Activity on stashed negotiator socket: <10.42.0.115:38813>
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.572473775Z condor_negotiator[51]:       Matched 2.0 DedicatedScheduler@parallel_schedd@submit.pseven-htcondor <10.42.0.102:34961?addrs=10.42.0.102-34961&alias=submit.pseven-htcondor&noUDP&sock=schedd_86_e1a3> preempting none <10.42.0.121:46787?addrs=10.42.0.121-46787&alias=pseven-htcondorexecute-deploy-55f84d9fd4-2nx4x.pseven-htcondor&noUDP&sock=startd_87_913c> slot1@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.572510472Z condor_negotiator[51]:       Successfully matched with slot1@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.572519453Z condor_negotiator[51]:     Request 00002.00001: autocluster -1 (request count 1 of 0)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.572525995Z condor_negotiator[51]:       Matched 2.1 DedicatedScheduler@parallel_schedd@submit.pseven-htcondor <10.42.0.102:34961?addrs=10.42.0.102-34961&alias=submit.pseven-htcondor&noUDP&sock=schedd_86_e1a3> preempting none <10.42.0.117:38967?addrs=10.42.0.117-38967&alias=pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2.pseven-htcondor&noUDP&sock=startd_86_e1a3> slot1@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:06.573490167Z condor_schedd[128]: Using negotiation protocol: NEGOTIATE
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:06.573807577Z condor_schedd[128]: Negotiating for owner: DedicatedScheduler@parallel_schedd@submit.pseven-htcondor
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.572535094Z condor_negotiator[51]:       Successfully matched with slot1@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.572541185Z condor_negotiator[51]:     Request 00002.00002: autocluster -1 (request count 1 of 0)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.572546864Z condor_negotiator[51]:       Matched 2.2 DedicatedScheduler@parallel_schedd@submit.pseven-htcondor <10.42.0.102:34961?addrs=10.42.0.102-34961&alias=submit.pseven-htcondor&noUDP&sock=schedd_86_e1a3> preempting none <10.42.0.119:43207?addrs=10.42.0.119-43207&alias=pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2.pseven-htcondor&noUDP&sock=startd_86_e1a3> slot1@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.572555885Z condor_negotiator[51]:       Successfully matched with slot1@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.572863534Z condor_negotiator[51]:     Request 00002.00003: autocluster -1 (request count 1 of 0)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.572876784Z condor_negotiator[51]:       Rejected 2.3 DedicatedScheduler@parallel_schedd@submit.pseven-htcondor <10.42.0.102:34961?addrs=10.42.0.102-34961&alias=submit.pseven-htcondor&noUDP&sock=schedd_86_e1a3>: 
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.572884286Z condor_negotiator[51]: no match found
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.572889906Z condor_negotiator[51]:     Request 00002.00004: autocluster -1 (request count 1 of 0)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.572895705Z condor_negotiator[51]:       Rejected 2.4 DedicatedScheduler@parallel_schedd@submit.pseven-htcondor <10.42.0.102:34961?addrs=10.42.0.102-34961&alias=submit.pseven-htcondor&noUDP&sock=schedd_86_e1a3>: 
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.572902222Z condor_negotiator[51]: no match found
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.572907556Z condor_negotiator[51]:  negotiateWithGroup resources used submitterAds length 1 
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.572913327Z condor_negotiator[51]: ---------- Finished Negotiation Cycle ----------
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.572918685Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.572923990Z condor_collector[50]: QueryWorker: forked new worker with id 311 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.572929764Z condor_collector[311]: (Sending 3 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.572936302Z condor_collector[311]: Query info: matched=3; skipped=0; query_time=0.000520; send_time=0.001432; type=Machine; requirements={((DedicatedScheduler == "DedicatedScheduler@parallel_schedd@submit.pseven-htcondor"))}; locate=0; limit=0; from=SCHEDD; peer=<10.42.0.102:34999>; projection={}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:06.578894127Z condor_schedd[128]: Negotiation ended - 3 jobs matched
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:06.578915337Z condor_schedd[128]: Finished negotiating for DedicatedScheduler@parallel_schedd in local pool: 3 matched, 2 rejected
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:06.582145582Z condor_startd[122]: slot1_1: New machine resource of type -1 allocated
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:06.583413593Z condor_startd[122]: slot1_1: New machine resource of type -1 allocated
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:06.584706047Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.585735843Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.586210436Z condor_collector[50]: QueryWorker: forked new worker with id 312 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.586221761Z condor_collector[312]: (Sending 3 ads in response to query)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-2nx4x/htcondorexecute] 2021-06-15T16:59:06.585736125Z condor_startd[123]: slot1_1: New machine resource of type -1 allocated
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-2nx4x/htcondorexecute] 2021-06-15T16:59:06.585758445Z condor_startd[123]: slot1_1: Request accepted.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:06.584724201Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:06.584785883Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:06.584876791Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:06.584891216Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:06.584081794Z condor_startd[122]: slot1_1: Request accepted.
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:06.584637003Z condor_startd[122]: slot1_1: Request accepted.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.587272067Z condor_collector[312]: Query info: matched=3; skipped=0; query_time=0.000336; send_time=0.000862; type=Machine; requirements={((DedicatedScheduler == "DedicatedScheduler@parallel_schedd@submit.pseven-htcondor"))}; locate=0; limit=0; from=SCHEDD; peer=<10.42.0.102:43471>; projection={}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.587942409Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:06.588281326Z condor_schedd[128]: Found 3 potential dedicated resources in 0 seconds
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.588460041Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:06.588489239Z condor_schedd[128]: Adding submitter DedicatedScheduler@parallel_schedd@submit.pseven-htcondor to the submitter map for default pool.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.588980894Z condor_collector[50]: QueryWorker: forked new worker with id 313 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.588990916Z condor_collector[313]: (Sending 3 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.589181117Z condor_collector[313]: Query info: matched=3; skipped=0; query_time=0.000237; send_time=0.000171; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:34091>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.596503167Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:06.597777265Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:06.597928479Z condor_startd[122]: slot1_1: Remote owner is user20001@pseven-htcondor
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:06.597983137Z condor_startd[122]: slot1_1: State change: claiming protocol successful
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:06.598006331Z condor_startd[122]: slot1_1: Changing state: Owner -> Claimed
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.596716724Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.596810809Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.596819050Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000027; send_time=0.000039; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:35451>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-2nx4x/htcondorexecute] 2021-06-15T16:59:06.601686067Z condor_startd[123]: slot1_1: Remote owner is user20001@pseven-htcondor
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-2nx4x/htcondorexecute] 2021-06-15T16:59:06.601704516Z condor_startd[123]: slot1_1: State change: claiming protocol successful
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-2nx4x/htcondorexecute] 2021-06-15T16:59:06.601707822Z condor_startd[123]: slot1_1: Changing state: Owner -> Claimed
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:06.601739541Z condor_startd[122]: slot1_1: Remote owner is user20001@pseven-htcondor
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:06.601768560Z condor_startd[122]: slot1_1: State change: claiming protocol successful
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:06.601772444Z condor_startd[122]: slot1_1: Changing state: Owner -> Claimed
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:06.598486679Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.601731897Z condor_collector[50]: StartdAd     : Inserting ** "< slot1_1@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx , 10.42.0.119 >"
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.601753998Z condor_collector[50]: StartdPvtAd  : Inserting ** "< slot1_1@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx , 10.42.0.119 >"
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.601756719Z condor_collector[50]: StartdAd     : Inserting ** "< slot1_1@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx , 10.42.0.117 >"
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.601759169Z condor_collector[50]: StartdPvtAd  : Inserting ** "< slot1_1@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx , 10.42.0.117 >"
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.601761643Z condor_collector[50]: StartdAd     : Inserting ** "< slot1_1@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx , 10.42.0.121 >"
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:06.601764082Z condor_collector[50]: StartdPvtAd  : Inserting ** "< slot1_1@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx , 10.42.0.121 >"
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:07.628164762Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:07.629859259Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:07.631279083Z condor_collector[314]: (Sending 6 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:07.631920977Z condor_collector[50]: QueryWorker: forked new worker with id 314 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:07.632280229Z condor_collector[314]: Query info: matched=6; skipped=0; query_time=0.000981; send_time=0.000869; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:37733>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:07.645285863Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:07.647574811Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:07.645831607Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:07.645851088Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:07.645857227Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000046; send_time=0.000072; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:41209>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:07.648459785Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:08.656551123Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:08.656980868Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000131; send_time=0.000376; type=Negotiator; requirements={true}; locate=0; limit=0; from=SCHEDD; peer=<10.42.0.102:33155>; projection={}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:08.660015250Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:08.660239224Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:08.660313291Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:08.660409657Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:08.662309575Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:08.664715004Z condor_collector[315]: (Sending 6 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:08.664755983Z condor_collector[50]: QueryWorker: forked new worker with id 315 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:08.667436047Z condor_collector[315]: Query info: matched=6; skipped=0; query_time=0.000889; send_time=0.003228; type=Machine; requirements={((DedicatedScheduler == "DedicatedScheduler@parallel_schedd@submit.pseven-htcondor"))}; locate=0; limit=0; from=SCHEDD; peer=<10.42.0.102:35311>; projection={}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:08.660705234Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:08.671600706Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:08.671548145Z condor_schedd[128]: Found 6 potential dedicated resources in 0 seconds
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:08.671741578Z condor_schedd[128]: Adding submitter DedicatedScheduler@parallel_schedd@submit.pseven-htcondor to the submitter map for default pool.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:08.672951374Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:08.674714471Z condor_collector[316]: (Sending 6 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:08.674955968Z condor_collector[50]: QueryWorker: forked new worker with id 316 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:08.674982099Z condor_collector[316]: Query info: matched=6; skipped=0; query_time=0.000837; send_time=0.000667; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:39883>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:08.685693260Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:08.687323039Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:08.685946956Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:08.685998454Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:08.686083261Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000040; send_time=0.000050; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:34113>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:08.687865554Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:09.715401142Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:09.716707280Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:09.719456921Z condor_collector[317]: (Sending 6 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:09.719503328Z condor_collector[50]: QueryWorker: forked new worker with id 317 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:09.720390198Z condor_collector[317]: Query info: matched=6; skipped=0; query_time=0.001182; send_time=0.000891; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:46787>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:09.738359047Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:09.741067163Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:09.738867434Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:09.738901448Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:09.738970060Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000055; send_time=0.000071; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:40071>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:09.742041853Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.661995308Z condor_negotiator[51]: ---------- Started Negotiation Cycle ----------
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.662062541Z condor_negotiator[51]: Phase 1:  Obtaining ads from collector ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.662066834Z condor_negotiator[51]:   Getting startd private ads ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.663204650Z condor_collector[50]: Got QUERY_STARTD_PVT_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.664213265Z condor_collector[50]: QueryWorker: forked new high priority worker with id 318 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.664265383Z condor_collector[318]: (Sending 6 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.664556686Z condor_collector[318]: Query info: matched=6; skipped=0; query_time=0.000412; send_time=0.000343; type=MachinePrivate; requirements={true}; locate=0; limit=0; from=COLLECTOR; peer=<10.42.0.115:46251>; projection={}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.664762134Z condor_negotiator[51]:   Getting Scheduler, Submitter and Machine ads ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.666349126Z condor_collector[50]: QueryWorker: forked new high priority worker with id 319 ( max 4 active 2 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.666399938Z condor_collector[319]: (Sending 8 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.667535718Z condor_collector[319]: Query info: matched=8; skipped=10; query_time=0.000414; send_time=0.001270; type=Any; requirements={(((MyType == "Submitter")) || ((MyType == "Machine")))}; locate=0; limit=0; from=COLLECTOR; peer=<10.42.0.115:33191>; projection={}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.668925834Z condor_negotiator[51]:   Sorting 8 ads ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.669013352Z condor_negotiator[51]: Got ads: 8 public and 6 private
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.669025985Z condor_negotiator[51]: Public ads include 1 submitter, 6 startd
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.669067037Z condor_negotiator[51]: Phase 2:  Performing accounting ...
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:10.673270142Z condor_schedd[128]: Activity on stashed negotiator socket: <10.42.0.115:38813>
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:10.673282907Z condor_schedd[128]: Using negotiation protocol: NEGOTIATE
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:10.673338880Z condor_schedd[128]: Negotiating for owner: DedicatedScheduler@parallel_schedd@submit.pseven-htcondor
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.673086895Z condor_negotiator[51]: Phase 3:  Sorting submitter ads by priority ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.673111061Z condor_negotiator[51]: Starting prefetch round; 1 potential prefetches to do.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.673115963Z condor_negotiator[51]: Starting prefetch negotiation for DedicatedScheduler@parallel_schedd@submit.pseven-htcondor.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.674526237Z condor_negotiator[51]:     Got NO_MORE_JOBS;  schedd has no more requests
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.674545646Z condor_negotiator[51]: Prefetch summary: 1 attempted, 1 successful.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.674549728Z condor_negotiator[51]: Phase 4.1:  Negotiating with schedds ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.674553001Z condor_negotiator[51]:   Negotiating with DedicatedScheduler@parallel_schedd@submit.pseven-htcondor at <10.42.0.102:34961?addrs=10.42.0.102-34961&alias=submit.pseven-htcondor&noUDP&sock=schedd_86_e1a3>
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.674566193Z condor_negotiator[51]: 0 seconds so far for this submitter
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.674570164Z condor_negotiator[51]: 0 seconds so far for this schedd
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.674574236Z condor_negotiator[51]:     Request 00002.00003: autocluster -1 (request count 1 of 0)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.674753048Z condor_negotiator[51]:       Matched 2.3 DedicatedScheduler@parallel_schedd@submit.pseven-htcondor <10.42.0.102:34961?addrs=10.42.0.102-34961&alias=submit.pseven-htcondor&noUDP&sock=schedd_86_e1a3> preempting none <10.42.0.117:38967?addrs=10.42.0.117-38967&alias=pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2.pseven-htcondor&noUDP&sock=startd_86_e1a3> slot1@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.674887140Z condor_negotiator[51]:       Successfully matched with slot1@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.674955943Z condor_negotiator[51]:     Request 00002.00004: autocluster -1 (request count 1 of 0)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.675274265Z condor_negotiator[51]:       Matched 2.4 DedicatedScheduler@parallel_schedd@submit.pseven-htcondor <10.42.0.102:34961?addrs=10.42.0.102-34961&alias=submit.pseven-htcondor&noUDP&sock=schedd_86_e1a3> preempting none <10.42.0.119:43207?addrs=10.42.0.119-43207&alias=pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2.pseven-htcondor&noUDP&sock=startd_86_e1a3> slot1@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.675424566Z condor_negotiator[51]:       Successfully matched with slot1@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.675698778Z condor_negotiator[51]:  negotiateWithGroup resources used submitterAds length 0 
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.675705616Z condor_negotiator[51]: ---------- Finished Negotiation Cycle ----------
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.676838014Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.677461911Z condor_collector[320]: (Sending 6 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.677481067Z condor_collector[50]: QueryWorker: forked new worker with id 320 ( max 4 active 1 pending 0 )
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:10.676436842Z condor_schedd[128]: Finished sending rrls to negotiator
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.678465221Z condor_collector[320]: Query info: matched=6; skipped=0; query_time=0.000297; send_time=0.001116; type=Machine; requirements={((DedicatedScheduler == "DedicatedScheduler@parallel_schedd@submit.pseven-htcondor"))}; locate=0; limit=0; from=SCHEDD; peer=<10.42.0.102:45421>; projection={}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:10.676461775Z condor_schedd[128]: Finished negotiating for DedicatedScheduler@parallel_schedd in local pool: 0 matched, 0 rejected
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:10.676466848Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:10.676470721Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:10.676474465Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:10.676478426Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:10.676482056Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:10.680066685Z condor_schedd[128]: Found 6 potential dedicated resources in 0 seconds
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:10.680632980Z condor_schedd[128]: Adding submitter DedicatedScheduler@parallel_schedd@submit.pseven-htcondor to the submitter map for default pool.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:10.681723481Z condor_schedd[128]: Activity on stashed negotiator socket: <10.42.0.115:38813>
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:10.681783801Z condor_schedd[128]: Using negotiation protocol: NEGOTIATE
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:10.681789380Z condor_schedd[128]: Negotiating for owner: DedicatedScheduler@parallel_schedd@submit.pseven-htcondor
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:10.684494338Z condor_schedd[128]: Negotiation ended - 2 jobs matched
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:10.684518898Z condor_schedd[128]: Finished negotiating for DedicatedScheduler@parallel_schedd in local pool: 2 matched, 0 rejected
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:10.685992697Z condor_startd[122]: slot1_2: New machine resource of type -1 allocated
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.688195213Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.688222534Z condor_collector[50]: QueryWorker: forked new worker with id 321 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.688228041Z condor_collector[321]: (Sending 6 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.689227661Z condor_collector[321]: Query info: matched=6; skipped=0; query_time=0.000311; send_time=0.001047; type=Machine; requirements={((DedicatedScheduler == "DedicatedScheduler@parallel_schedd@submit.pseven-htcondor"))}; locate=0; limit=0; from=SCHEDD; peer=<10.42.0.102:37167>; projection={}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:10.686471992Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:10.686508389Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:10.686709765Z condor_startd[122]: slot1_2: Request accepted.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:10.686586738Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:10.686593978Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:10.686628486Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:10.689916255Z condor_schedd[128]: Found 6 potential dedicated resources in 0 seconds
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:10.690181768Z condor_schedd[128]: Adding submitter DedicatedScheduler@parallel_schedd@submit.pseven-htcondor to the submitter map for default pool.
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:10.686135877Z condor_startd[122]: slot1_2: New machine resource of type -1 allocated
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:10.686157823Z condor_startd[122]: slot1_2: Request accepted.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.692646412Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.692672441Z condor_collector[50]: QueryWorker: forked new worker with id 322 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.692675588Z condor_collector[322]: (Sending 6 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.693385815Z condor_collector[322]: Query info: matched=6; skipped=0; query_time=0.000347; send_time=0.000928; type=Machine; requirements={((DedicatedScheduler == "DedicatedScheduler@parallel_schedd@submit.pseven-htcondor"))}; locate=0; limit=0; from=SCHEDD; peer=<10.42.0.102:37383>; projection={}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:10.692694166Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:10.692701251Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:10.692703605Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:10.692705759Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:10.692711021Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:10.694269189Z condor_schedd[128]: Found 6 potential dedicated resources in 0 seconds
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:10.694817887Z condor_schedd[128]: Adding submitter DedicatedScheduler@parallel_schedd@submit.pseven-htcondor to the submitter map for default pool.
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:10.701095079Z condor_startd[122]: slot1_2: Remote owner is user20001@pseven-htcondor
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:10.701120989Z condor_startd[122]: slot1_2: State change: claiming protocol successful
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:10.701126346Z condor_startd[122]: slot1_2: Changing state: Owner -> Claimed
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.702554958Z condor_collector[50]: StartdAd     : Inserting ** "< slot1_2@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx , 10.42.0.117 >"
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.702576483Z condor_collector[50]: StartdPvtAd  : Inserting ** "< slot1_2@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx , 10.42.0.117 >"
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:10.703465161Z condor_startd[122]: slot1_2: Remote owner is user20001@pseven-htcondor
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.705233543Z condor_collector[50]: StartdAd     : Inserting ** "< slot1_2@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx , 10.42.0.119 >"
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.705250046Z condor_collector[50]: StartdPvtAd  : Inserting ** "< slot1_2@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx , 10.42.0.119 >"
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:10.705118359Z condor_startd[122]: slot1_2: State change: claiming protocol successful
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:10.705147411Z condor_startd[122]: slot1_2: Changing state: Owner -> Claimed
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.752418807Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.752662560Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.753206594Z condor_collector[50]: QueryWorker: forked new worker with id 323 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.753255093Z condor_collector[323]: (Sending 8 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.754708499Z condor_collector[323]: Query info: matched=8; skipped=0; query_time=0.000271; send_time=0.000225; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:37009>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.762065410Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:10.763781185Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.762287362Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.762331327Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:10.762412009Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000034; send_time=0.000046; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:38223>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:10.764341369Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:11.773517187Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:11.773806385Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:11.774289480Z condor_collector[50]: QueryWorker: forked new worker with id 324 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:11.774439212Z condor_collector[324]: (Sending 8 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:11.774584470Z condor_collector[324]: Query info: matched=8; skipped=0; query_time=0.000280; send_time=0.000180; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:46723>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:11.782258788Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:11.783636593Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:11.782510415Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:11.782526839Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:11.782550757Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000020; send_time=0.000032; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:36913>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:11.784129124Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:12.794334110Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:12.795369789Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000138; send_time=0.000462; type=Negotiator; requirements={true}; locate=0; limit=0; from=SCHEDD; peer=<10.42.0.102:43561>; projection={}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:12.799400540Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:12.799474235Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:12.802209427Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:12.803833451Z condor_collector[50]: QueryWorker: forked new worker with id 325 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:12.803880856Z condor_collector[325]: (Sending 8 ads in response to query)
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:12.800043086Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:12.800088819Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:12.800901391Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:12.809342502Z condor_collector[325]: Query info: matched=8; skipped=0; query_time=0.001008; send_time=0.005174; type=Machine; requirements={((DedicatedScheduler == "DedicatedScheduler@parallel_schedd@submit.pseven-htcondor"))}; locate=0; limit=0; from=SCHEDD; peer=<10.42.0.102:33055>; projection={}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:12.813153735Z condor_schedd[128]: Found 8 potential dedicated resources in 0 seconds
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:12.815346784Z condor_schedd[128]: Adding submitter DedicatedScheduler@parallel_schedd@submit.pseven-htcondor to the submitter map for default pool.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:12.813946314Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:12.815433069Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:12.817561304Z condor_collector[326]: (Sending 8 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:12.817751970Z condor_collector[50]: QueryWorker: forked new worker with id 326 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:12.818677707Z condor_collector[326]: Query info: matched=8; skipped=0; query_time=0.001000; send_time=0.001067; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:41059>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:12.831688799Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:12.833814570Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:12.832154707Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:12.832177405Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:12.832277221Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000047; send_time=0.000073; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:46877>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:12.834449528Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:13.861291356Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:13.862586149Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:13.864238959Z condor_collector[327]: (Sending 8 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:13.864296980Z condor_collector[50]: QueryWorker: forked new worker with id 327 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:13.865208793Z condor_collector[327]: Query info: matched=8; skipped=0; query_time=0.001037; send_time=0.000920; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:38927>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:13.888799863Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:13.894485903Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:13.889964038Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:13.890061750Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:13.890308226Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000156; send_time=0.000183; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:33485>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:13.896831900Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.799720582Z condor_negotiator[51]: ---------- Started Negotiation Cycle ----------
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.799794848Z condor_negotiator[51]: Phase 1:  Obtaining ads from collector ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.799817427Z condor_negotiator[51]:   Getting startd private ads ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.801594861Z condor_collector[50]: Got QUERY_STARTD_PVT_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.803490553Z condor_collector[50]: QueryWorker: forked new high priority worker with id 328 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.803538911Z condor_collector[328]: (Sending 8 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.804748892Z condor_collector[328]: Query info: matched=8; skipped=0; query_time=0.001058; send_time=0.000962; type=MachinePrivate; requirements={true}; locate=0; limit=0; from=COLLECTOR; peer=<10.42.0.115:42695>; projection={}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.805566393Z condor_negotiator[51]:   Getting Scheduler, Submitter and Machine ads ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.809205208Z condor_collector[50]: QueryWorker: forked new high priority worker with id 329 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.809256291Z condor_collector[329]: (Sending 10 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.812934117Z condor_collector[329]: Query info: matched=10; skipped=10; query_time=0.001000; send_time=0.003573; type=Any; requirements={(((MyType == "Submitter")) || ((MyType == "Machine")))}; locate=0; limit=0; from=COLLECTOR; peer=<10.42.0.115:36875>; projection={}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.817772714Z condor_negotiator[51]:   Sorting 10 ads ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.818170438Z condor_negotiator[51]: Got ads: 10 public and 8 private
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.818216295Z condor_negotiator[51]: Public ads include 1 submitter, 8 startd
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.818352688Z condor_negotiator[51]: Phase 2:  Performing accounting ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.830404369Z condor_negotiator[51]: Phase 3:  Sorting submitter ads by priority ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.830450483Z condor_negotiator[51]: Starting prefetch round; 1 potential prefetches to do.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.830463021Z condor_negotiator[51]: Starting prefetch negotiation for DedicatedScheduler@parallel_schedd@submit.pseven-htcondor.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:14.830766423Z condor_schedd[128]: Activity on stashed negotiator socket: <10.42.0.115:38813>
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:14.830794926Z condor_schedd[128]: Using negotiation protocol: NEGOTIATE
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:14.831055578Z condor_schedd[128]: Negotiating for owner: DedicatedScheduler@parallel_schedd@submit.pseven-htcondor
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.833127863Z condor_negotiator[51]:     Got NO_MORE_JOBS;  schedd has no more requests
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.833183256Z condor_negotiator[51]: Prefetch summary: 1 attempted, 1 successful.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.833193750Z condor_negotiator[51]: Phase 4.1:  Negotiating with schedds ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.833202128Z condor_negotiator[51]:   Negotiating with DedicatedScheduler@parallel_schedd@submit.pseven-htcondor at <10.42.0.102:34961?addrs=10.42.0.102-34961&alias=submit.pseven-htcondor&noUDP&sock=schedd_86_e1a3>
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.833212110Z condor_negotiator[51]: 0 seconds so far for this submitter
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.833219568Z condor_negotiator[51]: 0 seconds so far for this schedd
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.833226831Z condor_negotiator[51]:     Request 00002.00004: autocluster -1 (request count 1 of 0)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.835123028Z condor_negotiator[51]:       Matched 2.4 DedicatedScheduler@parallel_schedd@submit.pseven-htcondor <10.42.0.102:34961?addrs=10.42.0.102-34961&alias=submit.pseven-htcondor&noUDP&sock=schedd_86_e1a3> preempting none <10.42.0.117:38967?addrs=10.42.0.117-38967&alias=pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2.pseven-htcondor&noUDP&sock=startd_86_e1a3> slot1@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.835154646Z condor_negotiator[51]:       Successfully matched with slot1@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.835164282Z condor_negotiator[51]:  negotiateWithGroup resources used submitterAds length 0 
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.835171957Z condor_negotiator[51]: ---------- Finished Negotiation Cycle ----------
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:14.835032613Z condor_schedd[128]: Finished sending rrls to negotiator
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:14.835056621Z condor_schedd[128]: Finished negotiating for DedicatedScheduler@parallel_schedd in local pool: 0 matched, 0 rejected
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:14.835141049Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:14.835204364Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:14.835326620Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:14.835337575Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:14.835356008Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.836483953Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.838324541Z condor_collector[50]: QueryWorker: forked new worker with id 330 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.838344661Z condor_collector[330]: (Sending 8 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.839579698Z condor_collector[330]: Query info: matched=8; skipped=0; query_time=0.000540; send_time=0.002010; type=Machine; requirements={((DedicatedScheduler == "DedicatedScheduler@parallel_schedd@submit.pseven-htcondor"))}; locate=0; limit=0; from=SCHEDD; peer=<10.42.0.102:37743>; projection={}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:14.841245193Z condor_schedd[128]: Found 8 potential dedicated resources in 0 seconds
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:14.842136903Z condor_schedd[128]: Adding submitter DedicatedScheduler@parallel_schedd@submit.pseven-htcondor to the submitter map for default pool.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:14.842432262Z condor_schedd[128]: Activity on stashed negotiator socket: <10.42.0.115:38813>
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:14.842499340Z condor_schedd[128]: Using negotiation protocol: NEGOTIATE
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:14.842518249Z condor_schedd[128]: Negotiating for owner: DedicatedScheduler@parallel_schedd@submit.pseven-htcondor
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:14.844060422Z condor_schedd[128]: Negotiation ended - 1 jobs matched
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:14.844077322Z condor_schedd[128]: Finished negotiating for DedicatedScheduler@parallel_schedd in local pool: 1 matched, 0 rejected
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:14.845513381Z condor_startd[122]: slot1_3: New machine resource of type -1 allocated
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.848276755Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:14.847319093Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:14.847342766Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:14.846932487Z condor_startd[122]: slot1_3: Request accepted.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:14.847371183Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:14.847446587Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:14.847468338Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.849090966Z condor_collector[50]: QueryWorker: forked new worker with id 331 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.849106592Z condor_collector[331]: (Sending 8 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.850640982Z condor_collector[331]: Query info: matched=8; skipped=0; query_time=0.000335; send_time=0.001585; type=Machine; requirements={((DedicatedScheduler == "DedicatedScheduler@parallel_schedd@submit.pseven-htcondor"))}; locate=0; limit=0; from=SCHEDD; peer=<10.42.0.102:36399>; projection={}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:14.852529582Z condor_schedd[128]: Found 8 potential dedicated resources in 0 seconds
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:14.853303229Z condor_schedd[128]: Adding submitter DedicatedScheduler@parallel_schedd@submit.pseven-htcondor to the submitter map for default pool.
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:14.863719935Z condor_startd[122]: slot1_3: Remote owner is user20001@pseven-htcondor
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:14.863864421Z condor_startd[122]: slot1_3: State change: claiming protocol successful
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:14.863890481Z condor_startd[122]: slot1_3: Changing state: Owner -> Claimed
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.867391074Z condor_collector[50]: StartdAd     : Inserting ** "< slot1_3@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx , 10.42.0.117 >"
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.867437031Z condor_collector[50]: StartdPvtAd  : Inserting ** "< slot1_3@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx , 10.42.0.117 >"
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.916895996Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.917575785Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.918683912Z condor_collector[50]: QueryWorker: forked new worker with id 332 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.918713931Z condor_collector[332]: (Sending 9 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.919317614Z condor_collector[332]: Query info: matched=9; skipped=0; query_time=0.000523; send_time=0.000536; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:38127>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.934317761Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:14.936418978Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.934748406Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.934771525Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:14.934861920Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000042; send_time=0.000081; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:45537>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:14.937584698Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:15.964360719Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:15.965566028Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:15.967057543Z condor_collector[50]: QueryWorker: forked new worker with id 333 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:15.967145438Z condor_collector[333]: (Sending 9 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:15.968094438Z condor_collector[333]: Query info: matched=9; skipped=0; query_time=0.000849; send_time=0.000828; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:36125>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:15.992173711Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:15.997578616Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:15.993353878Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:15.993412624Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:15.993642420Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000127; send_time=0.000172; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:44277>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:16.000684100Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:17.011055509Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:17.011749389Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000161; send_time=0.000644; type=Negotiator; requirements={true}; locate=0; limit=0; from=SCHEDD; peer=<10.42.0.102:42839>; projection={}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:17.015411693Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:17.015486723Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:17.015511358Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:17.015644696Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:17.015745136Z condor_schedd[128]: SetAttribute modifying attribute Scheduler in nonexistent job 2.5
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:17.017746918Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:17.019990081Z condor_collector[50]: QueryWorker: forked new worker with id 334 ( max 4 active 1 pending 0 )
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:17.029988100Z condor_schedd[128]: Found 9 potential dedicated resources in 0 seconds
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:17.020058609Z condor_collector[334]: (Sending 9 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:17.024825962Z condor_collector[334]: Query info: matched=9; skipped=0; query_time=0.001210; send_time=0.004716; type=Machine; requirements={((DedicatedScheduler == "DedicatedScheduler@parallel_schedd@submit.pseven-htcondor"))}; locate=0; limit=0; from=SCHEDD; peer=<10.42.0.102:42609>; projection={}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:17.033294782Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:17.034387563Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:17.034745293Z condor_collector[50]: QueryWorker: forked new worker with id 335 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:17.034776362Z condor_collector[335]: (Sending 9 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:17.035669026Z condor_collector[335]: Query info: matched=9; skipped=0; query_time=0.000819; send_time=0.000778; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:40931>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:17.044117878Z condor_schedd[128]: Adding submitter DedicatedScheduler@parallel_schedd@submit.pseven-htcondor to the submitter map for default pool.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:17.045476248Z condor_schedd[128]: Starting add_shadow_birthdate(2.0)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:17.049227898Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:17.049684480Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:17.049718275Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:17.049789835Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000045; send_time=0.000062; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:34645>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:17.051543543Z condor_schedd[128]: Started shadow for job 2.0 on slot1@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx <10.42.0.117:38967?addrs=10.42.0.117-38967&alias=pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2.pseven-htcondor&noUDP&sock=startd_86_e1a3> for DedicatedScheduler@parallel_schedd, (shadow pid = 641)
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:17.057303481Z condor_shadow[641]: ******************************************************
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:17.057335531Z condor_shadow[641]: ** condor_shadow (CONDOR_SHADOW) STARTING UP
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:17.057343075Z condor_shadow[641]: ** /usr/sbin/condor_shadow
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:17.057348905Z condor_shadow[641]: ** SubsystemInfo: name=SHADOW type=SHADOW(6) class=DAEMON(1)
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:17.057353851Z condor_shadow[641]: ** Configuration: subsystem:SHADOW local:<NONE> class:DAEMON
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:17.057359252Z condor_shadow[641]: ** $CondorVersion: 8.9.11 Dec 29 2020 BuildID: Debian-8.9.11-1.2 PackageID: 8.9.11-1.2 Debian-8.9.11-1.2 $
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:17.057364809Z condor_shadow[641]: ** $CondorPlatform: X86_64-Ubuntu_20.04 $
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:17.057370245Z condor_shadow[641]: ** PID = 641
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:17.057376924Z condor_shadow[641]: ** Log last touched time unavailable (Success)
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:17.057382174Z condor_shadow[641]: ******************************************************
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:17.057387630Z condor_shadow[641]: Using config source: /etc/condor/condor_config
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:17.057392482Z condor_shadow[641]: Using local config sources: 
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:17.057396558Z condor_shadow[641]:    /etc/condor/condor_config.local
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:17.057400179Z condor_shadow[641]: config Macros = 113, Sorted = 113, StringBytes = 2769, TablesBytes = 1856
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:17.057405065Z condor_shadow[641]: CLASSAD_CACHING is OFF
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:17.057479491Z condor_shadow[641]: Daemon Log is logging: D_ALWAYS D_ERROR D_STATS
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:17.057580596Z condor_shadow[641]: SharedPortEndpoint: waiting for connections to named socket shadow_128_effe_2
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:17.058508775Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:17.058525091Z condor_shadow[641]: DaemonCore: command socket at <10.42.0.102:34961?addrs=10.42.0.102-34961&alias=submit.pseven-htcondor&noUDP&sock=shadow_128_effe_2>
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:17.058532068Z condor_shadow[641]: DaemonCore: private command socket at <10.42.0.102:34961?addrs=10.42.0.102-34961&alias=submit.pseven-htcondor&noUDP&sock=shadow_128_effe_2>
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:17.059419777Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:17.060167654Z condor_shadow[641]: Initializing a PARALLEL shadow for job 2.0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:17.060234576Z condor_shadow[641]: LIMIT_DIRECTORY_ACCESS = <unset>
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:18.016918618Z condor_negotiator[51]: ---------- Started Negotiation Cycle ----------
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:18.016983967Z condor_negotiator[51]: Phase 1:  Obtaining ads from collector ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:18.016999390Z condor_negotiator[51]:   Getting startd private ads ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:18.018603515Z condor_collector[50]: Got QUERY_STARTD_PVT_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:18.020418254Z condor_collector[50]: QueryWorker: forked new high priority worker with id 336 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:18.020551250Z condor_collector[336]: (Sending 9 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:18.021790665Z condor_collector[336]: Query info: matched=9; skipped=0; query_time=0.001100; send_time=0.001091; type=MachinePrivate; requirements={true}; locate=0; limit=0; from=COLLECTOR; peer=<10.42.0.115:46033>; projection={}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:18.022471387Z condor_negotiator[51]:   Getting Scheduler, Submitter and Machine ads ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:18.026520917Z condor_collector[50]: QueryWorker: forked new high priority worker with id 337 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:18.026577333Z condor_collector[337]: (Sending 11 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:18.030302883Z condor_collector[337]: Query info: matched=11; skipped=10; query_time=0.001038; send_time=0.004300; type=Any; requirements={(((MyType == "Submitter")) || ((MyType == "Machine")))}; locate=0; limit=0; from=COLLECTOR; peer=<10.42.0.115:39443>; projection={}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:18.036063757Z condor_negotiator[51]:   Sorting 11 ads ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:18.036562685Z condor_negotiator[51]: Got ads: 11 public and 9 private
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:18.036608730Z condor_negotiator[51]: Public ads include 0 submitter, 9 startd
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:18.036757869Z condor_negotiator[51]: Phase 2:  Performing accounting ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:18.046114853Z condor_negotiator[51]: Phase 3:  Sorting submitter ads by priority ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:18.046161420Z condor_negotiator[51]: Starting prefetch round; 0 potential prefetches to do.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:18.046178121Z condor_negotiator[51]: Prefetch summary: 0 attempted, 0 successful.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:18.046194330Z condor_negotiator[51]: Phase 4.1:  Negotiating with schedds ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:18.046209988Z condor_negotiator[51]:  negotiateWithGroup resources used submitterAds length 0 
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:18.046335436Z condor_negotiator[51]: ---------- Finished Negotiation Cycle ----------
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.076284217Z condor_startd[122]: slot1_3: Got activate_claim request from shadow (10.42.0.102)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:18.077255774Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:18.080876508Z condor_shadow[641]: Request to run on slot1_3@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx <10.42.0.117:38967?addrs=10.42.0.117-38967&alias=pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2.pseven-htcondor&noUDP&sock=startd_86_e1a3> was ACCEPTED
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:18.078059172Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:18.078746396Z condor_collector[50]: QueryWorker: forked new worker with id 338 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:18.078920175Z condor_collector[338]: (Sending 9 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:18.079553383Z condor_collector[338]: Query info: matched=9; skipped=0; query_time=0.000586; send_time=0.000587; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:34557>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.078347362Z condor_startd[122]: slot1_3: Remote job ID is 2.0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.086348731Z condor_procd[120]: PROC_FAMILY_REGISTER_SUBFAMILY
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.086410532Z condor_procd[120]: taking a snapshot...
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.086423993Z condor_procd[120]: ProcAPI: read 9 pid entries out of 72 total entries in /proc
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.086434806Z condor_procd[120]: process 191 (not in monitored family) has exited
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.086444544Z condor_procd[120]: method PARENT: found family 122 for process 265
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.086454285Z condor_procd[120]: method PARENT: found family 122 for process 265 (already determined)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.086464627Z condor_procd[120]: no methods have determined process 264 to be in a monitored family
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.086474397Z condor_procd[120]: ...snapshot complete
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.086483311Z condor_procd[120]: moving process 265 into new subfamily 265
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.086492841Z condor_procd[120]: new subfamily registered: root = 265, watcher = 122
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.086502567Z condor_procd[120]: PROC_FAMILY_TRACK_FAMILY_VIA_ENVIRONMENT
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:18.092134926Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:18.093254516Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:18.093273672Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:18.093281422Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000042; send_time=0.000067; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:34385>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:18.096082449Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:18.096103285Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.501216684Z condor_startd[122]: slot1_3: Got universe "PARALLEL" (11) from request classad
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.501238712Z condor_startd[122]: slot1_3: State change: claim-activation protocol successful
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.501255218Z condor_startd[122]: slot1_3: Changing activity: Idle -> Busy
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.501512513Z condor_startd[122]: slot1_2: Got activate_claim request from shadow (10.42.0.102)
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:18.502789282Z condor_shadow[641]: Request to run on <10.42.0.117:38967?addrs=10.42.0.117-38967&alias=pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2.pseven-htcondor&noUDP&sock=startd_86_e1a3> <10.42.0.117:38967?addrs=10.42.0.117-38967&alias=pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2.pseven-htcondor&noUDP&sock=startd_86_e1a3> was ACCEPTED
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.503010332Z condor_startd[122]: slot1_2: Got activate_claim request from shadow (10.42.0.102)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.502789307Z condor_startd[122]: slot1_2: Remote job ID is 2.1
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:18.504624041Z condor_shadow[641]: Request to run on <10.42.0.119:43207?addrs=10.42.0.119-43207&alias=pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2.pseven-htcondor&noUDP&sock=startd_86_e1a3> <10.42.0.119:43207?addrs=10.42.0.119-43207&alias=pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2.pseven-htcondor&noUDP&sock=startd_86_e1a3> was ACCEPTED
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.504425631Z condor_startd[122]: slot1_2: Remote job ID is 2.2
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.506699979Z condor_procd[120]: PROC_FAMILY_REGISTER_SUBFAMILY
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.506715079Z condor_procd[120]: taking a snapshot...
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.506719582Z condor_procd[120]: ProcAPI: read 9 pid entries out of 72 total entries in /proc
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.506723050Z condor_procd[120]: process 191 (not in monitored family) has exited
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.506726039Z condor_procd[120]: method PARENT: found family 122 for process 265
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.506729167Z condor_procd[120]: method PARENT: found family 122 for process 265 (already determined)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.506732567Z condor_procd[120]: no methods have determined process 264 to be in a monitored family
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.506736425Z condor_procd[120]: ...snapshot complete
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.506739844Z condor_procd[120]: moving process 265 into new subfamily 265
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.506742998Z condor_procd[120]: new subfamily registered: root = 265, watcher = 122
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.506901621Z condor_procd[120]: PROC_FAMILY_TRACK_FAMILY_VIA_ENVIRONMENT
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.506349516Z condor_procd[120]: PROC_FAMILY_REGISTER_SUBFAMILY
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.506364511Z condor_procd[120]: taking a snapshot...
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.506369120Z condor_procd[120]: ProcAPI: read 10 pid entries out of 73 total entries in /proc
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.506372982Z condor_procd[120]: method PARENT: found family 122 for process 266
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.506375958Z condor_procd[120]: method PARENT: found family 122 for process 266 (already determined)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.506379463Z condor_procd[120]: ...snapshot complete
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.506383032Z condor_procd[120]: moving process 266 into new subfamily 266
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.506386621Z condor_procd[120]: new subfamily registered: root = 266, watcher = 122
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.506400019Z condor_procd[120]: PROC_FAMILY_TRACK_FAMILY_VIA_ENVIRONMENT
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.506760027Z condor_starter[265]: ******************************************************
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.506766712Z condor_starter[265]: ** condor_starter (CONDOR_STARTER) STARTING UP
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.506770633Z condor_starter[265]: ** /usr/sbin/condor_starter
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.506774436Z condor_starter[265]: ** SubsystemInfo: name=STARTER type=STARTER(8) class=DAEMON(1)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.506777658Z condor_starter[265]: ** Configuration: subsystem:STARTER local:slot_type_1 class:DAEMON
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.506780984Z condor_starter[265]: ** $CondorVersion: 8.9.11 Dec 29 2020 BuildID: Debian-8.9.11-1.2 PackageID: 8.9.11-1.2 Debian-8.9.11-1.2 $
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.506783937Z condor_starter[265]: ** $CondorPlatform: X86_64-Ubuntu_20.04 $
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.506786595Z condor_starter[265]: ** PID = 265
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.506789290Z condor_starter[265]: ** Log last touched time unavailable (Success)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.506792096Z condor_starter[265]: ******************************************************
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.506794813Z condor_starter[265]: Using config source: /etc/condor/condor_config
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.506798141Z condor_starter[265]: Using local config sources: 
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.506801255Z condor_starter[265]:    /etc/condor/condor_config.local
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.506804721Z condor_starter[265]: config Macros = 134, Sorted = 132, StringBytes = 4230, TablesBytes = 4872
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.506807591Z condor_starter[265]: CLASSAD_CACHING is OFF
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.506810362Z condor_starter[265]: Daemon Log is logging: D_ALWAYS D_ERROR D_STATS
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.506928840Z condor_starter[265]: SharedPortEndpoint: waiting for connections to named socket slot1_3_122_b5be_3
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.507909816Z condor_starter[265]: DaemonCore: command socket at <10.42.0.117:38967?addrs=10.42.0.117-38967&alias=pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2.pseven-htcondor&noUDP&sock=slot1_3_122_b5be_3>
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.507922585Z condor_starter[265]: DaemonCore: private command socket at <10.42.0.117:38967?addrs=10.42.0.117-38967&alias=pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2.pseven-htcondor&noUDP&sock=slot1_3_122_b5be_3>
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.921483150Z condor_startd[122]: slot1_2: Got universe "PARALLEL" (11) from request classad
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.921558663Z condor_startd[122]: slot1_2: State change: claim-activation protocol successful
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.921566243Z condor_startd[122]: slot1_2: Changing activity: Idle -> Busy
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.921688513Z condor_startd[122]: slot1_1: Got activate_claim request from shadow (10.42.0.102)
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:18.922603642Z condor_shadow[641]: Request to run on <10.42.0.117:38967?addrs=10.42.0.117-38967&alias=pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2.pseven-htcondor&noUDP&sock=startd_86_e1a3> <10.42.0.117:38967?addrs=10.42.0.117-38967&alias=pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2.pseven-htcondor&noUDP&sock=startd_86_e1a3> was ACCEPTED
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.922500705Z condor_startd[122]: slot1_1: Remote job ID is 2.3
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.924746620Z condor_procd[120]: PROC_FAMILY_REGISTER_SUBFAMILY
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.924761028Z condor_procd[120]: taking a snapshot...
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.925029412Z condor_procd[120]: ProcAPI: read 11 pid entries out of 74 total entries in /proc
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.925412582Z condor_procd[120]: method PARENT: found family 122 for process 267
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.925423982Z condor_procd[120]: method PARENT: found family 122 for process 267 (already determined)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.925428698Z condor_procd[120]: ...snapshot complete
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.925432141Z condor_procd[120]: moving process 267 into new subfamily 267
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.925435499Z condor_procd[120]: new subfamily registered: root = 267, watcher = 122
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:18.927822246Z condor_shadow[641]: Request to run on <10.42.0.119:43207?addrs=10.42.0.119-43207&alias=pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2.pseven-htcondor&noUDP&sock=startd_86_e1a3> <10.42.0.119:43207?addrs=10.42.0.119-43207&alias=pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2.pseven-htcondor&noUDP&sock=startd_86_e1a3> was ACCEPTED
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.926707368Z condor_startd[122]: slot1_2: Got universe "PARALLEL" (11) from request classad
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.926723371Z condor_startd[122]: slot1_2: State change: claim-activation protocol successful
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.926727781Z condor_startd[122]: slot1_2: Changing activity: Idle -> Busy
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.926731449Z condor_startd[122]: slot1_1: Got activate_claim request from shadow (10.42.0.102)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.927397285Z condor_startd[122]: slot1_1: Remote job ID is 2.4
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:18.930034711Z condor_shadow[641]: ERROR in ParallelShadow::updateFromStarter: no Node defined in update ad, can't process!
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.926524749Z condor_procd[120]: PROC_FAMILY_TRACK_FAMILY_VIA_ENVIRONMENT
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.926552653Z condor_starter[266]: ******************************************************
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.926558448Z condor_starter[266]: ** condor_starter (CONDOR_STARTER) STARTING UP
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.926562363Z condor_starter[266]: ** /usr/sbin/condor_starter
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.926565677Z condor_starter[266]: ** SubsystemInfo: name=STARTER type=STARTER(8) class=DAEMON(1)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.926569270Z condor_starter[266]: ** Configuration: subsystem:STARTER local:slot_type_1 class:DAEMON
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.926573382Z condor_starter[266]: ** $CondorVersion: 8.9.11 Dec 29 2020 BuildID: Debian-8.9.11-1.2 PackageID: 8.9.11-1.2 Debian-8.9.11-1.2 $
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.926577312Z condor_starter[266]: ** $CondorPlatform: X86_64-Ubuntu_20.04 $
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.926590664Z condor_starter[266]: ** PID = 266
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.926595777Z condor_starter[266]: ** Log last touched time unavailable (Success)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.926645734Z condor_starter[266]: ******************************************************
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.926662055Z condor_starter[266]: Using config source: /etc/condor/condor_config
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.926666847Z condor_starter[266]: Using local config sources: 
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.926674781Z condor_starter[266]:    /etc/condor/condor_config.local
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.926758301Z condor_starter[266]: config Macros = 134, Sorted = 132, StringBytes = 4230, TablesBytes = 4872
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.926763413Z condor_starter[266]: CLASSAD_CACHING is OFF
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.926766826Z condor_starter[266]: Daemon Log is logging: D_ALWAYS D_ERROR D_STATS
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.927074750Z condor_starter[266]: SharedPortEndpoint: waiting for connections to named socket slot1_2_122_b5be_4
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.927419185Z condor_starter[266]: DaemonCore: command socket at <10.42.0.117:38967?addrs=10.42.0.117-38967&alias=pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2.pseven-htcondor&noUDP&sock=slot1_2_122_b5be_4>
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.927426572Z condor_starter[266]: DaemonCore: private command socket at <10.42.0.117:38967?addrs=10.42.0.117-38967&alias=pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2.pseven-htcondor&noUDP&sock=slot1_2_122_b5be_4>
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.928557897Z condor_starter[265]: Communicating with shadow <10.42.0.102:34961?addrs=10.42.0.102-34961&alias=submit.pseven-htcondor&noUDP&sock=shadow_128_effe_2>
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.928579863Z condor_starter[265]: Submitting machine is "10.42.0.102"
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.929073377Z condor_starter[265]: setting the orig job name in starter
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.929086395Z condor_starter[265]: setting the orig job iwd in starter
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.930596198Z condor_starter[265]: Chirp config summary: IO false, Updates false, Delayed updates true.
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.930607369Z condor_starter[265]: Initialized IO Proxy.
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.930610926Z condor_starter[265]: Done setting resource limits
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.930613943Z condor_starter[265]: Job 2.0 set to execute immediately
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:18.934429697Z condor_shadow[641]: ERROR in ParallelShadow::updateFromStarter: no Node defined in update ad, can't process!
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.931802224Z condor_procd[120]: PROC_FAMILY_REGISTER_SUBFAMILY
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.931821592Z condor_procd[120]: taking a snapshot...
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.931826459Z condor_procd[120]: ProcAPI: read 10 pid entries out of 73 total entries in /proc
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.931830120Z condor_starter[265]: ******************************************************
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.931833491Z condor_starter[265]: ** condor_starter (CONDOR_STARTER) STARTING UP
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.931836698Z condor_starter[265]: ** /usr/sbin/condor_starter
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.931840234Z condor_procd[120]: method PARENT: found family 122 for process 266
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.931843458Z condor_starter[265]: ** SubsystemInfo: name=STARTER type=STARTER(8) class=DAEMON(1)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.931846603Z condor_procd[120]: method PARENT: found family 122 for process 266 (already determined)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.931849765Z condor_procd[120]: ...snapshot complete
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.931852815Z condor_starter[265]: ** Configuration: subsystem:STARTER local:slot_type_1 class:DAEMON
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.931864490Z condor_procd[120]: moving process 266 into new subfamily 266
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.931868566Z condor_procd[120]: new subfamily registered: root = 266, watcher = 122
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.931871889Z condor_starter[265]: ** $CondorVersion: 8.9.11 Dec 29 2020 BuildID: Debian-8.9.11-1.2 PackageID: 8.9.11-1.2 Debian-8.9.11-1.2 $
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.931875210Z condor_starter[265]: ** $CondorPlatform: X86_64-Ubuntu_20.04 $
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.931878313Z condor_starter[265]: ** PID = 265
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.931881307Z condor_starter[265]: ** Log last touched time unavailable (Success)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.931884406Z condor_starter[265]: ******************************************************
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.931887529Z condor_starter[265]: Using config source: /etc/condor/condor_config
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.931890632Z condor_starter[265]: Using local config sources: 
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.931893686Z condor_starter[265]:    /etc/condor/condor_config.local
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.931896829Z condor_starter[265]: config Macros = 134, Sorted = 132, StringBytes = 4230, TablesBytes = 4872
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.933512423Z condor_starter[265]: CLASSAD_CACHING is OFF
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.933521622Z condor_starter[265]: Daemon Log is logging: D_ALWAYS D_ERROR D_STATS
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.933524458Z condor_procd[120]: PROC_FAMILY_TRACK_FAMILY_VIA_ENVIRONMENT
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.933526780Z condor_starter[265]: SharedPortEndpoint: waiting for connections to named socket slot1_2_122_b5be_3
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.934092264Z condor_starter[265]: DaemonCore: command socket at <10.42.0.119:43207?addrs=10.42.0.119-43207&alias=pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2.pseven-htcondor&noUDP&sock=slot1_2_122_b5be_3>
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.934161543Z condor_starter[265]: DaemonCore: private command socket at <10.42.0.119:43207?addrs=10.42.0.119-43207&alias=pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2.pseven-htcondor&noUDP&sock=slot1_2_122_b5be_3>
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.938804091Z condor_starter[265]: Communicating with shadow <10.42.0.102:34961?addrs=10.42.0.102-34961&alias=submit.pseven-htcondor&noUDP&sock=shadow_128_effe_2>
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.938820950Z condor_starter[265]: Submitting machine is "10.42.0.102"
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.938825828Z condor_starter[265]: setting the orig job name in starter
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.938829251Z condor_starter[265]: setting the orig job iwd in starter
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.938832705Z condor_starter[265]: Chirp config summary: IO false, Updates false, Delayed updates true.
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.938835884Z condor_starter[265]: Initialized IO Proxy.
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.943175236Z condor_starter[265]: Done setting resource limits
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.943190455Z condor_starter[265]: Job 2.2 set to execute immediately
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.943194358Z condor_starter[265]: Starting a PARALLEL universe job with ID: 2.2
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.943197369Z condor_starter[265]: IWD: /Users/1/occupy-cluster-4x500.p7wf/@Runs/#00000009.p7run/.p7run.tmp
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.943209405Z condor_starter[265]: Output file: /Users/1/occupy-cluster-4x500.p7wf/@Runs/#00000009.p7run/.p7run.tmp/fb80970a6ae6497bbb4361a70c187e32.9648f1ea85bd410a89cbfa399fa8e616.stdout
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.943213249Z condor_starter[265]: Error file: /Users/1/occupy-cluster-4x500.p7wf/@Runs/#00000009.p7run/.p7run.tmp/fb80970a6ae6497bbb4361a70c187e32.9648f1ea85bd410a89cbfa399fa8e616.stderr
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.943291718Z condor_starter[265]: Renice expr "0" evaluated to 0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.943298295Z condor_starter[265]: Running job as user user20001
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.943301792Z condor_starter[265]: Using wrapper /etc/condor/condor_job_wrapper to exec /Users/1/occupy-cluster-4x500.p7wf/@Runs/#00000009.p7run/.p7run.tmp/.start-f996b22433e14c64abbeee2fe5977b87.LINUX.sh.bat ./.start-block.LINUX.sh.bat .start-protoblock-9993a284a9e7411987858ed56ae49f6e-7.LINUX.sh.bat
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.943305887Z condor_procd[120]: PROC_FAMILY_REGISTER_SUBFAMILY
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.943309128Z condor_procd[120]: taking a snapshot...
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.943312068Z condor_procd[120]: ProcAPI: read 11 pid entries out of 74 total entries in /proc
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.943315171Z condor_procd[120]: method PARENT: found family 265 for process 267
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.943318368Z condor_procd[120]: method PARENT: found family 265 for process 267 (already determined)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.943321344Z condor_procd[120]: ...snapshot complete
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.943324331Z condor_procd[120]: moving process 267 into new subfamily 267
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.943327596Z condor_procd[120]: new subfamily registered: root = 267, watcher = 265
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:18.943330887Z condor_procd[120]: PROC_FAMILY_TRACK_FAMILY_VIA_ENVIRONMENT
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:18.938861307Z condor_shadow[641]: ERROR in ParallelShadow::updateFromStarter: no Node defined in update ad, can't process!
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.933000445Z condor_starter[266]: Communicating with shadow <10.42.0.102:34961?addrs=10.42.0.102-34961&alias=submit.pseven-htcondor&noUDP&sock=shadow_128_effe_2>
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.933020099Z condor_starter[266]: Submitting machine is "10.42.0.102"
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.933029875Z condor_starter[265]: Starting a PARALLEL universe job with ID: 2.0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.933034078Z condor_starter[265]: IWD: /Users/1/occupy-cluster-4x500.p7wf/@Runs/#00000009.p7run/.p7run.tmp
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.933037494Z condor_starter[266]: setting the orig job name in starter
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.933040524Z condor_starter[266]: setting the orig job iwd in starter
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.933043230Z condor_starter[265]: Output file: /Users/1/occupy-cluster-4x500.p7wf/@Runs/#00000009.p7run/.p7run.tmp/runner.stdout
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.935196055Z condor_starter[265]: Error file: /Users/1/occupy-cluster-4x500.p7wf/@Runs/#00000009.p7run/.p7run.tmp/runner.stderr
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.935213341Z condor_starter[265]: Renice expr "0" evaluated to 0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.935218185Z condor_starter[265]: Running job as user user20001
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.935222083Z condor_starter[265]: Using wrapper /etc/condor/condor_job_wrapper to exec /Users/1/occupy-cluster-4x500.p7wf/@Runs/#00000009.p7run/.p7run.tmp/.start-f996b22433e14c64abbeee2fe5977b87.LINUX.sh.bat ./.start-runner.sh
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.935226165Z condor_starter[266]: Chirp config summary: IO false, Updates false, Delayed updates true.
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.935229913Z condor_starter[266]: Initialized IO Proxy.
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.935233650Z condor_starter[266]: Done setting resource limits
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.935237515Z condor_starter[266]: Job 2.1 set to execute immediately
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.935241364Z condor_procd[120]: PROC_FAMILY_REGISTER_SUBFAMILY
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.935244956Z condor_procd[120]: taking a snapshot...
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.935249170Z condor_procd[120]: ProcAPI: read 12 pid entries out of 75 total entries in /proc
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.935252888Z condor_starter[266]: Starting a PARALLEL universe job with ID: 2.1
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.935256358Z condor_starter[266]: IWD: /Users/1/occupy-cluster-4x500.p7wf/@Runs/#00000009.p7run/.p7run.tmp
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.935260227Z condor_procd[120]: method PARENT: found family 265 for process 268
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.935272017Z condor_starter[266]: Output file: /Users/1/occupy-cluster-4x500.p7wf/@Runs/#00000009.p7run/.p7run.tmp/1d1385dc702e4928baf3801fa6249b85.aaf943c1d776463982c0f325b64c7a80.stdout
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.935276411Z condor_procd[120]: method PARENT: found family 265 for process 268 (already determined)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.935279837Z condor_procd[120]: ...snapshot complete
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.935283762Z condor_procd[120]: moving process 268 into new subfamily 268
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.935287566Z condor_procd[120]: new subfamily registered: root = 268, watcher = 265
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.935291114Z condor_starter[266]: Error file: /Users/1/occupy-cluster-4x500.p7wf/@Runs/#00000009.p7run/.p7run.tmp/1d1385dc702e4928baf3801fa6249b85.aaf943c1d776463982c0f325b64c7a80.stderr
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.937595013Z condor_starter[266]: Renice expr "0" evaluated to 0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.937623039Z condor_starter[266]: Running job as user user20001
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.937629748Z condor_starter[266]: Using wrapper /etc/condor/condor_job_wrapper to exec /Users/1/occupy-cluster-4x500.p7wf/@Runs/#00000009.p7run/.p7run.tmp/.start-f996b22433e14c64abbeee2fe5977b87.LINUX.sh.bat ./.start-block.LINUX.sh.bat .start-protoblock-9993a284a9e7411987858ed56ae49f6e-7.LINUX.sh.bat
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.937636480Z condor_procd[120]: PROC_FAMILY_TRACK_FAMILY_VIA_ENVIRONMENT
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.939638079Z condor_procd[120]: PROC_FAMILY_REGISTER_SUBFAMILY
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.939653075Z condor_procd[120]: taking a snapshot...
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.939657537Z condor_procd[120]: ProcAPI: read 13 pid entries out of 76 total entries in /proc
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.939661111Z condor_procd[120]: method PARENT: found family 266 for process 269
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.939664037Z condor_procd[120]: method PARENT: found family 266 for process 269 (already determined)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.939666944Z condor_procd[120]: ...snapshot complete
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.939669885Z condor_procd[120]: moving process 269 into new subfamily 269
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.939672973Z condor_procd[120]: new subfamily registered: root = 269, watcher = 266
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:18.939676173Z condor_procd[120]: PROC_FAMILY_TRACK_FAMILY_VIA_ENVIRONMENT
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:19.110280739Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:19.110792638Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:19.111760047Z condor_collector[50]: QueryWorker: forked new worker with id 339 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:19.112312652Z condor_collector[339]: (Sending 9 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:19.112800457Z condor_collector[339]: Query info: matched=9; skipped=0; query_time=0.000471; send_time=0.000452; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:34989>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:19.127164234Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:19.133416387Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:19.133444846Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:19.133452484Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000047; send_time=0.000067; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:36595>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:19.132137410Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:19.145651477Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.347139334Z condor_startd[122]: slot1_1: Got universe "PARALLEL" (11) from request classad
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.347249134Z condor_startd[122]: slot1_1: State change: claim-activation protocol successful
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.347283401Z condor_startd[122]: slot1_1: Changing activity: Idle -> Busy
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.349260796Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.349288248Z condor_procd[120]: gathering usage data for family with root pid 267
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.350170811Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.350189995Z condor_procd[120]: gathering usage data for family with root pid 266
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.350196271Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.350234921Z condor_procd[120]: gathering usage data for family with root pid 265
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.352452843Z condor_starter[267]: ******************************************************
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.352546610Z condor_starter[267]: ** condor_starter (CONDOR_STARTER) STARTING UP
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.352584132Z condor_starter[267]: ** /usr/sbin/condor_starter
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.352599763Z condor_starter[267]: ** SubsystemInfo: name=STARTER type=STARTER(8) class=DAEMON(1)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.352606320Z condor_starter[267]: ** Configuration: subsystem:STARTER local:slot_type_1 class:DAEMON
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.352611715Z condor_starter[267]: ** $CondorVersion: 8.9.11 Dec 29 2020 BuildID: Debian-8.9.11-1.2 PackageID: 8.9.11-1.2 Debian-8.9.11-1.2 $
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.352617467Z condor_starter[267]: ** $CondorPlatform: X86_64-Ubuntu_20.04 $
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.352787081Z condor_starter[267]: ** PID = 267
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.352797463Z condor_starter[267]: ** Log last touched time unavailable (Success)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.354996192Z condor_starter[267]: ******************************************************
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.355014198Z condor_starter[267]: Using config source: /etc/condor/condor_config
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.355020368Z condor_starter[267]: Using local config sources: 
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.355025982Z condor_starter[267]:    /etc/condor/condor_config.local
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.355030921Z condor_starter[267]: config Macros = 134, Sorted = 132, StringBytes = 4230, TablesBytes = 4872
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.355036425Z condor_starter[267]: CLASSAD_CACHING is OFF
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.355041386Z condor_starter[267]: Daemon Log is logging: D_ALWAYS D_ERROR D_STATS
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.355046557Z condor_starter[267]: SharedPortEndpoint: waiting for connections to named socket slot1_1_122_b5be_5
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.355051971Z condor_starter[267]: DaemonCore: command socket at <10.42.0.117:38967?addrs=10.42.0.117-38967&alias=pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2.pseven-htcondor&noUDP&sock=slot1_1_122_b5be_5>
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.355058874Z condor_starter[267]: DaemonCore: private command socket at <10.42.0.117:38967?addrs=10.42.0.117-38967&alias=pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2.pseven-htcondor&noUDP&sock=slot1_1_122_b5be_5>
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.356607801Z condor_starter[267]: Communicating with shadow <10.42.0.102:34961?addrs=10.42.0.102-34961&alias=submit.pseven-htcondor&noUDP&sock=shadow_128_effe_2>
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.356628966Z condor_starter[267]: Submitting machine is "10.42.0.102"
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.356922135Z condor_starter[267]: setting the orig job name in starter
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.356960157Z condor_starter[267]: setting the orig job iwd in starter
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:19.363672014Z condor_shadow[641]: ERROR in ParallelShadow::updateFromStarter: no Node defined in update ad, can't process!
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.363724711Z condor_starter[267]: Chirp config summary: IO false, Updates false, Delayed updates true.
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.363738066Z condor_starter[267]: Initialized IO Proxy.
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.363742643Z condor_starter[267]: Done setting resource limits
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.363754206Z condor_starter[267]: Job 2.3 set to execute immediately
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.363756828Z condor_starter[267]: Starting a PARALLEL universe job with ID: 2.3
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.363759017Z condor_starter[267]: IWD: /Users/1/occupy-cluster-4x500.p7wf/@Runs/#00000009.p7run/.p7run.tmp
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.363761284Z condor_starter[267]: Output file: /Users/1/occupy-cluster-4x500.p7wf/@Runs/#00000009.p7run/.p7run.tmp/915d9debbdc74aa89aca0eff1a7ebf82.54941a123a83469e83a91ab086478aa3.stdout
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.363773925Z condor_starter[267]: Error file: /Users/1/occupy-cluster-4x500.p7wf/@Runs/#00000009.p7run/.p7run.tmp/915d9debbdc74aa89aca0eff1a7ebf82.54941a123a83469e83a91ab086478aa3.stderr
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.363776428Z condor_starter[267]: Renice expr "0" evaluated to 0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.363778681Z condor_starter[267]: Running job as user user20001
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.363780865Z condor_starter[267]: Using wrapper /etc/condor/condor_job_wrapper to exec /Users/1/occupy-cluster-4x500.p7wf/@Runs/#00000009.p7run/.p7run.tmp/.start-f996b22433e14c64abbeee2fe5977b87.LINUX.sh.bat ./.start-block.LINUX.sh.bat .start-protoblock-9993a284a9e7411987858ed56ae49f6e-7.LINUX.sh.bat
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.363783435Z condor_procd[120]: PROC_FAMILY_REGISTER_SUBFAMILY
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.363785524Z condor_procd[120]: taking a snapshot...
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.363787603Z condor_procd[120]: ProcAPI: read 14 pid entries out of 77 total entries in /proc
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.363789773Z condor_procd[120]: method PARENT: found family 267 for process 270
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.363792981Z condor_procd[120]: method PARENT: found family 267 for process 270 (already determined)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.363796601Z condor_procd[120]: ...snapshot complete
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.363798741Z condor_procd[120]: moving process 270 into new subfamily 270
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.363801166Z condor_procd[120]: new subfamily registered: root = 270, watcher = 267
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.363803345Z condor_procd[120]: PROC_FAMILY_TRACK_FAMILY_VIA_ENVIRONMENT
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.370565053Z condor_starter[265]: Create_Process succeeded, pid=268
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.379422399Z condor_starter[265]: Create_Process succeeded, pid=267
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.509366729Z condor_startd[122]: slot1_1: Got universe "PARALLEL" (11) from request classad
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.509503573Z condor_startd[122]: slot1_1: State change: claim-activation protocol successful
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.509549449Z condor_startd[122]: slot1_1: Changing activity: Idle -> Busy
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.511227545Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.511258009Z condor_procd[120]: gathering usage data for family with root pid 266
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.511264338Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.511269589Z condor_procd[120]: gathering usage data for family with root pid 265
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.515308165Z condor_starter[266]: ******************************************************
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.515337866Z condor_starter[266]: ** condor_starter (CONDOR_STARTER) STARTING UP
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.515344402Z condor_starter[266]: ** /usr/sbin/condor_starter
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.515349706Z condor_starter[266]: ** SubsystemInfo: name=STARTER type=STARTER(8) class=DAEMON(1)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.515355190Z condor_starter[266]: ** Configuration: subsystem:STARTER local:slot_type_1 class:DAEMON
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.515379777Z condor_starter[266]: ** $CondorVersion: 8.9.11 Dec 29 2020 BuildID: Debian-8.9.11-1.2 PackageID: 8.9.11-1.2 Debian-8.9.11-1.2 $
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.515385648Z condor_starter[266]: ** $CondorPlatform: X86_64-Ubuntu_20.04 $
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.515390753Z condor_starter[266]: ** PID = 266
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.515395663Z condor_starter[266]: ** Log last touched time unavailable (Success)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.515400853Z condor_starter[266]: ******************************************************
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.515406337Z condor_starter[266]: Using config source: /etc/condor/condor_config
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.515411439Z condor_starter[266]: Using local config sources: 
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.515416594Z condor_starter[266]:    /etc/condor/condor_config.local
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.515421474Z condor_starter[266]: config Macros = 134, Sorted = 132, StringBytes = 4230, TablesBytes = 4872
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.515426783Z condor_starter[266]: CLASSAD_CACHING is OFF
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.515445835Z condor_starter[266]: Daemon Log is logging: D_ALWAYS D_ERROR D_STATS
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:19.520679694Z condor_shadow[641]: ERROR in ParallelShadow::updateFromStarter: no Node defined in update ad, can't process!
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.516823524Z condor_starter[266]: SharedPortEndpoint: waiting for connections to named socket slot1_1_122_b5be_4
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.516841726Z condor_starter[266]: DaemonCore: command socket at <10.42.0.119:43207?addrs=10.42.0.119-43207&alias=pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2.pseven-htcondor&noUDP&sock=slot1_1_122_b5be_4>
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.516847498Z condor_starter[266]: DaemonCore: private command socket at <10.42.0.119:43207?addrs=10.42.0.119-43207&alias=pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2.pseven-htcondor&noUDP&sock=slot1_1_122_b5be_4>
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.519094825Z condor_starter[266]: Communicating with shadow <10.42.0.102:34961?addrs=10.42.0.102-34961&alias=submit.pseven-htcondor&noUDP&sock=shadow_128_effe_2>
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.519116692Z condor_starter[266]: Submitting machine is "10.42.0.102"
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.519698624Z condor_starter[266]: setting the orig job name in starter
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.519724347Z condor_starter[266]: setting the orig job iwd in starter
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.521448816Z condor_starter[266]: Chirp config summary: IO false, Updates false, Delayed updates true.
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.521473243Z condor_starter[266]: Initialized IO Proxy.
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.521479583Z condor_starter[266]: Done setting resource limits
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.521484584Z condor_starter[266]: Job 2.4 set to execute immediately
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.523847952Z condor_starter[266]: Starting a PARALLEL universe job with ID: 2.4
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.523869864Z condor_starter[266]: IWD: /Users/1/occupy-cluster-4x500.p7wf/@Runs/#00000009.p7run/.p7run.tmp
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.523875485Z condor_starter[266]: Output file: /Users/1/occupy-cluster-4x500.p7wf/@Runs/#00000009.p7run/.p7run.tmp/f7d1e97f96414dc3862ffa5b5a1fe4e3.1d93af4f5e2a4877ad3fe8644ff0a19b.stdout
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.523879637Z condor_starter[266]: Error file: /Users/1/occupy-cluster-4x500.p7wf/@Runs/#00000009.p7run/.p7run.tmp/f7d1e97f96414dc3862ffa5b5a1fe4e3.1d93af4f5e2a4877ad3fe8644ff0a19b.stderr
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.532807064Z condor_starter[266]: Renice expr "0" evaluated to 0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.532833339Z condor_starter[266]: Running job as user user20001
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.532885975Z condor_starter[266]: Using wrapper /etc/condor/condor_job_wrapper to exec /Users/1/occupy-cluster-4x500.p7wf/@Runs/#00000009.p7run/.p7run.tmp/.start-f996b22433e14c64abbeee2fe5977b87.LINUX.sh.bat ./.start-block.LINUX.sh.bat .start-protoblock-9993a284a9e7411987858ed56ae49f6e-7.LINUX.sh.bat
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.535654583Z condor_procd[120]: PROC_FAMILY_REGISTER_SUBFAMILY
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.535693927Z condor_procd[120]: taking a snapshot...
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.535701281Z condor_procd[120]: ProcAPI: read 13 pid entries out of 76 total entries in /proc
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.535706836Z condor_procd[120]: method PARENT: found family 267 for process 281
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.535712174Z condor_procd[120]: method PARENT: found family 266 for process 282
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.535717479Z condor_procd[120]: method PARENT: found family 267 for process 281 (already determined)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.535723259Z condor_procd[120]: method PARENT: found family 266 for process 282 (already determined)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.535728652Z condor_procd[120]: ...snapshot complete
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.535733863Z condor_procd[120]: moving process 282 into new subfamily 282
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.535739072Z condor_procd[120]: new subfamily registered: root = 282, watcher = 266
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:19.536546701Z condor_procd[120]: PROC_FAMILY_TRACK_FAMILY_VIA_ENVIRONMENT
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.560412050Z condor_starter[266]: Create_Process succeeded, pid=269
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:19.869845637Z condor_starter[267]: Create_Process succeeded, pid=270
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:20.003824857Z condor_starter[266]: Create_Process succeeded, pid=282
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:20.149403856Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:20.171978847Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:20.172060880Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:20.150359029Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:20.155618291Z condor_collector[340]: (Sending 9 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:20.155973000Z condor_collector[50]: QueryWorker: forked new worker with id 340 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:20.155987278Z condor_collector[340]: Query info: matched=9; skipped=0; query_time=0.000385; send_time=0.000339; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:44803>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:20.191893860Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:20.191923878Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:20.191931406Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:20.191934895Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000030; send_time=0.000031; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:43045>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:20.375179337Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:20.375215279Z condor_procd[120]: gathering usage data for family with root pid 267
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:20.375220925Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:20.375224338Z condor_procd[120]: gathering usage data for family with root pid 266
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:20.375227807Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:20.375230907Z condor_procd[120]: gathering usage data for family with root pid 265
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:20.427225623Z condor_schedd[128]: Received a superuser command
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:20.427269657Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:20.427275987Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:20.525717092Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:20.525752569Z condor_procd[120]: gathering usage data for family with root pid 266
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:20.531496565Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:20.531519951Z condor_procd[120]: gathering usage data for family with root pid 265
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:21.224659620Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:21.228185526Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:21.228210915Z condor_collector[341]: (Sending 9 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:21.228263164Z condor_collector[50]: QueryWorker: forked new worker with id 341 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:21.228268895Z condor_collector[341]: Query info: matched=9; skipped=0; query_time=0.000321; send_time=0.000274; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:33943>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:21.241221730Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:21.241498504Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:21.242670082Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:21.242691325Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000036; send_time=0.000044; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:36051>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:21.244408067Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:21.247208039Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:21.363042682Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:21.363074246Z condor_procd[120]: gathering usage data for family with root pid 267
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:21.372268978Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:21.372299439Z condor_procd[120]: gathering usage data for family with root pid 266
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:21.390180434Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:21.390212871Z condor_procd[120]: gathering usage data for family with root pid 265
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:21.534922283Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:21.534952695Z condor_procd[120]: gathering usage data for family with root pid 266
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:21.535845142Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:21.535869839Z condor_procd[120]: gathering usage data for family with root pid 265
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:22.269735343Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:22.269994940Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:22.270634364Z condor_collector[342]: (Sending 9 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:22.270646756Z condor_collector[50]: QueryWorker: forked new worker with id 342 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:22.270892684Z condor_collector[342]: Query info: matched=9; skipped=0; query_time=0.000304; send_time=0.000276; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:39659>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:22.278290620Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:22.279813156Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:22.278535037Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:22.279186270Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:22.279201729Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000044; send_time=0.000045; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:34641>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:22.280308505Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:22.385491158Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:22.385530191Z condor_procd[120]: gathering usage data for family with root pid 267
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:22.385678316Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:22.385691784Z condor_procd[120]: gathering usage data for family with root pid 266
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:22.385869262Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:22.385880807Z condor_procd[120]: gathering usage data for family with root pid 265
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:22.537575346Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:22.537596932Z condor_procd[120]: gathering usage data for family with root pid 266
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:22.537666879Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:22.537674494Z condor_procd[120]: gathering usage data for family with root pid 265
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:23.290684857Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:23.291066331Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:23.291964053Z condor_collector[50]: QueryWorker: forked new worker with id 343 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:23.291985243Z condor_collector[343]: (Sending 9 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:23.292264528Z condor_collector[343]: Query info: matched=9; skipped=0; query_time=0.000411; send_time=0.000308; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:46533>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:23.301257618Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:23.302838668Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:23.302185624Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:23.302205575Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:23.302351393Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000037; send_time=0.000049; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:36063>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:23.303375101Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:23.387751592Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:23.387776402Z condor_procd[120]: gathering usage data for family with root pid 267
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:23.387927595Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:23.387942033Z condor_procd[120]: gathering usage data for family with root pid 266
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:23.388084352Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:23.388096034Z condor_procd[120]: gathering usage data for family with root pid 265
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:23.539950623Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:23.539989108Z condor_procd[120]: gathering usage data for family with root pid 266
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:23.540146019Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:23.540159910Z condor_procd[120]: gathering usage data for family with root pid 265
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:23.623649223Z condor_procd[126]: taking a snapshot...
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:23.623672794Z condor_procd[126]: ProcAPI: read 11 pid entries out of 74 total entries in /proc
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:23.623677300Z condor_procd[126]: ProcAPI: new boottime = 1621268011; old_boottime = 1621268011; /proc/stat boottime = 1621268011; /proc/uptime boottime = 1621268011
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:23.624396063Z condor_procd[126]: process 255 (not in monitored family) has exited
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:23.624408127Z condor_procd[126]: method PARENT: found family 128 for process 641
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:23.624418650Z condor_procd[126]: method PARENT: found family 128 for process 641 (already determined)
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:23.624422468Z condor_procd[126]: no methods have determined process 531 to be in a monitored family
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:23.624425967Z condor_procd[126]: ...snapshot complete
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:24.051953193Z condor_negotiator[51]: ---------- Started Negotiation Cycle ----------
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:24.052001614Z condor_negotiator[51]: Phase 1:  Obtaining ads from collector ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:24.052006453Z condor_negotiator[51]:   Getting startd private ads ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:24.052639088Z condor_collector[50]: Got QUERY_STARTD_PVT_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:24.053216487Z condor_collector[50]: QueryWorker: forked new high priority worker with id 344 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:24.053234411Z condor_collector[344]: (Sending 9 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:24.053573749Z condor_collector[344]: Query info: matched=9; skipped=0; query_time=0.000309; send_time=0.000300; type=MachinePrivate; requirements={true}; locate=0; limit=0; from=COLLECTOR; peer=<10.42.0.115:46011>; projection={}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:24.053738504Z condor_negotiator[51]:   Getting Scheduler, Submitter and Machine ads ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:24.054910919Z condor_collector[50]: QueryWorker: forked new high priority worker with id 345 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:24.054925185Z condor_collector[345]: (Sending 11 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:24.055914246Z condor_collector[345]: Query info: matched=11; skipped=10; query_time=0.000241; send_time=0.001231; type=Any; requirements={(((MyType == "Submitter")) || ((MyType == "Machine")))}; locate=0; limit=0; from=COLLECTOR; peer=<10.42.0.115:40649>; projection={}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:24.057737717Z condor_negotiator[51]:   Sorting 11 ads ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:24.057901004Z condor_negotiator[51]: Got ads: 11 public and 9 private
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:24.057990346Z condor_negotiator[51]: Public ads include 0 submitter, 9 startd
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:24.057997617Z condor_negotiator[51]: Phase 2:  Performing accounting ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:24.061835825Z condor_negotiator[51]: Phase 3:  Sorting submitter ads by priority ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:24.061853604Z condor_negotiator[51]: Starting prefetch round; 0 potential prefetches to do.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:24.061859397Z condor_negotiator[51]: Prefetch summary: 0 attempted, 0 successful.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:24.061862714Z condor_negotiator[51]: Phase 4.1:  Negotiating with schedds ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:24.061874930Z condor_negotiator[51]:  negotiateWithGroup resources used submitterAds length 0 
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:24.061890645Z condor_negotiator[51]: ---------- Finished Negotiation Cycle ----------
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:24.303447078Z condor_shared_port[127]: About to update statistics in shared_port daemon ad file at /var/lock/condor/shared_port_ad :
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:24.303469657Z condor_shared_port[127]: ForkedChildrenCurrent = 0#012ForkedChildrenPeak = 0#012MyAddress = "<10.42.0.102:34961?addrs=10.42.0.102-34961&alias=submit.pseven-htcondor&noUDP>"#012RequestsBlocked = 0#012RequestsFailed = 0#012RequestsPendingCurrent = 0#012RequestsPendingPeak = 2#012RequestsSucceeded = 65#012SharedPortCommandSinfuls = "<10.42.0.102:34961?alias=submit.pseven-htcondor>"
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:24.311660938Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:24.311913915Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:24.313080984Z condor_collector[346]: (Sending 9 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:24.313116249Z condor_collector[50]: QueryWorker: forked new worker with id 346 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:24.313121305Z condor_collector[346]: Query info: matched=9; skipped=0; query_time=0.000303; send_time=0.000269; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:41309>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:24.320354594Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:24.321755625Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:24.320583565Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:24.320621316Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:24.320669576Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000043; send_time=0.000051; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:45303>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:24.322223343Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:24.390401225Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:24.390423633Z condor_procd[120]: gathering usage data for family with root pid 267
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:24.390597320Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:24.390637358Z condor_procd[120]: gathering usage data for family with root pid 266
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:24.390890935Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:24.390898949Z condor_procd[120]: gathering usage data for family with root pid 265
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:24.542314781Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:24.542336478Z condor_procd[120]: gathering usage data for family with root pid 266
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:24.542378742Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:24.542390835Z condor_procd[120]: gathering usage data for family with root pid 265
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:25.331193550Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:25.331542882Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:25.332120479Z condor_collector[50]: QueryWorker: forked new worker with id 347 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:25.332137390Z condor_collector[347]: (Sending 9 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:25.332352228Z condor_collector[347]: Query info: matched=9; skipped=0; query_time=0.000339; send_time=0.000222; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:34925>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:25.340026446Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:25.341403146Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:25.340240374Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:25.340260533Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:25.340377153Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000047; send_time=0.000042; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:33661>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:25.342194947Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:25.393028640Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:25.393047462Z condor_procd[120]: gathering usage data for family with root pid 267
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:25.393123612Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:25.393134008Z condor_procd[120]: gathering usage data for family with root pid 266
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:25.393294707Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:25.393301922Z condor_procd[120]: gathering usage data for family with root pid 265
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:25.543847818Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:25.543883886Z condor_procd[120]: gathering usage data for family with root pid 266
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:25.543991530Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:25.544004540Z condor_procd[120]: gathering usage data for family with root pid 265
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:26.319625326Z condor_procd[48]: taking a snapshot...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:26.319654419Z condor_procd[48]: ProcAPI: read 9 pid entries out of 72 total entries in /proc
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:26.319671191Z condor_procd[48]: ProcAPI: new boottime = 1621268011; old_boottime = 1621268011; /proc/stat boottime = 1621268011; /proc/uptime boottime = 1621268011
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:26.320133844Z condor_procd[48]: process 190 (not in monitored family) has exited
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:26.320301212Z condor_procd[48]: no methods have determined process 294 to be in a monitored family
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:26.320314485Z condor_procd[48]: ...snapshot complete
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:26.351180510Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:26.351436524Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:26.352085734Z condor_collector[348]: (Sending 9 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:26.352104308Z condor_collector[50]: QueryWorker: forked new worker with id 348 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:26.352412322Z condor_collector[348]: Query info: matched=9; skipped=0; query_time=0.000370; send_time=0.000272; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:35583>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:26.360842759Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:26.362440025Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:26.361480966Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:26.361496089Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:26.361501435Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000031; send_time=0.000071; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:32973>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:26.363246344Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:26.395575850Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:26.395649979Z condor_procd[120]: gathering usage data for family with root pid 267
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:26.395712217Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:26.395722039Z condor_procd[120]: gathering usage data for family with root pid 266
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:26.396723293Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:26.396755526Z condor_procd[120]: gathering usage data for family with root pid 265
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:26.545272617Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:26.545298013Z condor_procd[120]: gathering usage data for family with root pid 266
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:26.545423563Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:26.545439827Z condor_procd[120]: gathering usage data for family with root pid 265
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:27.361389017Z condor_shared_port[49]: About to update statistics in shared_port daemon ad file at /var/lock/condor/shared_port_ad :
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:27.361427553Z condor_shared_port[49]: ForkedChildrenCurrent = 0#012ForkedChildrenPeak = 0#012MyAddress = "<10.42.0.115:19618?addrs=10.42.0.115-19618&alias=pseven-htcondormanager-deploy-7847df57d8-jc9fp.pseven-htcondor&noUDP>"#012RequestsBlocked = 0#012RequestsFailed = 0#012RequestsPendingCurrent = 0#012RequestsPendingPeak = 3#012RequestsSucceeded = 381#012SharedPortCommandSinfuls = "<10.42.0.115:19618?alias=pseven-htcondormanager-deploy-7847df57d8-jc9fp.pseven-htcondor>"
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:27.373310201Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:27.373603002Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:27.374692832Z condor_collector[349]: (Sending 9 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:27.374715710Z condor_collector[50]: QueryWorker: forked new worker with id 349 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:27.374720773Z condor_collector[349]: Query info: matched=9; skipped=0; query_time=0.000361; send_time=0.000274; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:43899>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:27.382350264Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:27.384043407Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:27.382769040Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:27.382786324Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:27.383238626Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000055; send_time=0.000048; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:39079>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:27.384733974Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.387815405Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.387837433Z condor_procd[120]: gathering usage data for family with root pid 267
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.388439648Z condor_starter[265]: Failed to open '.update.ad' to read update ad: No such file or directory (2).
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.389365735Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.389385995Z condor_procd[120]: gathering usage data for family with root pid 267
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.389390896Z condor_starter[265]: Failed to open '.update.ad' to read update ad: No such file or directory (2).
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.389394404Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.389409791Z condor_procd[120]: gathering usage data for family with root pid 266
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.391760287Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.390075539Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.391780864Z condor_procd[120]: gathering usage data for family with root pid 268
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.390086485Z condor_procd[120]: gathering usage data for family with root pid 265
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.392304385Z condor_starter[265]: Failed to open '.update.ad' to read update ad: No such file or directory (2).
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.393472519Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.393493676Z condor_procd[120]: gathering usage data for family with root pid 268
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.393498705Z condor_starter[265]: Failed to open '.update.ad' to read update ad: No such file or directory (2).
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.393502420Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.393505635Z condor_procd[120]: gathering usage data for family with root pid 267
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.393508552Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.393511220Z condor_procd[120]: gathering usage data for family with root pid 266
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.393513972Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.393516840Z condor_procd[120]: gathering usage data for family with root pid 265
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.635229304Z condor_starter[266]: Process exited, pid=282, status=0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.635261076Z condor_starter[266]: Failed to write ToE tag to .job.ad file (13): Permission denied
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.637790506Z condor_starter[267]: Process exited, pid=270, status=0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.637834202Z condor_starter[267]: Failed to write ToE tag to .job.ad file (13): Permission denied
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.635729880Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.635747405Z condor_procd[120]: gathering usage data for family with root pid 282
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.637376315Z condor_procd[120]: PROC_FAMILY_KILL_FAMILY
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.637408800Z condor_procd[120]: taking a snapshot...
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.637412673Z condor_procd[120]: ProcAPI: read 12 pid entries out of 75 total entries in /proc
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.637415179Z condor_procd[120]: process 282 (of family 282) has exited
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.637429466Z condor_procd[120]: process 281 (of family 267) has exited
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.637432246Z condor_procd[120]: method PARENT: found family 267 for process 285
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.637435904Z condor_procd[120]: method PARENT: found family 267 for process 285 (already determined)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.637438882Z condor_procd[120]: ...snapshot complete
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.637441950Z condor_procd[120]: sending signal 9 to family with root 282
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.637565366Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.637573629Z condor_procd[120]: gathering usage data for family with root pid 282
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.637761663Z condor_starter[265]: Process exited, pid=267, status=0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.637947481Z condor_starter[265]: Failed to write ToE tag to .job.ad file (13): Permission denied
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.637954110Z condor_starter[266]: Failed to open '.update.ad' to read update ad: No such file or directory (2).
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.637958131Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.637961387Z condor_procd[120]: gathering usage data for family with root pid 267
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.638877690Z condor_starter[266]: Failed to open '.update.ad' to read update ad: No such file or directory (2).
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.638274101Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.638285817Z condor_procd[120]: gathering usage data for family with root pid 270
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.640133906Z condor_starter[266]: Process exited, pid=269, status=0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.640156330Z condor_starter[266]: Failed to write ToE tag to .job.ad file (13): Permission denied
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.640465102Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.640545873Z condor_procd[120]: gathering usage data for family with root pid 269
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.641246657Z condor_procd[120]: PROC_FAMILY_KILL_FAMILY
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.641256284Z condor_procd[120]: taking a snapshot...
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.641258920Z condor_procd[120]: ProcAPI: read 10 pid entries out of 73 total entries in /proc
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.641261259Z condor_procd[120]: process 285 (of family 267) has exited
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.641263513Z condor_procd[120]: process 267 (of family 267) has exited
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.641265598Z condor_procd[120]: ...snapshot complete
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.641268206Z condor_procd[120]: sending signal 9 to family with root 267
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.641271488Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.641274255Z condor_procd[120]: gathering usage data for family with root pid 267
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.641276489Z condor_starter[265]: Failed to open '.update.ad' to read update ad: No such file or directory (2).
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.643191010Z condor_starter[265]: Failed to open '.update.ad' to read update ad: No such file or directory (2).
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.643209101Z condor_starter[266]: All jobs have exited... starter exiting
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.644702835Z condor_starter[265]: All jobs have exited... starter exiting
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.644714899Z condor_starter[266]: **** condor_starter (condor_STARTER) pid 266 EXITING WITH STATUS 0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.642611318Z condor_procd[120]: PROC_FAMILY_KILL_FAMILY
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.643206695Z condor_procd[120]: taking a snapshot...
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.643222512Z condor_procd[120]: ProcAPI: read 15 pid entries out of 78 total entries in /proc
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.643225670Z condor_procd[120]: ProcAPI: new boottime = 1621268011; old_boottime = 1621268011; /proc/stat boottime = 1621268011; /proc/uptime boottime = 1621268011
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.643228654Z condor_procd[120]: process 270 (of family 270) has exited
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.643231009Z condor_procd[120]: process 269 (of family 269) has exited
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.643233212Z condor_procd[120]: method PARENT: found family 268 for process 279
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.643235379Z condor_procd[120]: method PARENT: found family 268 for process 318
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.643237622Z condor_procd[120]: method PARENT: found family 268 for process 320
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.643249727Z condor_procd[120]: method PARENT: found family 268 for process 279 (already determined)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.643252498Z condor_procd[120]: method PARENT: found family 268 for process 318 (already determined)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.643254685Z condor_procd[120]: method PARENT: found family 268 for process 320 (already determined)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.643256823Z condor_procd[120]: ...snapshot complete
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.643258886Z condor_procd[120]: sending signal 9 to family with root 269
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.643261017Z condor_procd[120]: PROC_FAMILY_KILL_FAMILY
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.643263087Z condor_procd[120]: taking a snapshot...
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.643265153Z condor_procd[120]: ProcAPI: read 15 pid entries out of 78 total entries in /proc
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.643814393Z condor_procd[120]: ...snapshot complete
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.643830202Z condor_procd[120]: sending signal 9 to family with root 270
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.643833978Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.643837204Z condor_procd[120]: gathering usage data for family with root pid 269
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.643840085Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.643843207Z condor_procd[120]: gathering usage data for family with root pid 270
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.643846616Z condor_starter[266]: Failed to open '.update.ad' to read update ad: No such file or directory (2).
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.643961376Z condor_starter[267]: Failed to open '.update.ad' to read update ad: No such file or directory (2).
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.644586084Z condor_starter[266]: Failed to open '.update.ad' to read update ad: No such file or directory (2).
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.644600921Z condor_starter[267]: Failed to open '.update.ad' to read update ad: No such file or directory (2).
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.645813207Z condor_startd[122]: Starter pid 266 exited with status 0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.646808696Z condor_procd[120]: PROC_FAMILY_KILL_FAMILY
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:27.650648484Z condor_schedd[128]: Got RELEASE_CLAIM from <10.42.0.119:39105>
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.646735005Z condor_starter[266]: All jobs have exited... starter exiting
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.647645896Z condor_starter[266]: **** condor_starter (condor_STARTER) pid 266 EXITING WITH STATUS 0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.648987453Z condor_starter[267]: All jobs have exited... starter exiting
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.650131745Z condor_startd[122]: Starter pid 266 exited with status 0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.650147694Z condor_procd[120]: PROC_FAMILY_KILL_FAMILY
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.650150775Z condor_procd[120]: taking a snapshot...
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.650153168Z condor_procd[120]: ProcAPI: read 15 pid entries out of 78 total entries in /proc
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.650155472Z condor_starter[267]: **** condor_starter (condor_STARTER) pid 267 EXITING WITH STATUS 0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.650157645Z condor_procd[120]: ProcAPI::getProcInfo() pid 351 does not exist.
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.650159814Z condor_procd[120]: process 266 (of family 266) has exited
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.650161953Z condor_procd[120]: watcher 266 of family with root 269 has died; family removed
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.650164156Z condor_procd[120]: ...snapshot complete
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.650173498Z condor_procd[120]: sending signal 9 to family with root 266
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.650308939Z condor_startd[122]: slot1_2: State change: starter exited
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.650314165Z condor_startd[122]: slot1_2: Changing activity: Busy -> Idle
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.650335127Z condor_startd[122]: slot1_2: State change: idle claim shutting down due to CLAIM_WORKLIFE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.650990119Z condor_procd[120]: taking a snapshot...
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.651001069Z condor_procd[120]: ProcAPI: read 10 pid entries out of 73 total entries in /proc
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.651012347Z condor_procd[120]: ProcAPI::getProcInfo() pid 330 does not exist.
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.651016294Z condor_procd[120]: process 266 (of family 266) has exited
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.651019285Z condor_procd[120]: watcher 266 of family with root 282 has died; family removed
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.651022206Z condor_procd[120]: ...snapshot complete
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.651025182Z condor_procd[120]: sending signal 9 to family with root 266
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.651028417Z condor_startd[122]: slot1_1: State change: starter exited
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.651031416Z condor_starter[265]: **** condor_starter (condor_STARTER) pid 265 EXITING WITH STATUS 0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.651034691Z condor_startd[122]: slot1_1: Changing activity: Busy -> Idle
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.651037840Z condor_startd[122]: slot1_1: State change: idle claim shutting down due to CLAIM_WORKLIFE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.651041011Z condor_startd[122]: slot1_1: Changing state and activity: Claimed/Idle -> Preempting/Vacating
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.651044178Z condor_startd[122]: slot1_1: State change: No preempting claim, returning to owner
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.651420247Z condor_startd[122]: slot1_1: Changing state and activity: Preempting/Vacating -> Owner/Idle
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.651614307Z condor_startd[122]: slot1_1: State change: IS_OWNER is false
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.651649713Z condor_startd[122]: slot1_1: Changing state: Owner -> Unclaimed
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.651726650Z condor_startd[122]: slot1_1: Changing state: Unclaimed -> Delete
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.652680310Z condor_startd[122]: slot1_1: Resource no longer needed, deleting
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.653000138Z condor_procd[120]: PROC_FAMILY_UNREGISTER_FAMILY
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.653016570Z condor_procd[120]: unregistering family with root pid 266
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:27.651152633Z condor_schedd[128]: Got RELEASE_CLAIM from <10.42.0.117:37373>
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:27.655117377Z condor_collector[50]: Got INVALIDATE_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:27.655168689Z condor_collector[50]: #011#011**** Removed(1) ad(s): "< slot1_1@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx , 10.42.0.119 >"
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.650337506Z condor_startd[122]: slot1_2: Changing state and activity: Claimed/Idle -> Preempting/Vacating
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.650943209Z condor_startd[122]: slot1_2: State change: No preempting claim, returning to owner
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.651420554Z condor_startd[122]: slot1_2: Changing state and activity: Preempting/Vacating -> Owner/Idle
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.651662252Z condor_startd[122]: slot1_2: State change: IS_OWNER is false
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.651676482Z condor_startd[122]: slot1_2: Changing state: Owner -> Unclaimed
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.651833610Z condor_startd[122]: slot1_2: Changing state: Unclaimed -> Delete
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.652204471Z condor_startd[122]: slot1_2: Resource no longer needed, deleting
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.652217494Z condor_procd[120]: PROC_FAMILY_UNREGISTER_FAMILY
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.652222107Z condor_procd[120]: unregistering family with root pid 266
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.653786905Z condor_startd[122]: Starter pid 267 exited with status 0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.653804729Z condor_procd[120]: PROC_FAMILY_KILL_FAMILY
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.653809758Z condor_procd[120]: taking a snapshot...
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.655030080Z condor_procd[120]: ProcAPI: read 13 pid entries out of 76 total entries in /proc
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.655051678Z condor_procd[120]: process 267 (of family 267) has exited
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.655056304Z condor_procd[120]: watcher 267 of family with root 270 has died; family removed
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.655060214Z condor_procd[120]: ...snapshot complete
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.655063400Z condor_procd[120]: sending signal 9 to family with root 267
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.656459918Z condor_startd[122]: slot1_1: State change: starter exited
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.656478356Z condor_startd[122]: slot1_1: Changing activity: Busy -> Idle
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.656630221Z condor_startd[122]: slot1_1: State change: idle claim shutting down due to CLAIM_WORKLIFE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.656640262Z condor_startd[122]: slot1_1: Changing state and activity: Claimed/Idle -> Preempting/Vacating
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.653856517Z condor_startd[122]: Starter pid 265 exited with status 0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.655146675Z condor_procd[120]: PROC_FAMILY_KILL_FAMILY
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.655158327Z condor_procd[120]: taking a snapshot...
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.655162051Z condor_procd[120]: ProcAPI: read 8 pid entries out of 71 total entries in /proc
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.655165467Z condor_procd[120]: process 265 (of family 265) has exited
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.655168617Z condor_procd[120]: watcher 265 of family with root 267 has died; family removed
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.655171861Z condor_procd[120]: ...snapshot complete
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.655174643Z condor_procd[120]: sending signal 9 to family with root 265
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.655319086Z condor_startd[122]: slot1_2: State change: starter exited
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.655327402Z condor_startd[122]: slot1_2: Changing activity: Busy -> Idle
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.657109305Z condor_startd[122]: slot1_2: State change: idle claim shutting down due to CLAIM_WORKLIFE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.657119498Z condor_startd[122]: slot1_2: Changing state and activity: Claimed/Idle -> Preempting/Vacating
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.657132703Z condor_startd[122]: slot1_2: State change: No preempting claim, returning to owner
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.657136598Z condor_startd[122]: slot1_2: Changing state and activity: Preempting/Vacating -> Owner/Idle
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.657140073Z condor_startd[122]: slot1_2: State change: IS_OWNER is false
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.657143014Z condor_startd[122]: slot1_2: Changing state: Owner -> Unclaimed
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.657146360Z condor_startd[122]: slot1_2: Changing state: Unclaimed -> Delete
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.657149741Z condor_startd[122]: slot1_2: Resource no longer needed, deleting
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.658253764Z condor_procd[120]: PROC_FAMILY_UNREGISTER_FAMILY
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:27.658267199Z condor_procd[120]: unregistering family with root pid 265
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:27.656767972Z condor_schedd[128]: Got RELEASE_CLAIM from <10.42.0.119:43607>
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:27.657337062Z condor_schedd[128]: Got RELEASE_CLAIM from <10.42.0.117:42001>
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:27.656817580Z condor_collector[50]: (Invalidated 1 ads)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:27.656832642Z condor_collector[50]: #011#011**** Removed(1) ad(s): "< slot1_1@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx , 10.42.0.119 >"
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:27.656838667Z condor_collector[50]: (Invalidated 1 ads)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:27.656842476Z condor_collector[50]: Got INVALIDATE_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:27.656845990Z condor_collector[50]: #011#011**** Removed(1) ad(s): "< slot1_2@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx , 10.42.0.117 >"
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:27.656849586Z condor_collector[50]: (Invalidated 1 ads)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:27.656852962Z condor_collector[50]: #011#011**** Removed(1) ad(s): "< slot1_2@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx , 10.42.0.117 >"
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:27.656857067Z condor_collector[50]: (Invalidated 1 ads)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:27.657038205Z condor_collector[50]: Got INVALIDATE_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:27.657049971Z condor_collector[50]: #011#011**** Removed(1) ad(s): "< slot1_2@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx , 10.42.0.119 >"
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:27.657119797Z condor_collector[50]: (Invalidated 1 ads)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:27.657128456Z condor_collector[50]: #011#011**** Removed(1) ad(s): "< slot1_2@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx , 10.42.0.119 >"
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:27.657132396Z condor_collector[50]: (Invalidated 1 ads)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.658384795Z condor_startd[122]: slot1_1: State change: No preempting claim, returning to owner
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.658459931Z condor_startd[122]: slot1_1: Changing state and activity: Preempting/Vacating -> Owner/Idle
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.658467117Z condor_startd[122]: slot1_1: State change: IS_OWNER is false
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.658469687Z condor_startd[122]: slot1_1: Changing state: Owner -> Unclaimed
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.658472227Z condor_startd[122]: slot1_1: Changing state: Unclaimed -> Delete
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.658480336Z condor_startd[122]: slot1_1: Resource no longer needed, deleting
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.658482722Z condor_procd[120]: PROC_FAMILY_UNREGISTER_FAMILY
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:27.658484820Z condor_procd[120]: unregistering family with root pid 267
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:27.661391102Z condor_collector[50]: Got INVALIDATE_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:27.661411644Z condor_collector[50]: #011#011**** Removed(1) ad(s): "< slot1_1@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx , 10.42.0.117 >"
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:27.661416038Z condor_collector[50]: (Invalidated 1 ads)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:27.661418273Z condor_collector[50]: #011#011**** Removed(1) ad(s): "< slot1_1@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx , 10.42.0.117 >"
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:27.661420683Z condor_collector[50]: (Invalidated 1 ads)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:28.367747329Z condor_shared_port[121]: About to update statistics in shared_port daemon ad file at /var/lock/condor/shared_port_ad :
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:28.367810803Z condor_shared_port[121]: ForkedChildrenCurrent = 0#012ForkedChildrenPeak = 0#012MyAddress = "<10.42.0.117:38967?addrs=10.42.0.117-38967&alias=pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2.pseven-htcondor&noUDP>"#012RequestsBlocked = 0#012RequestsFailed = 0#012RequestsPendingCurrent = 0#012RequestsPendingPeak = 3#012RequestsSucceeded = 24#012SharedPortCommandSinfuls = "<10.42.0.117:38967?alias=pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2.pseven-htcondor>"
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:28.410476509Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:28.411781096Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:28.414726545Z condor_collector[350]: (Sending 5 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:28.414775078Z condor_collector[50]: QueryWorker: forked new worker with id 350 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:28.414874946Z condor_collector[350]: Query info: matched=5; skipped=0; query_time=0.001042; send_time=0.001083; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:42667>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:28.435608580Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:28.438055826Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:28.436368220Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:28.436394706Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:28.436403551Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000095; send_time=0.000076; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:37617>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:28.439749213Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:28.532778097Z condor_shared_port[121]: About to update statistics in shared_port daemon ad file at /var/lock/condor/shared_port_ad :
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:28.532801134Z condor_shared_port[121]: ForkedChildrenCurrent = 0#012ForkedChildrenPeak = 0#012MyAddress = "<10.42.0.119:43207?addrs=10.42.0.119-43207&alias=pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2.pseven-htcondor&noUDP>"#012RequestsBlocked = 0#012RequestsFailed = 0#012RequestsPendingCurrent = 0#012RequestsPendingPeak = 2#012RequestsSucceeded = 21#012SharedPortCommandSinfuls = "<10.42.0.119:43207?alias=pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2.pseven-htcondor>"
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:28.660281869Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:28.660318272Z condor_procd[120]: gathering usage data for family with root pid 265
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:29.458591517Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:29.459329977Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:29.460707372Z condor_collector[358]: (Sending 5 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:29.460742857Z condor_collector[50]: QueryWorker: forked new worker with id 358 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:29.461326311Z condor_collector[358]: Query info: matched=5; skipped=0; query_time=0.000687; send_time=0.000596; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:39409>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:29.477321313Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:29.480778821Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:29.478110723Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:29.478142998Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:29.478324176Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000082; send_time=0.000156; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:42907>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:29.482461453Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-2nx4x/htcondorexecute] 2021-06-15T16:59:29.599725238Z condor_shared_port[122]: About to update statistics in shared_port daemon ad file at /var/lock/condor/shared_port_ad :
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-2nx4x/htcondorexecute] 2021-06-15T16:59:29.599793086Z condor_shared_port[122]: ForkedChildrenCurrent = 0#012ForkedChildrenPeak = 0#012MyAddress = "<10.42.0.121:46787?addrs=10.42.0.121-46787&alias=pseven-htcondorexecute-deploy-55f84d9fd4-2nx4x.pseven-htcondor&noUDP>"#012RequestsBlocked = 0#012RequestsFailed = 0#012RequestsPendingCurrent = 0#012RequestsPendingPeak = 1#012RequestsSucceeded = 13#012SharedPortCommandSinfuls = "<10.42.0.121:46787?alias=pseven-htcondorexecute-deploy-55f84d9fd4-2nx4x.pseven-htcondor>"
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:29.662452377Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:29.662475372Z condor_procd[120]: gathering usage data for family with root pid 265
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:29.994889767Z condor_starter[265]: Process exited, pid=268, status=0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:29.994923256Z condor_starter[265]: Failed to write ToE tag to .job.ad file (13): Permission denied
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:29.994928642Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:29.994932100Z condor_procd[120]: gathering usage data for family with root pid 268
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:29.996103302Z condor_procd[120]: PROC_FAMILY_KILL_FAMILY
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:29.996124251Z condor_procd[120]: taking a snapshot...
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:29.996128793Z condor_procd[120]: ProcAPI: read 10 pid entries out of 73 total entries in /proc
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:29.996357451Z condor_procd[120]: process 318 (of family 268) has exited
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:29.996369836Z condor_procd[120]: process 279 (of family 268) has exited
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:29.996374982Z condor_procd[120]: process 268 (of family 268) has exited
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:29.996466882Z condor_procd[120]: process 264 (not in monitored family) has exited
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:29.996621874Z condor_procd[120]: no methods have determined process 358 to be in a monitored family
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:30.020778967Z condor_shadow[641]: RemoteResource::killStarter(): Could not send command to startd
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.025014571Z condor_collector[50]: Got INVALIDATE_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.025044634Z condor_collector[50]: #011#011**** Removed(1) ad(s): "< slot1_3@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx , 10.42.0.117 >"
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.025050137Z condor_collector[50]: (Invalidated 1 ads)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.025053351Z condor_collector[50]: #011#011**** Removed(1) ad(s): "< slot1_3@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx , 10.42.0.117 >"
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.025068541Z condor_collector[50]: (Invalidated 1 ads)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:30.020726098Z condor_startd[122]: Error: can't find resource with ClaimId (<10.42.0.119:43207?addrs=10.42.0.119-43207&alias=pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2.pseven-htcondor&noUDP&sock=startd_86_e1a3>#1623776069#11#...) -- perhaps this claim was already removed?
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:30.020755874Z condor_startd[122]: Error: problem finding resource for 404 (DEACTIVATE_CLAIM_FORCIBLY)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:30.020760755Z condor_startd[122]: Error: can't find resource with ClaimId (<10.42.0.119:43207?addrs=10.42.0.119-43207&alias=pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2.pseven-htcondor&noUDP&sock=startd_86_e1a3>#1623776069#6#...) -- perhaps this claim was already removed?
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:30.020764653Z condor_startd[122]: Error: problem finding resource for 404 (DEACTIVATE_CLAIM_FORCIBLY)
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:30.023193146Z condor_shadow[641]: RemoteResource::killStarter(): Could not send command to startd
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:30.023227088Z condor_shadow[641]: RemoteResource::killStarter(): Could not send command to startd
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:30.023232143Z condor_shadow[641]: RemoteResource::killStarter(): Could not send command to startd
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:30.023235276Z condor_shadow[641]: Job 2.0 terminated: exited with status 0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:30.023238190Z condor_shadow[641]: Reporting job exit reason 100 and attempting to fetch new job.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:30.023241364Z condor_shadow[641]: **** condor_shadow (condor_SHADOW) pid 641 EXITING WITH STATUS 100
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:30.023333840Z condor_schedd[128]: In DedicatedScheduler::reaper pid 641 has status 25600
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:30.023341981Z condor_schedd[128]: Shadow pid 641 exited with status 100
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:30.032965492Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::unwrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:30.032989912Z condor_schedd[128]: DedicatedScheduler::deallocMatchRec
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:30.032993870Z condor_schedd[128]: DedicatedScheduler::deallocMatchRec
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.008647299Z condor_procd[120]: ...snapshot complete
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.008704765Z condor_procd[120]: sending signal 9 to family with root 268
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.008710461Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.008714160Z condor_procd[120]: gathering usage data for family with root pid 268
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.008717810Z condor_starter[265]: Failed to open '.update.ad' to read update ad: No such file or directory (2).
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.008721282Z condor_starter[265]: Failed to open '.update.ad' to read update ad: No such file or directory (2).
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.008738064Z condor_procd[120]: PROC_FAMILY_GET_USAGE
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.008787998Z condor_procd[120]: gathering usage data for family with root pid 265
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.020879425Z condor_starter[265]: All jobs have exited... starter exiting
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.020890532Z condor_startd[122]: Failed to write ToE tag to .job.ad file (13): Permission denied
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.020893755Z condor_startd[122]: slot1_3: Called deactivate_claim_forcibly()
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.020896634Z condor_startd[122]: slot1_3: Changing state and activity: Claimed/Busy -> Preempting/Vacating
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.020899614Z condor_startd[122]: Error: can't find resource with ClaimId (<10.42.0.117:38967?addrs=10.42.0.117-38967&alias=pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2.pseven-htcondor&noUDP&sock=startd_86_e1a3>#1623776068#11#...) -- perhaps this claim was already removed?
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.020903226Z condor_startd[122]: Error: problem finding resource for 404 (DEACTIVATE_CLAIM_FORCIBLY)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.020906141Z condor_startd[122]: Error: can't find resource with ClaimId (<10.42.0.117:38967?addrs=10.42.0.117-38967&alias=pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2.pseven-htcondor&noUDP&sock=startd_86_e1a3>#1623776068#6#...) -- perhaps this claim was already removed?
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.020909594Z condor_startd[122]: Error: problem finding resource for 404 (DEACTIVATE_CLAIM_FORCIBLY)
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.020912290Z condor_starter[265]: **** condor_starter (condor_STARTER) pid 265 EXITING WITH STATUS 0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.020915290Z condor_startd[122]: Starter pid 265 exited with status 0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.030667803Z condor_procd[120]: PROC_FAMILY_KILL_FAMILY
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.030693142Z condor_procd[120]: taking a snapshot...
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.030696880Z condor_procd[120]: ProcAPI: read 8 pid entries out of 71 total entries in /proc
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.030700445Z condor_procd[120]: process 265 (of family 265) has exited
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.030703744Z condor_procd[120]: process 320 (of family 268) has exited
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.030706887Z condor_procd[120]: watcher 265 of family with root 268 has died; family removed
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.030710150Z condor_procd[120]: ...snapshot complete
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.030713394Z condor_procd[120]: sending signal 9 to family with root 265
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.030716617Z condor_startd[122]: slot1_3: State change: starter exited
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.030719894Z condor_startd[122]: slot1_3: State change: No preempting claim, returning to owner
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.030723199Z condor_startd[122]: slot1_3: Changing state and activity: Preempting/Vacating -> Owner/Idle
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.030726822Z condor_startd[122]: slot1_3: State change: IS_OWNER is false
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.030730128Z condor_startd[122]: slot1_3: Changing state: Owner -> Unclaimed
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.030733278Z condor_startd[122]: slot1_3: Changing state: Unclaimed -> Delete
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.030736755Z condor_startd[122]: slot1_3: Resource no longer needed, deleting
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.030750673Z condor_procd[120]: PROC_FAMILY_UNREGISTER_FAMILY
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.030753953Z condor_procd[120]: unregistering family with root pid 265
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.030757088Z condor_startd[122]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.030760071Z condor_startd[122]: Can't read ClaimId
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:30.030763120Z condor_startd[122]: Error: problem finding resource for 403 (DEACTIVATE_CLAIM)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.068736900Z condor_negotiator[51]: ---------- Started Negotiation Cycle ----------
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.069062215Z condor_negotiator[51]: Phase 1:  Obtaining ads from collector ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.069076690Z condor_negotiator[51]:   Getting startd private ads ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.069305606Z condor_collector[50]: Got QUERY_STARTD_PVT_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.070021257Z condor_collector[359]: (Sending 4 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.070262306Z condor_collector[50]: QueryWorker: forked new high priority worker with id 359 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.071423126Z condor_negotiator[51]:   Getting Scheduler, Submitter and Machine ads ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.071443244Z condor_collector[359]: Query info: matched=4; skipped=0; query_time=0.000885; send_time=0.000882; type=MachinePrivate; requirements={true}; locate=0; limit=0; from=COLLECTOR; peer=<10.42.0.115:38393>; projection={}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.072326795Z condor_collector[360]: (Sending 6 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.075619320Z condor_collector[360]: Query info: matched=6; skipped=10; query_time=0.000350; send_time=0.000931; type=Any; requirements={(((MyType == "Submitter")) || ((MyType == "Machine")))}; locate=0; limit=0; from=COLLECTOR; peer=<10.42.0.115:33097>; projection={}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.075645712Z condor_collector[50]: QueryWorker: forked new high priority worker with id 360 ( max 4 active 2 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.075650127Z condor_negotiator[51]:   Sorting 6 ads ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.075653488Z condor_negotiator[51]: Got ads: 6 public and 4 private
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.075656750Z condor_negotiator[51]: Public ads include 0 submitter, 4 startd
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.075659901Z condor_negotiator[51]: Phase 2:  Performing accounting ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.079256902Z condor_negotiator[51]: Phase 3:  Sorting submitter ads by priority ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.079286803Z condor_negotiator[51]: Starting prefetch round; 0 potential prefetches to do.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.079291952Z condor_negotiator[51]: Prefetch summary: 0 attempted, 0 successful.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.079309685Z condor_negotiator[51]: Phase 4.1:  Negotiating with schedds ...
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.079313928Z condor_negotiator[51]:  negotiateWithGroup resources used submitterAds length 0 
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.079317149Z condor_negotiator[51]: ---------- Finished Negotiation Cycle ----------
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.492090056Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.492337525Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.493037201Z condor_collector[361]: (Sending 4 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.493053182Z condor_collector[50]: QueryWorker: forked new worker with id 361 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.493261069Z condor_collector[361]: Query info: matched=4; skipped=0; query_time=0.000278; send_time=0.000263; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:36381>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.500823428Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:30.502231705Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.501124553Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.501264087Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:30.501274355Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000038; send_time=0.000051; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:39865>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:30.502690820Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:31.515867623Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:31.518227543Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:31.518251827Z condor_collector[362]: (Sending 4 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:31.518257444Z condor_collector[362]: Query info: matched=4; skipped=0; query_time=0.000348; send_time=0.000291; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:35743>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:31.518506892Z condor_collector[50]: QueryWorker: forked new worker with id 362 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:31.527014202Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:31.528740583Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:31.527413510Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:31.527446685Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:31.527477092Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000030; send_time=0.000048; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:41813>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:31.529490388Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:32.539656435Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:32.540316385Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:32.540898658Z condor_collector[363]: (Sending 4 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:32.541359838Z condor_collector[363]: Query info: matched=4; skipped=0; query_time=0.000392; send_time=0.000296; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:43319>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:32.544037583Z condor_collector[50]: QueryWorker: forked new worker with id 363 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:32.551564975Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:32.552112283Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:32.552135913Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:32.552140532Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000043; send_time=0.000040; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:36239>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:32.553641964Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:32.554402978Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:33.563466132Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:33.563683562Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:33.564305388Z condor_collector[50]: QueryWorker: forked new worker with id 364 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:33.564312927Z condor_collector[364]: (Sending 4 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:33.564674717Z condor_collector[364]: Query info: matched=4; skipped=0; query_time=0.000291; send_time=0.000244; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:34505>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:33.572026390Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:33.573588952Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:33.572291221Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:33.572306580Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:33.573553116Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000029; send_time=0.000042; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:39489>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:33.574958227Z condor_schedd[128]: Number of Active Workers 0
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:33.663462787Z condor_procd[120]: taking a snapshot...
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:33.663488051Z condor_procd[120]: ProcAPI: read 8 pid entries out of 71 total entries in /proc
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:33.663529821Z condor_procd[120]: ProcAPI: new boottime = 1621268011; old_boottime = 1621268011; /proc/stat boottime = 1621268011; /proc/uptime boottime = 1621268011
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:33.663885078Z condor_procd[120]: process 264 (not in monitored family) has exited
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:33.663941332Z condor_procd[120]: no methods have determined process 337 to be in a monitored family
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-jbxx2/htcondorexecute] 2021-06-15T16:59:33.663952768Z condor_procd[120]: ...snapshot complete
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:34.027517414Z condor_procd[120]: taking a snapshot...
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:34.027554926Z condor_procd[120]: ProcAPI: read 8 pid entries out of 71 total entries in /proc
[pod/pseven-htcondorexecute-deploy-55f84d9fd4-gs2z2/htcondorexecute] 2021-06-15T16:59:34.027861022Z condor_procd[120]: ...snapshot complete
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:34.583910788Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:34.584339587Z condor_collector[50]: Got QUERY_STARTD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:34.585217494Z condor_collector[365]: (Sending 4 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:34.585234082Z condor_collector[50]: QueryWorker: forked new worker with id 365 ( max 4 active 1 pending 0 )
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:34.585240986Z condor_collector[365]: Query info: matched=4; skipped=0; query_time=0.000294; send_time=0.000212; type=Machine; requirements={true}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:36569>; projection={Activity Arch CondorLoadAvg EnteredCurrentActivity LastHeardFrom Machine Memory MyCurrentTime Name OpSys State}; filter_private_ads=0
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:34.593944121Z condor_collector[50]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:34.595689888Z condor_schedd[128]: ZKM: In Condor_Auth_Passwd::wrap.
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:34.594257409Z condor_collector[50]: Got QUERY_SCHEDD_ADS
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:34.594272854Z condor_collector[50]: (Sending 1 ads in response to query)
[pod/pseven-htcondormanager-deploy-7847df57d8-jc9fp/htcondormanager] 2021-06-15T16:59:34.594340322Z condor_collector[50]: Query info: matched=1; skipped=0; query_time=0.000023; send_time=0.000040; type=Scheduler; requirements={((TotalRunningJobs > 0 || TotalIdleJobs > 0 || TotalHeldJobs > 0 || TotalRemovedJobs > 0 || TotalJobAds > 0))}; locate=0; limit=0; from=TOOL; peer=<10.42.0.102:36351>; projection={ScheddIpAddr CondorVersion Name Machine}; filter_private_ads=0
[pod/pseven-htcondorsubmit-deploy-fcf987674-5w2vw/htcondorsubmit] 2021-06-15T16:59:34.596159014Z condor_schedd[128]: Number of Active Workers 0

Every 1.0s: condor_status; condor_q -all -global                         pseven-htcondorsubmit-deploy-fcf987674-5w2vw: Tue Jun 15 16:59:23 2021

Name                                                                   OpSys      Arch   State     Activity LoadAv Mem   ActvtyTime

slot1@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx   LINUX      X86_64 Unclaimed Idle      0.000 1544  0+00:04:30
slot1_1@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx LINUX      X86_64 Claimed   Idle      0.000  256  0+00:00:01
slot1@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx   LINUX      X86_64 Unclaimed Idle      0.000  264  0+00:04:39
slot1_1@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx LINUX      X86_64 Claimed   Busy      0.000  512  0+00:00:00
slot1_2@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx LINUX      X86_64 Claimed   Busy      0.000  512  0+00:00:01
slot1_3@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx LINUX      X86_64 Claimed   Busy      0.000  512  0+00:00:01
slot1@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx   LINUX      X86_64 Unclaimed Idle      0.000  776  0+00:04:35
slot1_1@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx LINUX      X86_64 Claimed   Busy      0.000  512  0+00:00:00
slot1_2@xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx LINUX      X86_64 Claimed   Busy      0.000  512  0+00:00:01

               Total Owner Claimed Unclaimed Matched Preempting Backfill  Drain

  X86_64/LINUX     9     0       6         3       0          0        0      0

         Total     9     0       6         3       0          0        0      0


-- Schedd: parallel_schedd@xxxxxxxxxxxxxxxxxxxxxx : <10.42.0.102:34961?... @ 06/15/21 16:59:23
OWNER     BATCH_NAME    SUBMITTED   DONE   RUN    IDLE   HOLD  TOTAL JOB_IDS
user20001 ID: 2        6/15 16:59      _      5      _      _      5 2.0-4

Total for query: 5 jobs; 0 completed, 0 removed, 0 idle, 5 running, 0 held, 0 suspended
Total for all users: 5 jobs; 0 completed, 0 removed, 0 idle, 5 running, 0 held, 0 suspended
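
Note: in the condor_status output above, six dynamic slots are Claimed for only five running jobs; the extra one is the slot reported as Claimed/Idle. To single it out, a constrained query such as the following can be run from the submit pod (a minimal sketch using standard startd ClassAd attributes; the chosen projection is only a suggestion):

  # list claimed-but-idle slots, who holds the claim, and when they went idle
  condor_status -constraint 'State == "Claimed" && Activity == "Idle"' \
      -af Name RemoteOwner ClientMachine EnteredCurrentActivity

This makes it easier to match the leftover claim against the schedd that requested it while waiting for UNUSED_CLAIM_TIMEOUT to release it.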