
Re: [HTCondor-users] Priority calculation: memory



On 04/09/2015 08:05, Mathieu Bahin wrote:
Actually, we wonder how things will be with the partitionable slots.
From what we understand:
   - a default maximum memory is allocated to the job if nothing is
explicitly specified
http://research.cs.wisc.edu/htcondor/manual/current/condor_submit.html (JOB_DEFAULT_REQUESTMEMORY)
https://www-auth.cs.wisc.edu/lists/htcondor-users/2012-May/msg00160.shtml
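For instance, a minimal sketch of how that default interacts with an
explicit request (the 1024/2048 MB values are only illustrative, not
recommendations):

  # condor_config on the submit host: memory (in MB) requested for jobs
  # whose submit file does not set request_memory
  JOB_DEFAULT_REQUESTMEMORY = 1024

  # submit file: an explicit request overrides the default
  executable     = myjob.sh
  request_memory = 2048
  queue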

   - if the job exceeds this memory, the job is aborted
As far as I know, there is no hard limiting of jobs to their requested memory unless you explicitly set it up, either via a wrapper or via cgroups with hard limits. See:
http://research.cs.wisc.edu/htcondor/manual/latest/3_12Setting_Up.html#SECTION0041213000000000000000
http://research.cs.wisc.edu/htcondor/manual/latest/3_12Setting_Up.html#SECTION0041214000000000000000
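If you do want jobs stopped when they go over, here is a rough sketch of
the two usual routes (knob and attribute names as in the manual sections
above; the exact policy expression and cgroup values are assumptions
about your pool, not HTCondor defaults):

  # condor_config on the execute nodes: cgroup-based enforcement
  # (requires the starter to run with cgroup support); "hard" caps the
  # job at RequestMemory, "soft" only enforces the limit when the
  # machine is under memory pressure
  BASE_CGROUP = htcondor
  CGROUP_MEMORY_LIMIT_POLICY = hard

  # Alternative without cgroups: have the schedd put jobs on hold once
  # their measured MemoryUsage exceeds what they requested
  SYSTEM_PERIODIC_HOLD = (MemoryUsage =!= UNDEFINED) && (MemoryUsage > RequestMemory)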