Question

My Python process on Mac OS X is completely ignoring the rlimits below, which I set. (I confirmed by printing them that they were applied.)

Physical memory usage climbs above 2.4 GB, at which point CPU usage drops well below 5% and the process hangs. (The underlying culprit seems to be a numpy array allocation, but I can't force a MemoryError to pinpoint where, so I'm stuck.) What's going on? I thought hard rlimits could not be ignored?

EDIT: it occurs to me that the offending allocation is not being done in native Python, but in either numpy or scikit-learn. Link to source of cpython:Modules/resource.c. If so, that would be a definite documentation bug.

import resource

# Limits are (soft, hard) in bytes; 1048576 bytes = 1 MiB.
# setrlimit() expects integers, so the fractional hard limits are wrapped in int().
resource.setrlimit(resource.RLIMIT_AS,    (1 * 1048576, int(1.2 * 1048576)))
resource.setrlimit(resource.RLIMIT_DATA,  (1 * 1048576, int(1.1 * 1048576)))
resource.setrlimit(resource.RLIMIT_STACK, (100000, 120000))
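As a debugging aside: on platforms that do enforce RLIMIT_AS (e.g. Linux), one way to provoke a catchable MemoryError without risking the main process is to lower the limit in a child process and attempt the allocation there. A minimal sketch, assuming Linux-style enforcement (on Mac OS X the limit may simply not be honored, which is the problem being asked about):

```python
import subprocess
import sys

# Child script: cap the address space at 512 MiB, then try to allocate 1 GiB.
# On a platform that enforces RLIMIT_AS this raises MemoryError.
child_code = r"""
import resource
limit = 512 * 1048576
resource.setrlimit(resource.RLIMIT_AS, (limit, limit))
try:
    buf = bytearray(1024 * 1048576)
    print("allocated")
except MemoryError:
    print("MemoryError")
"""

out = subprocess.check_output([sys.executable, "-c", child_code]).decode().strip()
print(out)
```

Running the allocation in a subprocess means a limit that is enforced cannot wedge the parent interpreter, and the printed result tells you directly whether the platform honors the limit.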

Versions: Python 2.7.3 (MacPorts, 64-bit) on Mac OS X 10.8.2, with py27-numpy.

The machine has 4 GB of physical RAM.


Solution

It occurs to me that the offending allocation is not being done in native Python, but in either numpy or scikit-learn. My allocation code may be leaking memory through temporaries; I will investigate, and may also need to look at the gc module.
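If temporaries are being kept alive by reference cycles, CPython's reference counting alone will never free them; only the cyclic garbage collector can. An explicit gc.collect() call reports how many unreachable objects it found, which is a quick way to check. A minimal sketch (the Node class is purely illustrative):

```python
import gc


class Node(object):
    """Toy object used only to build a reference cycle."""

    def __init__(self):
        self.ref = None


# Two objects referencing each other: their refcounts never reach zero,
# so plain reference counting cannot reclaim them.
a, b = Node(), Node()
a.ref, b.ref = b, a
del a, b

# Force a collection pass; the return value is the number of
# unreachable objects the collector found.
collected = gc.collect()
print(collected)
```

If collect() consistently reports unreachable objects after the suspect allocation code runs, cycles (rather than the allocator itself) are holding the memory.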

If this is the case, that sounds like a documentation bug in the resource module.

Link to source of cpython:Modules/resource.c

Licensed under: CC-BY-SA with attribution