Question

I'm aware of using the subprocess module to isolate functions that might segfault. This works:

import subprocess
# Blocking for simplicity
res = subprocess.check_output(["python", "c_library_wrapper.py", arg0, arg1, ...])
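A segfault in the child just surfaces as a negative return code (e.g. -11 for SIGSEGV), which check_output raises as CalledProcessError, so the parent survives. Roughly like this (placeholder argument only for illustration):

import subprocess

try:
    res = subprocess.check_output(["python", "c_library_wrapper.py", "some_arg"])
except subprocess.CalledProcessError as e:
    # A child killed by a signal reports -signum, e.g. -11 for SIGSEGV
    print("wrapper crashed, return code:", e.returncode)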

What I'm trying to figure out is why multiprocessing doesn't have the same effect. This doesn't seem to work:

import multiprocessing
from c_library_wrapper import f
# Assume that f puts the return value into a shared queue
p = multiprocessing.Process(target=f, args=(arg0, arg1, ...))
p.start()
p.join()

Isn't this also creating an independent process? Is there a core concept I'm missing here?

Background: I'm isolating a large third-party C library to protect against segfaults. I know that the best way to handle segfaults is to fix them, but this library is really big.


Solution

You mention you are using a shared queue. Heed the warning in the docs for Process.terminate (being killed by a segfault is a similar situation):

Warning: If this method is used when the associated process is using a pipe or queue then the pipe or queue is liable to become corrupted and may become unusable by other processes. Similarly, if the process has acquired a lock or semaphore etc. then terminating it is liable to cause other processes to deadlock.
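In other words, if the C code segfaults while f is interacting with the queue, the queue itself can be left in a broken state, and anything the parent later reads from it (or blocks on) is suspect. One way to keep the multiprocessing approach usable is to treat the child's exitcode as the source of truth and only trust the queue after a clean exit. A rough sketch, assuming f takes the queue as its first argument; run_isolated is just an illustrative helper name:

import multiprocessing
import queue

def run_isolated(f, *args, timeout=30):
    q = multiprocessing.Queue()
    p = multiprocessing.Process(target=f, args=(q,) + args)
    p.start()
    try:
        # Read the result before joining; the timeout only keeps the parent
        # from blocking forever if the child dies before producing anything
        result = q.get(timeout=timeout)
    except queue.Empty:
        result = None
    p.join()
    if p.exitcode != 0:
        # A negative exitcode means the child was killed by a signal,
        # e.g. -11 (== -signal.SIGSEGV) for a segfault
        raise RuntimeError(f"worker died with exit code {p.exitcode}")
    return result

If the child segfaults before putting anything on the queue, q.get simply times out and the negative exitcode tells you what happened, so the parent never has to depend on a possibly corrupted queue.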

Licensed under: CC-BY-SA with attribution