Question

The following program always hangs on Mac OS (Python 2.7.5) if I return a big enough string. I can't say for sure what the limit is, but it works for smaller text. It works fine on Ubuntu, but on Mac OS it hangs at pipe_to_parent.send(result).

Does anybody know how to fix this? Is there anything wrong with the code below?

#!/usr/bin/python

import sys
from multiprocessing import Process, Pipe


def run(text, length):
    return (text * ((length / len(text))+1))[:length]

def proc_func(pipe_to_parent):
    result = {'status': 1, 'log': run('Hello World', 20000), 'details': {}, 'exception': ''}
    pipe_to_parent.send(result)
    sys.exit()


def call_run():
    to_child, to_self = Pipe()

    proc = Process(target=proc_func, args=(to_self,))
    proc.start()
    proc.join()
    print(to_child.recv())
    to_child.close()
    to_self.close()

call_run()

Solution

The multiprocessing documentation shows an example that differs from your code in one important way:

from multiprocessing import Process, Pipe

def f(conn):
    conn.send([42, None, 'hello'])
    conn.close()

if __name__ == '__main__':
    parent_conn, child_conn = Pipe()
    p = Process(target=f, args=(child_conn,))
    p.start()

    # This is the important part
    # Note: conn.recv() is called _before_ process.join()
    print parent_conn.recv()   # prints "[42, None, 'hello']"
    p.join()

In your example, you call .recv() only after you've already called proc.join():

...
proc = Process(target=proc_func, args=(to_self,))
proc.start()
proc.join()
print(to_child.recv())
...

To see exactly what is happening we would have to dig into the multiprocessing module, but the most likely cause is the limited size of the underlying OS pipe buffer. A small result fits entirely in the buffer, so the child's send() returns immediately and the child can exit. A large result fills the buffer, so send() blocks until the other end is read; meanwhile the parent is blocked in join() waiting for the child to exit, so neither process can make progress. Calling recv() before join() drains the pipe, lets the child's send() complete, and allows the child to exit cleanly.
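
Applying that ordering to your code would look roughly like this (a minimal sketch I haven't run on Mac OS; the only changes are moving recv() before join() and closing the child's end of the pipe after sending):

#!/usr/bin/python

import sys
from multiprocessing import Process, Pipe


def run(text, length):
    # Repeat text until it is at least `length` characters, then trim.
    return (text * ((length / len(text)) + 1))[:length]


def proc_func(pipe_to_parent):
    result = {'status': 1, 'log': run('Hello World', 20000),
              'details': {}, 'exception': ''}
    pipe_to_parent.send(result)
    # Close the child's end once we're done sending.
    pipe_to_parent.close()
    sys.exit()


def call_run():
    to_child, to_self = Pipe()

    proc = Process(target=proc_func, args=(to_self,))
    proc.start()
    # Drain the pipe first so the child's send() can complete,
    # then wait for the child process to exit.
    print(to_child.recv())
    proc.join()
    to_child.close()
    to_self.close()


if __name__ == '__main__':
    call_run()

With the 20 000-character payload, recv() empties the pipe buffer, the child's send() returns, and join() then completes normally.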

Licensed under: CC-BY-SA with attribution