Question

I am using IPython parallel to schedule a large number of jobs via a load_balanced_view. Each job uses subprocess.Popen to run an executable and capture its stdout and stderr, which I then want to save into a log file.

This is an example of the structure of the code that I'm running:

import subprocess as sp
from ipyparallel import Client

def make_inifile(c):
    ...

def run_exe(exe, inifile):
    def_Popen_kwargs = {'stdout': sp.PIPE, 'stderr': sp.PIPE,
                        'universal_newlines': True}
    pexe = sp.Popen([exe, inifile], **def_Popen_kwargs)
    (stdout, stderr) = pexe.communicate()
    with open('logfile.log', 'a') as f:
        f.write(stdout)
        f.write(stderr)

rc = Client()
lbv = rc.load_balanced_view()
rc[:].execute("import subprocess as sp", block=True)

exe = "/path/to/exe"
for c in cases:
    inifile = make_inifile(c)
    lbv.apply(run_exe, exe, inifile)

lbv.wait()

As I'll be using more than one process, the log file will be a mess at best. One solution would be to lock the file so that only one process at a time can write to it. That should be feasible, but it looks a bit like overkill to me.

A better solution might be to open a separate log file for each engine, using the engine id in the file name. Something like this: "logfile_{}.log".format(id)

So the question is: is there a way to retrieve the engine id from within run_exe?


Solution

Either push the engine id into each engine's namespace at startup using a direct view, or simply use os.getpid(), which is unique per engine process.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow