Question

I have the following function that I would like to run using multiprocessing:

from itertools import combinations

def bruteForcePaths3(paths, availableNodes):

    results = []

    #start by taking each combination 2 at a time, then 3, etc
    for i in range(1,len(availableNodes)+1):
        print "combo number: %d" % i

        currentCombos = combinations(availableNodes, i)

        for combo in currentCombos:
            #get a fresh copy of paths for this combination
            currentPaths = list(paths)
            currentRemainingPaths = []
            # print combo

            for node in combo:
                #determine better way to remove nodes, for now- if it's in, we remove
                currentRemainingPaths = [path for path in currentPaths if not (node in path)]
                currentPaths = currentRemainingPaths

            #if there are no paths left
            if len(currentRemainingPaths) == 0:
                #save this combination
                print combo
                results.append(frozenset(combo))

    return results 

Based on a few other posts (Combining itertools and multiprocessing?), I tried to multiprocess this as follows:

from itertools import combinations, islice
import multiprocessing

def grouper_nofill(n, iterable):
    # yield successive chunks of up to n items from iterable, with no fill value
    it = iter(iterable)
    def take():
        while 1: yield list(islice(it, n))
    return iter(take().next, [])

def mp_bruteForcePaths(paths, availableNodes):

    pool = multiprocessing.Pool(4)
    chunksize = 256
    async_results = []

    def worker(paths, combos, out_q):
        """ The worker function, invoked in a process. Runs
            bruteForcePaths2 on one chunk of combinations and pushes
            the results onto a queue.
        """
        results = bruteForcePaths2(paths, combos)
        print results
        out_q.put(results)

    for i in range(1, len(availableNodes)+1):
        currentCombos = combinations(availableNodes, i)
        for finput in grouper_nofill(chunksize, currentCombos):

            args = (paths, finput)
            async_results.extend(pool.map_async(bruteForcePaths2, args).get())

            print async_results
def bruteForcePaths2(args):
    paths, combos = args
    results = []

    for combo in combos:
        #get a fresh copy of paths for this combination
        currentPaths = list(paths)
        currentRemainingPaths = []
        # print combo

        for node in combo:
            #determine better way to remove nodes, for now- if it's in, we remove
            currentRemainingPaths = [path for path in currentPaths if not (node in path)]
            currentPaths = currentRemainingPaths

        #if there are no paths left
        if len(currentRemainingPaths) == 0:
            #save this combination
            print combo
            results.append(frozenset(combo))

    return results

I need to be able to pass in two arguments to the brute-force function. I'm getting the error: "too many values to unpack".

So, a three-part question: How can I multiprocess the brute-force function over nproc CPUs, splitting the combinations iterator? How can I pass in the two arguments, paths and combinations? How do I get the results back (I think map_async should do that for me)?

Thanks.

Solution

This

args = (paths, finput)
pool.map_async(bruteForcePaths2, args)

makes these two calls, which is not your intent:

bruteForcePaths2(paths)
bruteForcePaths2(finput)

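To see why, note that map_async treats its second argument as an iterable of single arguments and applies the function once per element. A minimal standalone sketch of that behaviour (the describe function and the sample data here are just placeholders, not part of the original code):

from multiprocessing import Pool

def describe(arg):
    # each element of the iterable arrives as one call's single argument
    return type(arg).__name__

if __name__ == '__main__':
    pool = Pool(2)
    paths = [('a', 'b'), ('b', 'c')]
    finput = [('a',), ('b',)]
    # map_async iterates over (paths, finput): describe(paths), then describe(finput)
    print(pool.map_async(describe, (paths, finput)).get())   # -> ['list', 'list']
    pool.close()
    pool.join()
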
You can use apply_async instead to submit individual function calls to the pool. Note also that if you call get() immediately, it blocks until that result is ready, and you don't get any advantage from multiprocessing.

You could do it like this:

for i in range(1,len(availableNodes)+1):
    currentCombos = combinations(availableNodes, i)
    for finput in grouper_nofill(chunksize,currentCombos):

        args = (paths, finput)
        async_results.append(pool.apply_async(bruteForcePaths2, args))

results = [x.get() for x in async_results]
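
One detail to watch: apply_async(bruteForcePaths2, args) with args = (paths, finput) calls bruteForcePaths2(paths, finput), i.e. two positional arguments, whereas the bruteForcePaths2 above unpacks a single tuple. Below is a minimal sketch of a matching two-argument worker, with the per-chunk results flattened at the end (each get() returns one chunk's list). It keeps the same path-filtering logic but is an illustration, not the original code:

def bruteForcePaths2(paths, combos):
    results = []
    for combo in combos:
        # work on a fresh copy of the paths for this combination
        currentPaths = list(paths)
        for node in combo:
            # drop every path that still contains this node
            currentPaths = [path for path in currentPaths if node not in path]
        # if removing these nodes eliminated every path, keep the combination
        if not currentPaths:
            results.append(frozenset(combo))
    return results

# each AsyncResult holds one chunk's list of winning combinations
results = [combo for r in async_results for combo in r.get()]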
Licensed under: CC-BY-SA with attribution