PyQt4 and Python 2.7

I have a GUI application which manages the processing of rather large datasets. These datasets/files are found, grouped into a collection of files to be handled together, then converted and processed. Each step along the way is broken out in the code as well. The idea is to be able to stop and start each of these 'steps' individually.

I've gotten the starting portion to work very well now, basically using queues to feed the information on down the line of converting/processing steps. The problem I'm having now is getting them to stop...

I have a very small example (below) of what I'm trying to do. Essentially, I start a QThread which accepts a group of related items; the QThread then hands the work out to a VERY LONG RUNNING worker via multiprocessing.Pool.map. Very long running meaning the processing can take 20 minutes or more, but I want to be able to stop all processes in the pool immediately. In the example I've used a while loop as a stand-in; in the full code each worker calls an outside exe via subprocess.
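
For reference, the real worker is shaped roughly like this (the exe name and arguments below are just placeholders):

import subprocess

def real_worker(args):
    item, ID = args
    # Placeholder command: the real call runs an external converter exe
    # on the file represented by 'item' and can take ~20 minutes.
    subprocess.call(['converter.exe', str(item)])
    return item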

Once the long running task is running inside a worker, I can't seem to find a way to forcefully kill it, even though PyCharm's 'stop' button kills the processes just fine. I'm not sharing ANY variables here, and if the items currently being worked on end up corrupt I don't care, because they'll be replaced the next time the step runs (since the task never finished).

How can I halt my worker?

from multiprocessing import Queue, Pool
from PyQt4.QtCore import *
import time
from itertools import repeat


# Worker that actually does the long work task.
# Just printing in a while loop here; Pool.map passes in an (item, ID) tuple.
def worker(args):
    item, ID = args
    while 1:
        print "Hello World from", ID, "- item", item
        time.sleep(1)

# QThread which manages the workers.
# MyThread gets a collection of tasks to perform and then sends the work out to a pool of workers.
# Planning for at least 3 of these to be running simultaneously in the full-blown script.
class MyThread(QThread):
    def __init__(self, inqueue, outqueue, id):
        super(MyThread, self).__init__()
        self.inqueue = inqueue
        self.outqueue = outqueue
        self.ID = id
        print 'initialized:', self.ID

    def run(self):

        while 1:
            print "Waiting"
            obj = self.inqueue.get(block=True)
            self.pool = Pool(processes=6)
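            # map blocks here until every task in this batch finishes (or the pool is terminated)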
            self.res = self.pool.map(worker, zip(obj, repeat(self.ID)))
            self.pool.close()
            self.pool.join()

    def stop(self):
        self.terminate()

if __name__ == "__main__":

    inqueue = Queue()
    outqueue = Queue()

    #start a new QThread which immediately waits for work to be assigned to it
    t = MyThread(inqueue, outqueue, 1)
    t.start()

    time.sleep(2)

    #Provide the QThread with a collection of items for the worker to process
    inqueue.put([1, 2, 3, 4, 5, 6, 7, 8])

    time.sleep(5)

    #At some point, I want to be able to completely dead stop all processes and threads
    #associated with MyThread...again will be 3 MyThreads in full version
    t.stop()

    db=2
    #SET DEBUG BREAKPOINT HERE TO SEE LOOP CONTINUE TO RUN

Solution

In your stop method, add a call to self.pool.terminate(). According to the documentation, Pool.terminate() stops the worker processes immediately, without completing outstanding work.
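
A minimal sketch of what that could look like, assuming the pool is stored on self.pool as in your snippet (the getattr guard is only there in case stop() is called before any work has arrived):

    def stop(self):
        # Kill the pool's worker processes immediately, without waiting
        # for their current tasks to finish.
        if getattr(self, 'pool', None) is not None:
            self.pool.terminate()
            self.pool.join()
        # Then stop the QThread itself.
        self.terminate()
        self.wait()

terminate() itself returns right away; the join() afterwards just waits until the worker processes have actually exited before the QThread is torn down.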
