Question

How do I get a thread to return a tuple or any value of my choice back to the parent in Python?

Solution

I suggest you instantiate a Queue.Queue before starting the thread, and pass it as one of the thread's args: before the thread finishes, it .puts the result on the queue it received as an argument. The parent can .get or .get_nowait it at will.

Queues are generally the best way to arrange thread synchronization and communication in Python: they're intrinsically thread-safe, message-passing vehicles -- the best way to organize multitasking in general!-)
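
A minimal sketch of that pattern (the worker function and its squaring computation below are only illustrative, not part of the original answer; the module is named Queue in Python 2 and queue in Python 3):

import threading
import queue  # 'Queue' in Python 2

def worker(x, out_queue):
    # do the real work, then hand the result back to the parent
    out_queue.put(x * x)

out_queue = queue.Queue()
t = threading.Thread(target=worker, args=(10, out_queue))
t.start()
result = out_queue.get()   # blocks until the worker has put something
t.join()
print(result)              # 100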

OTHER TIPS

If you were calling join() to wait for the thread to complete, you could simply attach the result to the Thread instance itself and then retrieve it from the main thread after the join() returns.

On the other hand, you don't tell us how you intend to discover that the thread is done and that the result is available. If you already have a way of doing that, it will probably point you (and us, if you were to tell us) to the best way of getting the results out.
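
For example, here is a minimal sketch of attaching the result to the Thread instance (the Worker subclass and its result attribute are illustrative, not from the original answer):

import threading

class Worker(threading.Thread):
    def run(self):
        # attach the result to the Thread instance itself
        self.result = (42, "done")

t = Worker()
t.start()
t.join()           # once join() returns, the attribute is safe to read
print(t.result)    # (42, 'done')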

You should pass a Queue instance as a parameter, then .put() your return object into the queue. You can then gather the return value via queue.get(), whatever object you put.

Sample:

import threading
import Queue  # the module is named 'queue' in Python 3

def target_method(params, queue):
    """
    Some operations right here
    """
    your_return = "Whatever your object is"
    queue.put(your_return)

queue = Queue.Queue()
params = "some input"          # placeholder argument for the example
thread_ = threading.Thread(
    target=target_method,
    name="Thread1",
    args=[params, queue],
)
thread_.start()
thread_.join()
result = queue.get()

Use for multiple threads:

# Start all threads in the thread pool
for thread in pool:
    thread.start()

# Collect one result per thread (queue.get() blocks until a result arrives)
for thread in pool:
    thread_results.append(queue.get())

# Wait for all threads to finish
for thread in pool:
    thread.join()

I use this implementation and it works great for me. I hope it works for you as well.

Use a lambda to wrap your target thread function and pass its return value back to the parent thread using a queue. (Your original target function remains unchanged, with no extra queue parameter.)

Sample code:

import threading
import queue

def dosomething(param):
    return param * 2

que = queue.Queue()
thr = threading.Thread(target=lambda q, arg: q.put(dosomething(arg)), args=(que, 2))
thr.start()
thr.join()
while not que.empty():
    print(que.get())

Output:

4

I'm surprised nobody mentioned that you could just pass it a mutable:

>>> thread_return={'success': False}
>>> from threading import Thread
>>> def task(thread_return):
...  thread_return['success'] = True
... 
>>> Thread(target=task, args=(thread_return,)).start()
>>> thread_return
{'success': True}

Perhaps this has major issues of which I'm unaware. (In real code you would normally join() the thread before reading the result, so you know the worker has finished writing it.)

Another approach is to pass a callback function to the thread. This gives a simple, safe, and flexible way to return a value to the parent at any time from the new thread.

# A sample implementation

import threading
import time

class MyThread(threading.Thread):
    def __init__(self, cb):
        threading.Thread.__init__(self)
        self.callback = cb

    def run(self):
        for i in range(10):
            self.callback(i)
            time.sleep(1)


# test

import sys

def count(x):
    print(x)
    sys.stdout.flush()

t = MyThread(count)
t.start()

You can use the synchronized queue module.
Suppose you need to check a user's info in a database given a known id:

def check_infos(user_id, queue):
    result = send_data(user_id)
    queue.put(result)

Now you can get your data like this:

import queue, threading
queued_request = queue.Queue()
check_infos_thread = threading.Thread(target=check_infos, args=(user_id, queued_request))
check_infos_thread.start()
final_result = queued_request.get()

POC:

import random
import threading

class myThread(threading.Thread):
    def __init__(self, arr):
        threading.Thread.__init__(self)
        self.arr = arr
        self.ret = None

    def run(self):
        self.myJob(self.arr)

    def join(self):
        threading.Thread.join(self)
        return self.ret

    def myJob(self, arr):
        self.ret = sorted(self.arr)
        return

# Run the demo if executed from the command line.
if __name__ == '__main__':
    N = 100

    arr = [random.randint(0, 100) for x in range(N)]
    th = myThread(arr)
    th.start()
    sortedArr = th.join()

    print("arr2: ", sortedArr)

Well, in the Python threading module, there are condition objects that are associated with locks. The acquire() method returns whatever value is returned by the underlying lock's acquire(). For more information: Python Condition Objects
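
For example, a condition can be used to tell the parent that a result placed in a shared container is ready (a minimal sketch; the worker function and the shared result list are illustrative, not from the original answer):

import threading

cond = threading.Condition()
result = []                    # shared container, guarded by the condition's lock

def worker():
    with cond:
        result.append(21 * 2)  # produce the value
        cond.notify()          # wake the waiting parent

threading.Thread(target=worker).start()

with cond:
    while not result:
        cond.wait()            # releases the lock while waiting
print(result[0])               # 42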

Based on jcomeau_ictx's suggestion, this is the simplest one I came across. The requirement here was to get the exit status from three different processes running on the server and trigger another script if all three are successful. This seems to be working fine:

import os
import threading

class myThread(threading.Thread):
    def __init__(self, threadID, pipePath, resDict):
        threading.Thread.__init__(self)
        self.threadID = threadID
        self.pipePath = pipePath
        self.resDict = resDict

    def run(self):
        print("Starting thread %s" % self.threadID)
        if not os.path.exists(self.pipePath):
            os.mkfifo(self.pipePath)
        pipe_fd = os.open(self.pipePath, os.O_RDWR | os.O_NONBLOCK)
        with os.fdopen(pipe_fd) as pipe:
            while True:
                try:
                    message = pipe.read()
                    if message:
                        print("Received: '%s'" % message)
                        self.resDict['success'] = message
                        break
                except:
                    pass

tResSer = {'success': '0'}
tResWeb = {'success': '0'}
tResUisvc = {'success': '0'}

threads = []

pipePathSer = '/tmp/path1'
pipePathWeb = '/tmp/path2'
pipePathUisvc = '/tmp/path3'

th1 = myThread(1, pipePathSer, tResSer)
th2 = myThread(2, pipePathWeb, tResWeb)
th3 = myThread(3, pipePathUisvc, tResUisvc)

th1.start()
th2.start()
th3.start()

threads.append(th1)
threads.append(th2)
threads.append(th3)

for t in threads:
    t.join()

print("Res: tResSer %s tResWeb %s tResUisvc %s" % (tResSer, tResWeb, tResUisvc))
# The above statement prints the updated values, which can then be processed further

The following wrapper function will wrap an existing function and return an object that points to the thread (so that you can call start(), join(), etc. on it) as well as to its eventual return value.

import threading

def threadwrap(func, args, kwargs):
    class res(object): result = None
    def inner(*args, **kwargs):
        res.result = func(*args, **kwargs)
    t = threading.Thread(target=inner, args=args, kwargs=kwargs)
    res.thread = t
    return res

def myFun(v, debug=False):
    import time
    if debug: print("Debug mode ON")
    time.sleep(5)
    return v * 2

x = threadwrap(myFun, [11], {"debug": True})
x.thread.start()
x.thread.join()
print(x.result)

It looks OK, and the threading.Thread class seems to be easily extended(*) with this kind of functionality, so I'm wondering why it isn't already there. Is there a flaw with the above method?

(*) Note that husanu's answer for this question does exactly this, subclassing threading.Thread resulting in a version where join() gives the return value.

For simple programs, the above answers look a little bit like overkill to me. I would refine the mutable approach:

from threading import Thread

class RetVal:
    def __init__(self):
        self.result = None


def threadfunc(retVal):
    retVal.result = "your return value"

retVal = RetVal()
thread = Thread(target=threadfunc, args=(retVal,))

thread.start()
thread.join()
print(retVal.result)
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow