I would like to build a set of cluster workers (i.e. droplets on DigitalOcean or similar).
Each worker would run a periodic task and send its results to the main application.
Here is pseudocode to demonstrate the functionality:
Worker code

import time

while True:
    resultFromLocationXY = calculate_my_local_task()
    send_task_to_the_main_application(resultFromLocationXY)
    time.sleep(5)
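To make the idea above concrete, here is a minimal runnable sketch of the worker loop. It is only a local stand-in: `results_queue` replaces the real network transport (a socket, message broker, etc.), and `calculate_my_local_task` is a placeholder for the actual computation.

```python
import queue
import time

# Hypothetical stand-in for the real network transport; the worker
# just pushes (location, result) pairs onto it.
results_queue = queue.Queue()

def calculate_my_local_task(location):
    # Placeholder computation; the real periodic task would run here.
    return f"result from {location}"

def worker(location, iterations=3):
    # Periodically compute a result and ship it to the main application.
    for _ in range(iterations):
        result = calculate_my_local_task(location)
        results_queue.put((location, result))
        time.sleep(0.1)  # would be 5 seconds in the real deployment

worker("xy")
```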
Main application code

In the main application I would like to evaluate the worker results asynchronously:

import time

while True:
    # non-blocking: if new results are available, update the variable
    resultFromLocationXY = listen_to_results_from_location('xy')
    process_the_results([resultFromLocationXY, resultFromLocationXX, ...])
    time.sleep(5)
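The non-blocking listen step can be sketched like this, again with a local `queue.Queue` standing in for the network transport and a `latest` dict (a hypothetical name) holding the newest result per location:

```python
import queue

results_queue = queue.Queue()  # stand-in for the network transport
latest = {}  # newest result per location, updated as messages arrive

def listen_to_results(q, latest):
    # Non-blocking drain: pull every message that has arrived so far,
    # keeping only the newest result per location. Returns immediately
    # when the queue is empty instead of blocking the main loop.
    while True:
        try:
            location, result = q.get_nowait()
        except queue.Empty:
            break
        latest[location] = result

# Simulate two workers each sending one update.
results_queue.put(("xy", 1))
results_queue.put(("xx", 2))

listen_to_results(results_queue, latest)
results_to_process = [latest[loc] for loc in sorted(latest)]
```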
I have been using the IPython ipcluster solution. I was able to create a remote worker and execute a task with the apply_async function, and to arrange it all in a non-blocking way.
BUT: I was not able to set up periodic, streaming-style tasks. Moreover, I would like to run several nodes in one location and have them all stream into the same variable in the main application.
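By "stream to the same variable" I mean something like the following sketch, where several nodes at the same location all write into one shared slot (last write wins). The lock and the `latest` dict are assumptions for illustration; in the real setup the writes would arrive over the network rather than from threads.

```python
import threading

lock = threading.Lock()
latest = {}  # one shared slot per location

def node(location, node_id, value):
    # Each node at a location overwrites the shared slot with its
    # newest value; the main loop only ever reads the latest one.
    with lock:
        latest[location] = (node_id, value)

# Three nodes in location "xy" reporting concurrently.
threads = [threading.Thread(target=node, args=("xy", i, i * 10))
           for i in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```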
I would prefer a non-IPython solution.