Question

I have a Django application which uses django-celery, Celery and RabbitMQ for offline, distributed processing.

Now the setup is such that I need to run the celery tasks (and in turn the celery workers) on other nodes in the network, different from the one where the Django web app is hosted.

To do that, as I understand it, I will need to place all my Django code on these separate servers. Not only that, I will also have to install all the other Python libraries that the Django apps require.

This way I would have to transfer all the Django source code to every possible server in the network, install the dependencies, and run some kind of update system that keeps the sources in sync across nodes.

  1. Is this the right way of doing things? Is there a simpler way of making the celery workers run outside the web application server where the Django code is hosted?

  2. If there is indeed no way other than to copy the code and replicate it on all servers, is there a way to copy only the source files that the celery tasks need? (That would still include all the models and views, so it is not a small task either.)


Solution

For this type of situation I have in the past made an egg of all of my celery task code that I can simply rsync (or copy in some other fashion) to my worker nodes. That way you edit your celery code in a single project that is used both by your Django app and on your worker nodes.
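A minimal sketch of what such a package might look like (the project name webapp-celery-tasks, the module webapp_celery_tasks, and the add task are all hypothetical):

    # webapp-celery-tasks/setup.py -- packaging script for the shared task code;
    # build an egg with "python setup.py bdist_egg" (or a wheel with "pip wheel .")
    from setuptools import setup, find_packages

    setup(
        name="webapp-celery-tasks",
        version="0.1.0",
        packages=find_packages(),
        install_requires=["celery"],  # plus whatever the tasks themselves import
    )

    # webapp-celery-tasks/webapp_celery_tasks/tasks.py -- an example shared task
    from celery import shared_task

    @shared_task
    def add(x, y):
        return x + y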

So in summary: create a web-app-celery-tasks project, make it into an installable egg, and have the web-app package depend on the celery-tasks egg.
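A rough sketch of how both sides could consume that package, assuming a Celery app defined in webapp_celery_tasks/celery_app.py and a RabbitMQ broker reachable from every node (the broker URL below is a placeholder):

    # webapp_celery_tasks/celery_app.py -- Celery app shared by the web server
    # and the worker nodes; both point at the same RabbitMQ broker
    from celery import Celery

    app = Celery(
        "webapp_celery_tasks",
        broker="amqp://guest:guest@rabbitmq-host:5672//",  # placeholder broker URL
    )
    app.autodiscover_tasks(["webapp_celery_tasks"])  # finds webapp_celery_tasks.tasks

    # somewhere in the Django project (which depends on the egg) -- enqueue a task
    from webapp_celery_tasks.tasks import add

    result = add.delay(2, 3)  # message goes to RabbitMQ, runs on a worker node

A worker node then only needs the egg installed and can be started with something like celery -A webapp_celery_tasks.celery_app worker --loglevel=info, without the full Django project being present (as long as the tasks themselves do not import Django models; if they do, those app modules have to go into the shared package as well).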

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow