Question

So I already looked around a lot for this but couldn't find a good answer. I'm using Celery 3.1.7 and Django 1.5.1, without the django-celery package since newer versions of Celery no longer require it. I managed to set up tasks and execute them using RabbitMQ, and everything works as it should there. However, I'm integrating this into an existing, quite large, Django project that uses several settings files rather than just one. We run a different one depending on the environment, for instance one for local machines and one for the server. My problem is that I can't track down which settings file is "active" from the celery worker, which runs the celery.py file in my project root (as the documentation specifies). There the documentation has you specify the Django settings module like this:

os.environ.setdefault('DJANGO_SETTINGS_MODULE', "project.settings.server")

Now this works, but if I move the code to a local machine I need to change it to settings.local, and that every time. Reading the settings object at runtime, like I do in standard Django files, didn't work since the celery worker executes in a different process. So, given this situation, does anyone have an idea how to dynamically fetch the active Django settings file from the celery worker? Or perhaps pass it in as a variable when starting the worker (like Django's --settings=project.settings.local)? Thanks!
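
For context, my celery.py follows the layout from the Celery 3.1 Django docs, roughly like this (the app name and settings path are just my project's, shown for illustration):

# project/celery.py -- roughly what the Celery 3.1 Django docs prescribe
from __future__ import absolute_import

import os

from celery import Celery
from django.conf import settings

# This is the line I have to keep editing between environments:
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings.server')

app = Celery('project')
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)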


Solution 2

One idea could be to have Django load the correct settings file every time and let celery use the one Django is using. That's how I'm doing it.

Say you have the project structure:

project/
    proj/
        settings.py
        urls.py
        ...

Replace with

project/
    proj/
        settings/
            __init__.py
            local.py
            production.py
            common_settings.py
            ...
        urls.py
        ...

Let common_settings.py hold the settings shared across all environments, and let your __init__.py load whichever config should be used.

# __init__.py:

from .common_settings import *

# Pick the environment-specific settings, e.g. by environment variable or hostname.
# Example for checking the hostname:
from platform import node

if node() in ['dev1', 'dev2']:
    from .local import *
elif node() in ['prod1', 'prod2']:
    from .production import *

Now you can always rely on the settings package (proj.settings in this layout) resolving to the right settings for your environment.
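
With that in place, celery.py no longer needs to switch between environments; the setdefault line from the question can simply point at the single settings package (a sketch, module name follows the layout above):

# project/proj/celery.py -- only the settings line changes; the environment
# choice now happens inside proj/settings/__init__.py
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')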

Other tips

When initializing the celery worker on the command line, just set the environment variable prior to the celery command.

DJANGO_SETTINGS_MODULE='proj.settings' celery -A proj worker -l info

setdefault() returns the existing value of the variable if it is already present in the environment; it sets and returns "project.settings.server" only if DJANGO_SETTINGS_MODULE is not defined.
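
A quick illustration of that behaviour (plain Python, the values are just examples):

import os

# Unset: setdefault() stores the default and returns it.
os.environ.pop('DJANGO_SETTINGS_MODULE', None)
print(os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings.server'))
# -> project.settings.server

# Already set: the existing value wins and the default is ignored.
os.environ['DJANGO_SETTINGS_MODULE'] = 'project.settings.local'
print(os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings.server'))
# -> project.settings.local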

So, I would leave the most frequently used settings module in there and change it when needed by explicitly declaring the environment variable:

e.g. for local development, in your virtualenv hooks, in your .bashrc, manually, etc.:

export DJANGO_SETTINGS_MODULE=project.settings.local

Found this buried in the Celery docs:

# project/app/tasks.py
from celery import Celery

app = Celery()
app.config_from_object('django.conf:settings')

http://celery.readthedocs.org/en/latest/whatsnew-3.1.html#django-supported-out-of-the-box

This will simply pull the Celery settings from the Django settings file, which is usually what you want for environment separation.
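
In practice that means each environment's settings file can carry its own Celery configuration. A sketch with made-up broker URLs, using the old-style setting names that Celery 3.1 reads from Django settings:

# proj/settings/local.py (illustrative values)
BROKER_URL = 'amqp://guest:guest@localhost:5672//'
CELERY_TASK_SERIALIZER = 'json'

# proj/settings/production.py (illustrative values)
BROKER_URL = 'amqp://celery:secret@rabbit.example.com:5672//'
CELERY_RESULT_BACKEND = 'amqp'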

Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow