Question

I have a use case where I would like control over how and when Celery workers dequeue tasks from RabbitMQ. Dequeueing must be synchronized with an external event that happens outside the Celery context, and my concern is whether Celery gives me any flexibility to control when tasks are dequeued. I investigated, and below are a few possibilities:

  • Use basic.get instead of basic.consume, with basic.get triggered by the external event. However, I see that Celery defaults to basic.consume (push) semantics. Can I override this behavior without modifying the core directly?

  • Send a custom remote control command to the workers when the external event is triggered. However, the docs don't make it clear how remote control commands can help me control dequeueing of tasks.

I am very much inclined to continue using Celery and, if possible, avoid writing a custom queue-processing solution on top of AMQP.


Solution

With remote control commands you can pause or resume message consumption from a given queue.

celery.control.cancel_consumer('celery')

The command above instructs all workers to stop consuming (dequeueing) messages from the default celery queue.

celery.control.add_consumer('celery')

The second command resumes consumption. Remote commands also accept a destination argument, which lets you send the request to specific workers only.

OTHER TIPS

Two more exotic options to consider: (1) define a custom exchange type at the RabbitMQ layer, which lets you create routing rules that control which tasks are sent to which queues; (2) define a custom Celery mediator, which lets you control which tasks move from the queues to the worker pool, and when.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow