Question

We've got a Rails app where certain requests trigger long-running tasks on a worker dyno with delayed_job, while the front-end polls until it receives a result.
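
To make the flow concrete, it looks roughly like this (Report, its status/result columns, and generate! are placeholder names, not our real code): the controller enqueues the work through delayed_job's delay proxy, and the front-end polls a status endpoint until the job has finished.

app/controllers/reports_controller.rb:

class ReportsController < ApplicationController
  # POST /reports -- enqueue the long-running task for the worker dyno
  def create
    report = Report.create!(status: "pending")
    # delayed_job's delay proxy records the method call in the jobs table
    report.delay.generate!
    render json: { id: report.id }, status: :accepted
  end

  # GET /reports/:id -- the front-end polls this until status is "done"
  def show
    report = Report.find(params[:id])
    render json: { status: report.status, result: report.result }
  end
end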

Our traffic is small but growing. The tasks generally take only a few seconds to complete, but they can take up to a minute. Right now a single web dyno and a single worker dyno should be sufficient to handle our load.

The problem is that a single delayed_job worker processes the queue serially, so a longer task ends up holding up the jobs behind it.

What we're looking for is something like Unicorn, but for the background workers: a way for a single worker dyno to process multiple jobs concurrently. And since it's Rails, we're looking for something multi-process, not multi-threaded.

(We tried creating multiple worker entries in our Procfile. This worked on the local dev box, but not on Heroku.)

Procfile:

web: bundle exec unicorn -p $PORT -c ./config/unicorn.rb
# Multiple worker entries here don't translate into multiple processes on a single worker dyno:
worker: bundle exec rake jobs:work
worker: bundle exec rake jobs:work
worker: bundle exec rake jobs:work

This Heroku article suggests the resque-pool gem as a solution. Is that the only option, or can delayed_job also be made to process background jobs in parallel?

Solution

According to @radiospiel's related post, you can use foreman to start multiple processes.

1) Add foreman to your Gemfile
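
In case it helps, the Gemfile change is just one line (then run bundle install):

Gemfile:

gem "foreman"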

2) Create two files:

Procfile:

web: bundle exec unicorn -p $PORT -c ./config/unicorn.rb
worker: bundle exec foreman start -f Procfile.workers

Procfile.workers:

dj_worker: bundle exec rake jobs:work
dj_worker: bundle exec rake jobs:work
dj_worker: bundle exec rake jobs:work

I just deployed this to Heroku and it works great.
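
For what it's worth, my understanding of why this works: foreman turns each dj_worker line into its own OS process inside the single worker dyno, and delayed_job locks each job row (via the locked_by/locked_at columns) before running it, so the three processes can safely pull different jobs from the same queue in parallel.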

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow