Say I have an application that needs to pull data from an API, but there is a limit to how often I can send a query (i.e., it caps at X requests/minute). To ensure I don't hit this limit, I want to add requests to a queue and have a background job that pulls X requests and executes them every minute. I'm not sure what the best method for this is in Rails, however. From what I gather, DelayedJob is the better library for my needs, but I don't see any support for running only X jobs a minute. Does anyone know if there is a preferred way of implementing functionality like this?
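Before reaching for a gem, the mechanics can be shown in plain Ruby: requests go into a thread-safe queue, and a background thread drains at most `limit` of them per `interval` seconds. This is a minimal sketch, assuming requests can be wrapped as callables; `RateLimitedQueue` and its method names are made up for illustration:

```ruby
# Plain-Ruby sketch of a per-minute cap (all names are illustrative).
class RateLimitedQueue
  def initialize(limit:, interval: 60)
    @limit    = limit
    @interval = interval
    @queue    = Queue.new        # thread-safe FIFO from Ruby's stdlib
  end

  # Enqueue work as a block instead of calling the API directly.
  def enqueue(&request)
    @queue << request
  end

  def start
    @worker = Thread.new do
      loop do
        batch = []
        # Single consumer, so the empty? check is race-free here.
        batch << @queue.pop until batch.size == @limit || @queue.empty?
        batch.each(&:call)       # execute up to `limit` requests...
        sleep @interval          # ...then wait out the rest of the window
      end
    end
  end

  def stop
    @worker&.kill
  end
end
```

In a real Rails app you would persist the queue (database, Redis) rather than hold it in memory, but the drain-then-sleep loop is the core of every solution below.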


Solution 3

Maybe you can try [whenever](https://github.com/javan/whenever).

Then you can add your tasks as below:

every 3.hours do
  runner "MyModel.some_process"
  rake "my:rake:task"
  command "/usr/bin/my_great_command"
end

in your config/schedule.rb file.
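Applied to the question's per-minute cap, the schedule could simply invoke a runner every minute that drains up to X queued requests. A sketch, where `ApiRequest.process_batch!` is a hypothetical method that pops and executes the pending requests:

```ruby
# config/schedule.rb -- sketch; ApiRequest.process_batch! is a
# hypothetical method that pops up to X pending requests and sends them.
every 1.minute do
  runner "ApiRequest.process_batch!"
end
```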

Other tips

I'm a little late, but I would like to warn against using the whenever gem in your situation: since you're running Ruby on Rails, each cron invocation generated by whenever will load the entire Rails environment from scratch.

Give rufus-scheduler a try.

Place the code below, for example, in config/initializers/cron_stuff.rb

require 'rufus-scheduler'

scheduler = Rufus::Scheduler.new

scheduler.every '20m' do
  puts 'hello'
end
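For the question's use case, the same scheduler could drain the queue once a minute. A sketch assuming a persisted `PendingRequest` model with an `execute!` method (both hypothetical names) and an assumed cap of 10 requests per minute:

```ruby
# config/initializers/api_throttle.rb -- sketch only.
require 'rufus-scheduler'

API_RATE_LIMIT = 10   # assumed per-minute cap ("X" in the question)

scheduler = Rufus::Scheduler.new

scheduler.every '1m' do
  # PendingRequest and #execute! are hypothetical application code.
  PendingRequest.limit(API_RATE_LIMIT).each do |request|
    request.execute!
    request.destroy
  end
end
```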

First, I would recommend using Sidekiq for processing background jobs. It's well supported and very simple to use. If you do use Sidekiq, then there is another gem, called Sidetiq, that will allow you to run recurring jobs.
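As a sketch of how that combination might look (the worker name and drain logic are hypothetical; Sidetiq's `recurrence` DSL is backed by ice_cube):

```ruby
# Hypothetical recurring worker using Sidekiq + Sidetiq.
class ApiDrainWorker
  include Sidekiq::Worker
  include Sidetiq::Schedulable

  recurrence { minutely }   # schedule this worker once every minute

  def perform
    # Pop up to X pending requests from your queue and execute them here.
  end
end
```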

Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow