Question

I have a Ruby app hosted on Heroku that runs Anemone (Ruby web spider / crawler) on user-specified domains. When the user picks a medium-to-large sized domain, it crashes and the logs show an H12 error (Request timeout).

This is because Anemone takes a while to run (>30 seconds), and there is no feedback to the user while it's running. Is there a way I can get Anemone to show updates to the user, or a way to incorporate a status bar? Some way to prevent the site from crashing? I didn't see anything in the Anemone docs to allow a "piece by piece" way of spidering a domain, but there must be something I can do.

https://devcenter.heroku.com/articles/error-codes#h12-request-timeout
http://anemone.rubyforge.org/


Solution

Can you just run the crawl in the background and notify the user when it is ready? I've used delayed_job and Sidekiq for things like this in the past. Take a look at some of the background job gems.
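
As a rough illustration of that approach, here is a minimal sketch of a Sidekiq worker that runs the Anemone crawl outside the web request and records progress as it goes. The `Crawl` model, its attributes, and the 25-page update interval are assumptions for the example, not part of the original question; only `Sidekiq::Worker`, `Anemone.crawl`, and `on_every_page` are real library APIs.

    require 'sidekiq'
    require 'anemone'

    class CrawlWorker
      include Sidekiq::Worker

      def perform(crawl_id, domain)
        crawl = Crawl.find(crawl_id)   # hypothetical ActiveRecord model holding status/progress
        pages_seen = 0

        Anemone.crawl(domain) do |anemone|
          # on_every_page is Anemone's per-page callback; use it to persist
          # progress so the UI can poll and show a status bar.
          anemone.on_every_page do |page|
            pages_seen += 1
            if pages_seen % 25 == 0
              crawl.update(pages_crawled: pages_seen, last_url: page.url.to_s)
            end
          end
        end

        crawl.update(status: 'finished', pages_crawled: pages_seen)
      end
    end

The controller then only enqueues the job and returns immediately, so the request finishes well inside Heroku's 30-second window (again, the model and routes are illustrative):

    def create
      crawl = Crawl.create!(domain: params[:domain], status: 'queued')
      CrawlWorker.perform_async(crawl.id, crawl.domain)
      redirect_to crawl_path(crawl)   # show page can poll this record for progress
    end

The show page can poll the `Crawl` record (or push updates over something like ActionCable) to display progress to the user. Note that running Sidekiq on Heroku requires a Redis add-on and a separate worker dyno.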

Licensed under: CC-BY-SA with attribution