Question

At one point in an RoR application I'm building, an admin may upload a (relatively small) file that then needs to be processed (which takes a relatively long time). As this app is running on Heroku, I'd like to handle the processing on a worker dyno so as not to tie up the web dyno. We're using the delayed_job gem for queuing.

Wondering what the best way to do this would be. A couple of general approaches I've considered:

  • Have the file uploaded to S3 and have the worker grab it from S3. Cons: I'd rather not add an S3 dependency for various reasons, and we're not keeping the files around after processing anyway.

  • Read the file into a string, which then gets stored in the DB by delayed_job when passed as an argument to the delayed method. The processing job can then convert it back into the needed format.

Are there any downsides to this second solution? It seems like it could work, but not sure if it's ideal or not. Any other recommended ways to do things?
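Roughly what I have in mind for the second option (class and method names here are just placeholders, not actual code from the app):

    class UploadsController < ApplicationController
      def create
        # File is small, so reading it fully into memory is acceptable
        contents = params[:file].read

        # delayed_job serializes the string argument into the delayed_jobs table
        FileProcessor.delay.process(contents)

        redirect_to admin_path, notice: "File queued for processing"
      end
    end

    class FileProcessor
      def self.process(contents)
        # Convert the string back into the needed format and do the slow work here
      end
    end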


Solution

You have two options: a file store like S3 or a database.

Storing files on S3 is inexpensive and reliable, and it's straightforward to pass the file's URL or key from your front-end web processes to your back-end workers. I strongly recommend this.
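As a rough sketch of that hand-off (assumes the aws-sdk-s3 gem and ENV-based AWS credentials; the bucket name, key prefix, and worker class are placeholders):

    require "aws-sdk-s3"
    require "securerandom"

    # Web dyno: push the upload to S3 and enqueue only the key.
    s3  = Aws::S3::Client.new
    key = "uploads/#{SecureRandom.uuid}"
    s3.put_object(bucket: "my-app-uploads", key: key, body: params[:file].read)
    S3Processor.delay.process(key)

    # Worker dyno: fetch the object by key, process it, then delete it
    # since the file isn't kept around after processing.
    class S3Processor
      def self.process(key)
        s3   = Aws::S3::Client.new
        data = s3.get_object(bucket: "my-app-uploads", key: key).body.read
        # ... long-running processing on `data` ...
        s3.delete_object(bucket: "my-app-uploads", key: key)
      end
    end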

You can also store the file contents in a database like Postgres or Redis, which both your front-end web processes and back-end workers can access. Databases are general-purpose engines and usually a more expensive place to store files. This will also work, but I don't recommend it.
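If you do go that route, a minimal sketch looks like this (assumes a hypothetical pending_uploads table with a binary contents column; the model name is a placeholder):

    class PendingUpload < ActiveRecord::Base
      def process
        # Long-running work on `contents` goes here...
        destroy  # clean up the row once processing is done
      end
    end

    # Web dyno: persist the bytes, then enqueue the job by record.
    upload = PendingUpload.create!(contents: params[:file].read)
    upload.delay.process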
