Question

I have a Rails 3 app that needs to generate an image and send the data to the browser.

The app must be deployed on Heroku.

However, Heroku only supports streaming through Mongrel, which holds on to the memory. This causes Heroku to slow down and then kill the thread after a dozen or so requests.

https://devcenter.heroku.com/articles/error-codes#r14-memory-quota-exceeded

I am currently using send_data or send_file from ActionController::DataStreaming.

http://api.rubyonrails.org/classes/ActionController/DataStreaming.html#method-i-send_data

Heroku does not support Rack::Sendfile or x-sendfile.

https://devcenter.heroku.com/articles/rack-sendfile

The project "ruby-mongrel-x-sendfile" says: "Streaming very much data through mongrel is a bad thing; springs stringy memory leaks" and provides an "in-mongrel solution". But it doesn't look like a good solution.

http://code.google.com/p/ruby-mongrel-x-sendfile/

A slow solution to this is to upload every file to Amazon S3 first.
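
For illustration, that workaround would look something like the sketch below: upload the generated bytes to S3 and redirect the browser there instead of streaming through the dyno. The bucket name, region, helper name, and the use of the aws-sdk-s3 gem (which is newer than the Rails 3 era of this question) are my assumptions, not part of the app:

    require 'aws-sdk-s3'

    class ImagesController < ApplicationController
      def show
        image_data = generate_image # assumed helper returning a binary string

        # Push the bytes to S3 so the dyno never has to stream them itself...
        object = Aws::S3::Resource.new(region: 'us-east-1')
                                  .bucket('my-image-bucket')
                                  .object("generated/#{params[:id]}.png")
        object.put(body: image_data, content_type: 'image/png', acl: 'public-read')

        # ...and let the browser fetch the file directly from S3.
        redirect_to object.public_url
      end
    end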

Does anyone have any ideas please?


Solution

The answer is to start garbage collection with:

GC.start

I placed that line at the bottom of the Rails controller action after send_data.

http://www.ruby-doc.org/core-1.9.3/GC.html
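
As a minimal sketch of that placement (the controller, action, and generate_image helper names are my own illustrative assumptions):

    class ImagesController < ApplicationController
      def show
        # Build the image in memory and hand it to the response as before.
        send_data generate_image, :type => 'image/png', :disposition => 'inline'

        # Trigger a garbage-collection run immediately afterwards so the large
        # image string is reclaimed before the next request, keeping the dyno
        # under Heroku's memory quota.
        GC.start
      end
    end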

Other tips

The answer is absolutely not to start garbage collection. That only masks the poor implementation. Your Ruby process will still consume more memory than is strictly necessary.

The answer is to stream the response data, i.e. read the data chunk by chunk into memory and flush each chunk through the response body. This way the maximum memory required to serve the file/data you are sending is limited to the size of the chunk being streamed.

Check out ActionController::Live and read the binary data out in chunks to the client requesting these images.
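
A minimal sketch of that approach, assuming the generated image has been written to a temporary file first; note that ActionController::Live only shipped with Rails 4, so on Rails 3 the same idea has to be expressed through a streaming response body instead. The chunk size and file path are illustrative:

    class ImagesController < ApplicationController
      include ActionController::Live

      CHUNK_SIZE = 16 * 1024 # read 16 KB at a time so peak memory stays small

      def show
        response.headers['Content-Type'] = 'image/png'

        # Assumed location of the generated image on disk.
        path = Rails.root.join('tmp', 'images', "#{params[:id]}.png")

        File.open(path, 'rb') do |file|
          # Read one chunk at a time and flush it straight to the client
          # instead of loading the whole image into memory.
          while (chunk = file.read(CHUNK_SIZE))
            response.stream.write(chunk)
          end
        end
      ensure
        # Always close the stream, or the connection is left hanging.
        response.stream.close
      end
    end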
