Question

If your Sinatra route returns an 'each-able' object, Sinatra's event loop will call #each on it and yield the results in a streaming fashion as the HTTP response. However, if there are concurrent requests, Sinatra will iterate through all the elements of one response before handling another request. If we have a cursor over the results of some DB query, that means another request has to wait until all of that data has been streamed before it is handled.
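
For reference, here is a minimal sketch of what I mean by an 'each-able' body (the SlowBody class, the /rows route, and the number range are illustrative names only; the range stands in for a DB cursor):

require 'sinatra/base'

class SlowBody
  def initialize(rows)
    @rows = rows
  end

  # Rack calls #each and writes every yielded chunk to the socket as it arrives.
  def each
    @rows.each { |row| yield "#{row}\n" }
  end
end

class StreamApp < Sinatra::Base
  get '/rows' do
    # Returning an object that responds to #each makes Sinatra use it as the body.
    SlowBody.new(1..5)
  end
end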

I've looked at the async-sinatra gem and http://macournoyer.com/blog/2009/06/04/pusher-and-async-with-thin/, thinking these would solve my problem. I tried out this example:

require 'sinatra/async'

class AsyncTest < Sinatra::Base
  register Sinatra::Async

  aget '/' do
    body "hello async"
  end

  aget '/delay/:n' do |n|
    EM.add_timer(n.to_i) { body { "delayed for #{n} seconds" } }
  end
end

and the /delay/5 request doesn't behave concurrently as I expect: when I make 3 requests at once, Chrome's debugger reports the response times as roughly 5, 10, and 15 seconds.

Am I missing some setup or is there another way to tell Sinatra/Thin to handle requests in a concurrent manner?

Update: Here's another wrench in the works (or possibly something that clears things up): running curl -i http://localhost:3000/delay/5 concurrently shows the correct behavior (2 requests each come back in ~5 seconds). Running ab -c 10 -n 50 http://localhost:3000/delay/5 (the Apache benchmark utility) also returns a reasonable total time (~25 seconds). Firefox exhibits the same behavior as Chrome. What are the browsers doing differently from the command-line utilities?


Solution

So in the end, I found that the example did indeed work and that I could get Sinatra to stream each-able results concurrently, primarily using the EM.defer idea from the Pusher and Async blog post. curl and Apache benchmarking confirmed that this was working.
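
Roughly, the pattern looks like this (a sketch assuming async_sinatra on Thin; run_blocking_query is a hypothetical placeholder for whatever blocking DB call you have):

aget '/rows' do
  # EM.defer runs the blocking operation on EventMachine's thread pool and then
  # invokes the callback on the reactor thread with the operation's return value,
  # so the reactor stays free to accept other requests in the meantime.
  operation = proc { run_blocking_query }
  callback  = proc { |rows| body rows.join("\n") }
  EM.defer(operation, callback)
end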

The reason it didn't work in the browser is that browsers limit the number of connections to the same URL. I knew there was a (low) limit on concurrent connections to a single domain, but not that (seemingly) all connections to a single URI are serialized:

http://maillist.caucho.com/pipermail/resin-interest/2009-August/003998.html

I don't know whether this is configurable; I only see a per-domain setting in Firefox, but that was the issue.

OTHER TIPS

When you are about to handle the request, do something like this:

fork do
  handle request...
  exit 99
end

and if you don't need to wait for the child process to finish, detach it:

child = fork do
  handle request...
  exit 99
end

Process.detach(child)
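
Process.detach just spawns a background thread that reaps the child's exit status for you, so the child doesn't linger as a zombie even though you never call Process.wait on it.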

It's a simple way to handle multiple requests. However, I am not sure what ORM you are using for those DB queries; with multiple processes hitting the database you could run into table/row-level locking problems, if that is what you mean by handling requests...

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow