Question

I have recently installed Nginx + Thin on my deployment server, but I am not sure how it will perform under a heavy request/response load, let's say 1000 requests per second.

Right now the speed on Thin is good at 10-100 requests per second.

I wanted to know how it will hold up when higher volumes of data are being processed across the request/response cluster.

Guide me on this :-)


Solution

If you have a single server, I think the main key, apart from everything already mentioned, is not to skimp on its specs. Trying to get too much to run on too little is just a recipe for disaster.

It is also a good idea to get monit or God monitoring your thin instances. I started out with God, but it leaked memory pretty badly on Ruby 1.8.6, so I stopped using it in favour of monit. Monit is written in C, I believe, and has a tiny memory footprint, so I'd recommend that one.
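
For what it's worth, a minimal monit stanza for a single thin instance looks roughly like the sketch below; the thin path, pidfile location, port, and thresholds are just placeholders you would adapt to your own deployment.

    # Hypothetical paths and port -- adjust to your app's layout.
    check process thin_3000 with pidfile /var/www/myapp/tmp/pids/thin.3000.pid
      start program = "/usr/bin/thin start -C /etc/thin/myapp.yml -o 3000"
      stop program  = "/usr/bin/thin stop -C /etc/thin/myapp.yml -o 3000"
      # Restart the instance if it starts leaking memory or spinning.
      if totalmem > 150.0 MB for 3 cycles then restart
      if cpu > 90% for 5 cycles then restart
      group thin

You would repeat the stanza for each port in your thin cluster (3001, 3002, and so on).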

If all that seems like a bit much just to keep nginx and thin playing nicely, you may want to look into an all-in-one solution like Passenger or LiteSpeed. I have very little experience with these, so I can offer no substantial advice on them.

OTHER TIPS

Multiple thin processes behind nginx are capable of providing lots of speed, depending on what your application is doing. So the bottleneck will be your application code, the speed of your application server, and your database server.
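
For reference, that setup is usually just an upstream block of thin ports behind a proxying server block, something along these lines; the ports, hostname, and paths here are made up for illustration, and it assumes an nginx new enough to have try_files:

    upstream thin_cluster {
      server 127.0.0.1:3000;
      server 127.0.0.1:3001;
      server 127.0.0.1:3002;
    }

    server {
      listen 80;
      server_name example.com;
      root /var/www/myapp/public;

      location / {
        # Serve static files (and cached pages) straight from disk,
        # hand everything else to the thin cluster.
        try_files $uri $uri/index.html $uri.html @thin;
      }

      location @thin {
        proxy_set_header Host $http_host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_pass http://thin_cluster;
      }
    }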

Scaling Rails has recently been covered in depth by the Scaling Rails Screencasts. I recommend you start there. My 5-step program for scaling Rails would be:

  1. The first step is to have the tools to see what is slow in your application. Do not spend time optimizing everything in your application when you don't know where the problem is.
  2. The easiest way to handle lots of requests per second is with page caching (steps 2-4 are sketched in code after this list).
  3. If you can't do that, cache everything possible (fragment caching, use memcached to cache data, etc.) to speed up your application.
  4. After that, optimize your application as well as possible: make SQL queries fast, index everything, etc.
  5. If you still need more speed, throw more hardware at the problem. Get a big, powerful database server, a bunch of app servers, and proxy your requests across them. You can start here, too, but it will only delay the optimization process.
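
To make steps 2 and 3 a bit more concrete, here is a rough Rails 2.x-flavoured sketch; the ProductsController, Product model, and cache key are invented for illustration, and the exact API depends on your Rails version and cache store:

    # app/controllers/products_controller.rb
    class ProductsController < ApplicationController
      # Step 2: page caching -- the rendered HTML is written to public/,
      # so nginx can answer repeat requests without touching Rails at all.
      caches_page :show

      def index
        # Step 3: cache expensive data in memcached (config.cache_store =
        # :mem_cache_store) instead of hitting the database on every request.
        @top_products = Rails.cache.fetch("products/top", :expires_in => 10.minutes) do
          Product.find(:all, :order => "sales_count DESC", :limit => 10)
        end
      end
    end

Fragment caching in your views works the same way via the cache helper.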
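
And for step 4, a sketch of the kind of migration I mean; the table and column names are, again, just examples:

    # db/migrate/xxx_add_indexes_for_hot_queries.rb
    class AddIndexesForHotQueries < ActiveRecord::Migration
      def self.up
        # Index the columns your WHERE clauses and ORDER BYs actually use.
        add_index :products, :sales_count
        add_index :products, [:category_id, :created_at]
      end

      def self.down
        remove_index :products, :sales_count
        remove_index :products, :column => [:category_id, :created_at]
      end
    end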