Question

I would like to do performance testing on a Rails app.

The real-world data is 100 MB in size.

But Rails always rebuilds the test database, which overwrites the real-world data.

So how do I do the performance testing?

Solution

I would create a new environment called "performance". You need this to replicate the production settings of your app (class caching, templates, etc.) and then load the database. In the past I have created a database specifically for performance testing, written a rake task that runs the necessary migrations and data loading, and then called the Rails performance script.
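If it helps, here is a rough sketch of such a rake task. The task name, the "performance" entry in config/database.yml, the config/environments/performance.rb file (typically copied from production.rb) and the dump file path are all assumptions for illustration, not part of the original setup.

# lib/tasks/performance.rake
namespace :performance do
  desc "Prepare the performance database with real-world data"
  task :prepare do
    # Run pending migrations against the dedicated performance database...
    system("rake db:migrate RAILS_ENV=performance") || abort("migrations failed")

    # ...then load the real-world dump on top of the fresh schema
    # (assumes MySQL and a dump at db/real_world_data.sql).
    system("mysql myapp_performance < db/real_world_data.sql") || abort("data load failed")
  end
end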

You can also turn the fixture behaviour off in your tests - it depends on which test framework you are using.
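For the classic Test::Unit / minitest setup, that can be as simple as the following sketch in test/test_helper.rb (the setting name below is the pre-Rails 5 one; Rails 5+ calls it use_transactional_tests):

# test/test_helper.rb
class ActiveSupport::TestCase
  # Simply omit the usual `fixtures :all` line so nothing is loaded
  # over the real-world data already in the database.

  # Optionally run tests outside a wrapping transaction so timings
  # include real commits.
  self.use_transactional_fixtures = false
end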

I also found this useful post on Running Rails performance tests on real data that has some details on this approach.

OTHER TIPS

I have a quick fix for SQLite users.

In your TestCase:

def setup
  # Overwrite the freshly built test database with a copy of the
  # development database, which holds the real-world data (SQLite only).
  `cp db/development.sqlite3 db/test.sqlite3`
end

I would deploy the app to a staging server (one that is close to your production environment) and generate data in your database for more accurate testing. You can take a look at the ffaker gem for generating fake data. Then use a third-party tool to hit your application, because tools that you run on the server itself will affect the performance as well. I prefer JMeter as a load testing tool. You can create test cases with it.
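A minimal seeding sketch with ffaker might look like this; the User model, its attributes and the record count are assumptions for illustration (older ffaker versions use the Faker namespace instead of FFaker):

# db/seeds.rb
require 'ffaker'

# Generate a realistic volume of fake users for load testing.
10_000.times do
  User.create!(
    name:  FFaker::Name.name,
    email: FFaker::Internet.email
  )
end

Run it on the staging box with rake db:seed RAILS_ENV=staging (or whatever environment that server uses).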

For example, say you want to test your login page: you can set the login parameters and POST them to the login URL. Consider writing tests for the pages which have write operations to your database; those will probably be your app's bottleneck.
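If you want to sanity-check the request your JMeter test case will send, a throwaway Ruby script can fire the same POST from a machine other than the app server; the host, path and parameter names below are assumptions, so match them to your app:

# post_login.rb
require 'net/http'
require 'uri'

uri = URI('http://staging.example.com/login')

# Send the same form-encoded login POST that the load test will issue.
response = Net::HTTP.post_form(uri, 'email' => 'user@example.com',
                                    'password' => 'secret')
puts "#{response.code} #{response.message}"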

JMeter User Manual

JMeter Tutorial

Hope this helps.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow