Question

Should performance tests be instrumented by build tools as part of the build process, or should they live completely outside? If outside, then where in the deployment pipeline should they live?

I currently see performance tests being attached to integration-testing lifecycle phases. While that's fine for now, it doesn't seem quite right, and I'm having difficulty finding a solid answer as to where we should be attaching and running these performance tests.

For the sake of the question, we can assume an environment using Jenkins for CI and Maven as the build tool. We can also assume a Scrum SDLC.


Solution

There are typically two reasons for worrying about performance/load testing: either the team has trouble writing simple algorithms, or specific performance requirements are part of an SLA. The third reason is that members of your team just like to worry about performance, a.k.a. premature optimization.

Performance tests should never be part of your unit testing. Unit tests are, by definition, for testing the correctness of the unit; Bubble Sort and Quick Sort are both valid implementations of a sorting requirement. Even if you have a performance requirement, that requirement will not be on the unit but on a vertical slice of the system, and assuming the bottleneck will be in this particular unit is assuming too much. And if your unit tests take too long to run (ideally under one second, never more than ten), they become useless: developers start running them less frequently, or not at all, or stop writing tests. That leaves two options.
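To make the distinction concrete, here's a minimal sketch (assuming JUnit 5; the test class and sort method are invented for illustration) of what a unit test should assert: the correctness of the result, and nothing about how long it took.

```java
import static org.junit.jupiter.api.Assertions.assertArrayEquals;

import org.junit.jupiter.api.Test;

class SorterTest {

    @Test
    void sortsNumbersAscending() {
        int[] input = {5, 3, 8, 1};

        int[] result = sort(input);

        // Assert correctness only: whether sort() is Bubble Sort or Quick Sort,
        // and whether it took 1 ms or 100 ms, is not this test's concern.
        assertArrayEquals(new int[] {1, 3, 5, 8}, result);
    }

    // Stand-in implementation (a simple bubble sort); swapping in a quick sort
    // would leave the test above untouched.
    private static int[] sort(int[] values) {
        int[] copy = values.clone();
        for (int i = 0; i < copy.length; i++) {
            for (int j = 0; j < copy.length - 1 - i; j++) {
                if (copy[j] > copy[j + 1]) {
                    int tmp = copy[j];
                    copy[j] = copy[j + 1];
                    copy[j + 1] = tmp;
                }
            }
        }
        return copy;
    }
}
```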

Make your performance tests part of your deployment pipeline. Performance tests run after the unit and integration tests as part of every commit (or whatever action triggers your pipeline), and they run against your built artifact. Go this route if an SLA demands certain performance characteristics or if you're having trouble reining in your team's code quality ("Every time Jimmy makes a commit it brings the servers to their knees under heavy load, and we spend days debugging the issue!").
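As a rough illustration of what such a pipeline stage could execute against the deployed artifact, here is a hedged sketch using only the JDK's HttpClient. The endpoint, the request count, and the 500 ms budget are made-up placeholders; a real setup would more likely drive a dedicated tool such as JMeter or Gatling from Jenkins, but the idea is the same: measure, compare to the SLA, and fail the stage on a miss.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class DeployedArtifactLatencyCheck {

    public static void main(String[] args) throws Exception {
        // Placeholder URL of the artifact the pipeline just deployed to a test environment.
        URI endpoint = URI.create("http://test-env.example.com/api/search?q=widgets");
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(endpoint).GET().build();

        List<Long> timingsMillis = new ArrayList<>();
        for (int i = 0; i < 50; i++) {
            long start = System.nanoTime();
            client.send(request, HttpResponse.BodyHandlers.discarding());
            timingsMillis.add((System.nanoTime() - start) / 1_000_000);
        }

        // Approximate 95th percentile of the measured response times.
        Collections.sort(timingsMillis);
        long p95 = timingsMillis.get((int) (timingsMillis.size() * 0.95) - 1);

        // Assumed SLA budget: p95 under 500 ms. A non-zero exit code fails the
        // Jenkins stage and blocks the rest of the pipeline.
        if (p95 > 500) {
            System.err.println("95th percentile " + p95 + " ms exceeds the 500 ms budget");
            System.exit(1);
        }
        System.out.println("95th percentile " + p95 + " ms is within budget");
    }
}
```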

Run your performance tests at night. Performance tests are scheduled to run at a certain time of day. This is a good fit when your performance/load tests take a long time. Maybe you went the former route, but those tests have become a bottleneck to releasing as frequently as you'd like. Now that you've gotten into the habit of running them more often, maybe your team has gotten better at writing simple algorithms, and maybe you can break your load tests apart into "tests that are absolutely required before we deploy to production" and "tests that keep us in good shape, but where problems rarely surface".
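One hedged way to express that split, assuming JUnit 5, is to tag the two groups so the commit pipeline runs only the release-gating subset and the nightly job picks up the rest. The tag names and test class below are invented for illustration:

```java
import org.junit.jupiter.api.Tag;
import org.junit.jupiter.api.Test;

class CheckoutLoadTests {

    @Test
    @Tag("release-gate") // must pass before anything goes to production
    void handlesExpectedPeakTraffic() {
        // ... drive the checkout slice at the expected peak load ...
    }

    @Test
    @Tag("nightly") // keeps us in good shape, but not a release blocker
    void survivesTwelveHourSoak() {
        // ... long-running soak test, far too slow to run on every commit ...
    }
}
```

With Maven, the scheduled Jenkins job could then select one group or the other, for example via the Surefire/Failsafe groups property (something like `mvn verify -Dgroups=nightly`).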

It could be, as Doc Brown mentioned, that your performance tests take a week to run. But that's such an edge case I'll leave you to figure that one out when you get there. Report back here with your findings.

Running performance tests as part of your integration test phase doesn't feel right. Performance and load tests are generally used to track down problems with a vertical slice of your application, so that you have an indication of where to dive in. There wouldn't be much value in mocking out services, because performance is a fickle beast: those services will respond differently under different loads. However, there may well be some third-party endpoint whose provider can't give you a sandbox. In that case you can take the production response time and stub the endpoint out appropriately; refreshing that response time could even be automated as new data comes in every day.
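For that third-party case, one possible sketch (assuming WireMock; the endpoint, port, and the 250 ms delay standing in for the measured production response time are all made up) looks like this:

```java
import static com.github.tomakehurst.wiremock.client.WireMock.aResponse;
import static com.github.tomakehurst.wiremock.client.WireMock.get;
import static com.github.tomakehurst.wiremock.client.WireMock.urlEqualTo;

import com.github.tomakehurst.wiremock.WireMockServer;

public class ThirdPartyStub {

    public static void main(String[] args) {
        WireMockServer server = new WireMockServer(8089);
        server.start();

        // Hypothetical third-party endpoint that offers no sandbox.
        // The 250 ms delay is a placeholder for the response time observed in
        // production; it could be refreshed as new measurements come in.
        server.stubFor(get(urlEqualTo("/v1/credit-check"))
                .willReturn(aResponse()
                        .withStatus(200)
                        .withFixedDelay(250)
                        .withBody("{\"approved\": true}")));

        // Point the load test's configuration at http://localhost:8089
        // instead of the real provider.
    }
}
```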

There won't be a one-size-fits-all answer, and I suspect that as you move forward you will come across a novel solution.

OTHER TIPS

Ask yourself: what kind of performance tests do you have, and how often do you need them to be run?

  • Tests for the "daily use"? Then make them part of your unit tests.

  • Just needed in the QA cycle of every release/deployment? Keep them as part of your integration tests, assuming those tests are run for every release/deployment (there's a sketch just after this list).

  • Tests for an isolated part of your application, needed just once for optimization purposes and then never again? Then keep them out of your CI cycle.
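If you go with the integration-test option above, one hedged sketch of the placement (assuming the Maven Failsafe plugin with its defaults) is simply to name the test class so it runs during `mvn verify` rather than with the fast unit tests. The class name and scenario here are invented:

```java
import org.junit.jupiter.api.Test;

// The "IT" suffix matches Maven Failsafe's default include pattern, so this
// class runs in the integration-test phase (mvn verify) instead of with the
// fast unit tests that Surefire executes on every build.
class SearchResponseTimeIT {

    @Test
    void searchSliceStaysWithinAgreedBudget() {
        // ... exercise the search slice end to end and compare the measured
        // response time against the budget agreed for the QA cycle ...
    }
}
```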

The running time of those tests will probably be a factor. If your performance tests take less than 20 minutes, you won't have much trouble integrating them into the "nightly builds" on your CI server. If they need a week for a complete run, you obviously need a different strategy.

And to my understanding, using an agile model like Scrum means adapting your process to the software you are building, not vice versa. So no decision about "where to place your performance tests" needs to be final. If your tests are very fast at the beginning and become slower as you expand them over time, you may have to relocate them in your process, split them up, adapt them to your needs, and so on.

Licensed under: CC-BY-SA with attribution