Question

This test would call a web service over and over for about a minute to confirm that a particular response code is returned, notifying us that our rate limit has been reached. Not only is it a very slow test, it also confirms very little and wastes resources.

I feel that this is like confirming that chugging a bottle of vodka will kill a human. Do we really need a test for it?


Solution

Part of why we test is to verify our assumptions that the code is working as designed. If you don't care about the results that this test is returning, it's not useful and you should throw it away.

Conversely, if you do care, and the results of the test help you to develop the best software you can, keep it.

The point of testing is to help deliver high quality code. If a particular test activity gets you closer to that goal, do it. If it doesn't, don't.

OTHER TIPS

If it's not increasing your confidence in some aspect of the system, drop it. (Also, +1 for "confirming that chugging a bottle of vodka will kill a human.")

In this situation (assuming a third-party web service), I don't write the test that actually gets the service to return a rate-limited response. Instead, I have a test that verifies the expected behavior when receiving the response.

Save a rate-limited response from the service, and pass that response back rather than calling the actual web service. Not only does the test run much faster, it can also run offline, and will probably lead to a better code design.

In Java/Mockito-ish pseudo code it would look something like:

// given a mocked client...
ServiceClient client = mock(ServiceClient.class);
// ...pretend we are rate limited
when(client.callService()).thenReturn(RATE_LIMITED_RESPONSE);
// exercise the code under test
Response response = getResponse(client);
// verify the decision based on the rate-limited response is correct
assertEquals(Status.RATE_LIMITED, response.getStatus());

where getResponse() calls client.callService() and subsequently handles all of the logic based on the content of the response.
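To make that shape concrete, here is a minimal self-contained sketch of what sits behind that test. Every name here (ServiceClient, Response, Status, ServiceCaller, the backOffRequested flag) is a hypothetical stand-in for your own types, not a real API:

```java
// Hypothetical stand-ins for the real client and response types.
interface ServiceClient {
    Response callService();
}

enum Status { OK, RATE_LIMITED }

class Response {
    private final Status status;
    Response(Status status) { this.status = status; }
    Status getStatus() { return status; }
}

class ServiceCaller {
    // Visible effect of the rate-limited branch, so a test can observe it.
    boolean backOffRequested = false;

    // getResponse() owns all logic driven by the response content,
    // so a test can exercise it with a fake or mocked client.
    Response getResponse(ServiceClient client) {
        Response response = client.callService();
        if (response.getStatus() == Status.RATE_LIMITED) {
            // stand-in for real handling: back off, log, queue a retry
            backOffRequested = true;
        }
        return response;
    }
}
```

Because ServiceClient has a single abstract method, a test can supply a fake with a lambda instead of a mocking library: `caller.getResponse(() -> new Response(Status.RATE_LIMITED))`.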

If you've already got a test like this, then the integration test is really only testing that the web service returns a rate-limited response in a format that you expect.

My approach would be to make the rate limit configurable on the fly through some mechanism. Your integration test can then exercise both behaviors by lowering the rate substantially and then exceeding it.

Configurable rate limits may or may not prove to be independently useful. But I think it is important to be able to sanely test that the safeguards you are depending on are actually there.
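As a sketch of what that would look like, here is a toy fixed-window limiter with an on-the-fly configurable limit. The class and the setRateLimit hook are assumptions made for illustration, not a real library; the point is the test shape at the bottom:

```java
// Hypothetical rate limiter with a limit that can be changed on the fly,
// as the configurable-limit suggestion requires.
class RateLimiter {
    private int limit;
    private int used = 0;

    RateLimiter(int limit) { this.limit = limit; }

    // The on-the-fly configuration hook the integration test relies on.
    synchronized void setRateLimit(int limit) {
        this.limit = limit;
        this.used = 0;
    }

    // Returns true while under the limit, false once it would be exceeded.
    synchronized boolean tryAcquire() {
        if (used < limit) {
            used++;
            return true;
        }
        return false;
    }
}
```

The integration test then lowers the limit to something tiny, makes limit + 1 calls, and asserts that the last one is rejected: the safeguard is proven to exist without a minute of hammering the production limit.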

Licensed under: CC-BY-SA with attribution