Question

As far as I know, testing should ideally be done as extensively and thoroughly as possible; however, sometimes (always) we are required to ship quickly.

In the case of a delayed or over-deadline project, is there a best practice for how much testing is enough, or is it subjective? Obviously doing no testing is planning to fail, but how little is too little?

Edit: this project is for internal use with a certain device model.

Solution

For any piece of work you undertake, there are three potential constraints on it: time, features and quality. It is impossible to fix all three; at best you can choose two.

In your case, you have a time constraint imposed on you, so a choice needs to be made between features and quality. From your question and your reply to Telastyn's comment, it sounds like you also have a feature constraint imposed on you. In which case, quality has to suffer. So to answer your question, "Is there such a thing as minimum testing required?": yes, that minimum is the amount of testing that can be achieved in the time available. Those setting the constraints have chosen, even if they were unaware of it, to ship with bugs.

This is not a good place to be, though. For the sake of your customers, quality should be prioritised over features if there's a hard deadline. So the way to ensure that adequate testing occurs is to test early and often. Break the work down into feature-sized chunks, then implement each feature and have it properly tested before starting on the next. That way, when you hit your deadline, you have a higher-quality product ready to ship.

You can of course go one stage further: use test-driven development (TDD). Start testing before you even write the code by writing a failing unit test, then write code to make that test pass without breaking any other existing tests. You still need testers to conduct exploratory testing early and often too, as unit tests will not catch all bugs, but they will catch many of them. That way, quality is pushed to the fore. The question of what's the minimum level of testing then becomes moot: you are always doing far more than some arbitrary minimum.
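To make that concrete, here is a minimal sketch of the TDD red-green cycle using Python's built-in unittest module. The validate_device_id function and its rules are hypothetical, invented purely to illustrate writing the failing test first and then just enough code to make it pass.

```python
import unittest


def validate_device_id(device_id: str) -> bool:
    """Step 2 (green): the simplest implementation that makes the tests pass."""
    return device_id.startswith("DEV-") and device_id[4:].isdigit()


class TestValidateDeviceId(unittest.TestCase):
    # Step 1 (red): these tests are written first and fail
    # until validate_device_id above is implemented.
    def test_accepts_well_formed_id(self):
        self.assertTrue(validate_device_id("DEV-1234"))

    def test_rejects_malformed_id(self):
        self.assertFalse(validate_device_id("1234"))


if __name__ == "__main__":
    unittest.main()
```

Each new feature starts with a failing test like the ones above, so the test suite grows alongside the code and the regression safety net is already in place when the deadline arrives.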

OTHER TIPS

In the case of a delayed or over-deadline project, is there a best practice for how much testing is enough?

Let's say, for the sake of argument, that the lower limit you're talking about exists. How would you quantify it? Is it that everybody knows you must test for at least 6 hours, or that there should be at least two tests for every hundred lines of code?

Obviously, those measures don't make any sense because they're arbitrary. A decision about how much or how little testing is acceptable depends on a number of factors including:

  • the nature of the product being tested
  • the potential cost of shipping the product with bugs
  • the cost of delaying shipping to allow for more testing
  • the level of confidence that recent changes haven't introduced serious bugs

Those factors are different for each product, so it's impossible to say how much testing should be done without knowing about the product in question.

Licensed under: CC-BY-SA with attribution