Question

I've recently been investigating SpecFlow and can see its value for writing client/stakeholder specs which can show a feature's progress. However, I'm nervous about introducing them to a client...

Imagine a feature:

Given I'm on the "MyAccount" page
   And I enter "Liath" into the "Name" textbox
When I click "Save"
Then my username should be "Liath"

We write these features, show the client a set of red icons and set the developers to work.

If our developer completes the happy path then the icon will go green and the client will see that the feature is complete.

However our developer knows that they've still got to consider things like:

  1. Checking the username to see if it's in use
  2. Checking the username for profanity
  3. Handling errors from the database
  4. Validating that the username uses only valid characters
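For example, each of those considerations could become its own scenario in the same feature file (the step wording below is hypothetical, just to illustrate):

```gherkin
Scenario: Username is already in use
    Given I'm on the "MyAccount" page
    And the username "Liath" is already taken
    When I enter "Liath" into the "Name" textbox
    And I click "Save"
    Then I should see the error "That username is already in use"

Scenario: Username contains invalid characters
    Given I'm on the "MyAccount" page
    When I enter "Li@th!" into the "Name" textbox
    And I click "Save"
    Then I should see the error "Usernames may only contain letters and numbers"
```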

Clearly, if a client sees a row of green lights they will be reluctant to continue paying for the rest of the development; it will look like we're trying to stretch out the work.

What is the value of these tests as a progress indicator if they indicate work is completed long before it is? What techniques would we need to use to give a more accurate example of progress?

Solution

The scenario above could be one of many around the account update story.

In order to keep my account up to date,
As a registered user
I should be able to update my account

So you would have multiple scenarios covering each of the potential outcomes for a story. A story could also be one of many stories for a feature.
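In a SpecFlow .feature file, that story narrative typically sits at the top, under the Feature keyword, with all of its scenarios beneath it. A minimal sketch (the second scenario is hypothetical, added only to show the one-story/many-scenarios shape):

```gherkin
Feature: Account update
    In order to keep my account up to date,
    As a registered user
    I should be able to update my account

Scenario: Updating my username
    Given I'm on the "MyAccount" page
    And I enter "Liath" into the "Name" textbox
    When I click "Save"
    Then my username should be "Liath"

Scenario: Rejecting a username that is already taken
    Given I'm on the "MyAccount" page
    And the username "Liath" is already taken
    When I enter "Liath" into the "Name" textbox
    And I click "Save"
    Then I should see an error message
```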

What is the value of these tests as a progress indicator if they indicate work is completed long before it is?

One of the values tests like this bring is showing how far through a feature the development is. Only once all the tests are green is the feature functionally ready to be shipped.

This is also a good point to add that the acceptance tests (ATs) will be part of a suite alongside unit tests. Following the red, green, refactor method of TDD, the AT is written first and fails. Then the unit tests for the code are written. Once the unit tests are passing, the AT will pass because the code for that scenario has been completed.
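SpecFlow bindings are written in C#, but the inner red/green loop of unit tests underneath the acceptance test is language-agnostic. As an illustration only, here is a minimal Python sketch of the kind of unit-level validation that might sit beneath the "update username" acceptance test; every name and rule here is hypothetical, not from the original question:

```python
import re

# Hypothetical stores; a real application would query a database.
PROFANITY = {"badword"}
TAKEN = {"admin", "root"}

def validate_username(name, taken=TAKEN, profanity=PROFANITY):
    """Return a list of validation errors; an empty list means the name is acceptable."""
    errors = []
    # Only letters, digits and underscores, 3-20 characters (an assumed rule).
    if not re.fullmatch(r"[A-Za-z0-9_]{3,20}", name or ""):
        errors.append("invalid characters or length")
    if name in taken:
        errors.append("already in use")
    if name and name.lower() in profanity:
        errors.append("profanity")
    return errors

# Unit tests like these go green one by one before the acceptance test passes.
assert validate_username("Liath") == []
assert "already in use" in validate_username("admin")
assert "invalid characters or length" in validate_username("Li@th!")
```

Each failing unit test here corresponds to one of the edge-case scenarios the developer knows about; the AT only turns green once the code behind all of them is in place.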

What techniques would we need to use to give a more accurate example of progress?

Out of the box, SpecFlow has the @ignore tag which you can put on your scenarios, e.g.

@ignore
Scenario: Updating my username
    Given I'm on the "MyAccount" page
    And I enter "Liath" into the "Name" textbox
    When I click "Save"
    Then my username should be "Liath"

This is useful because it gives the developer the ability to have all the stories and scenarios written but not executing. When you show the tests to the client they will see a mix of green and yellow, the yellow being the scenarios that have not been started or completed.

I hope that covers it.

Rich

OTHER TIPS

You have to come to an agreement on what "done" means for everything you deliver. You can give them a signed letter of agreement, a light, or a smoke signal, and if they think it means something else, you have a problem. Being concerned with client misconceptions is universal in this business, so the only way to get around it is to communicate and educate constantly.

You can deliver all the green lights you want, but when your client looks at the app, they'll either like it or they won't. Hopefully, they'll tell you what they want to change, and even better if they pay you for it instead of blaming you for misunderstanding them.

Many developers like showing clients UI sketches instead of working prototypes because they fear the client will think you're almost done and will be reluctant to approve additional billable hours.

Licensed under: CC-BY-SA with attribution