Question

If I am already unit-testing my JavaScript locally during development, before pushing changes up to a Git repo, are there any compelling reasons to also run unit tests on a staging server before pushing the changes over to the live server?

It seems redundant.


Solution

The answer depends on how your tests are put together.

If they are pure unit tests (i.e. each test exercises a single, isolated unit of code, with all dependencies mocked out), then there is little benefit to doing this, as the execution environment for each test should be identical on your local development machine and on your staging server.
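As an illustration, here is a minimal sketch of what such a pure unit test might look like, assuming Jest as the test runner; the priceFormatter and currencyService modules are hypothetical. Because the dependency is mocked out, the test has no reason to behave differently on a development machine than on a staging server running the same Node.js version.

```javascript
// Hypothetical modules: priceFormatter depends on currencyService.
// The dependency is replaced with a mock, so no network, config, or
// environment-specific behavior is involved in this test.
jest.mock('./currencyService', () => ({
  getSymbol: jest.fn(() => '$'),
}));

const { formatPrice } = require('./priceFormatter');

test('formats a price using the mocked currency symbol', () => {
  expect(formatPrice(19.99)).toBe('$19.99');
});
```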

With proper unit tests, the only situations I can think of where you would catch issues on the staging server that were not found on your development machine are those where a different operating system or JavaScript interpreter causes a difference in behavior (and these types of issues should be quite rare). If you do find other reasons for unit tests to behave differently in the two environments (for example, as @Thilo mentions, because you have dirty code on your development machine, or because you depend on libraries that are present on your development machine but not on your staging server), that indicates something is wrong with your software development process, which you need to address so that the environment your software runs in is set up reliably.

However, if by unit tests you are actually talking about higher-level automated tests (e.g. system tests that run through the browser), a distinction some people fail to make when they (incorrectly) refer to all automated tests as unit tests, then there is likely some benefit to running these on the staging server. Development and production setups often use different technologies and/or configurations for web servers and database servers, which can lead to differences in behavior that can only be picked up by testing on your staging server.
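For contrast, here is a hedged sketch of that kind of higher-level test, assuming Playwright; the /login route, selectors, and credentials are placeholders. A test like this drives a real browser against a deployed application, so it exercises the web server, routing, and database configuration where environment differences can actually surface.

```javascript
// Browser-level system test (Playwright). Paths are relative to a
// configured baseURL, so the same test can target a local dev server
// or a staging deployment.
const { test, expect } = require('@playwright/test');

test('user can log in through the real stack', async ({ page }) => {
  await page.goto('/login');
  await page.fill('#email', 'qa@example.com');
  await page.fill('#password', 'not-a-real-password');
  await page.click('button[type="submit"]');
  await expect(page.locator('#dashboard')).toBeVisible();
});
```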

One final note: you should make sure that you do some form of high-level testing before pushing your changes live to production, as unit tests alone will not catch all of your problems. Ideally this would be a complete set of automated, system-level acceptance tests that cover all of the features of your software and exercise the whole stack in an environment that matches production. At a minimum, someone should manually run a set of tests across your key features on a staging server before your changes go live.
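To make "an environment that matches production" concrete, one common approach is to point the same automated acceptance suite at different deployments via configuration. The sketch below assumes Playwright and uses placeholder URLs; the idea, not the specific tool, is the point.

```javascript
// Hypothetical playwright.config.js: the CI job that tests the staging
// deployment sets BASE_URL=https://staging.example.com before running
// the suite; local runs fall back to a dev server.
const { defineConfig } = require('@playwright/test');

module.exports = defineConfig({
  use: {
    baseURL: process.env.BASE_URL || 'http://localhost:3000',
  },
});
```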

OTHER TIPS

Probably no benefit for just unit-testing (but if they are automated, then there is no real cost, either, and maybe they do catch something, such as inconsistent/incomplete deployments).

Also, the staging server is guaranteed not to contain any "dirty" code that your development machine might have (something you forgot to commit, some "unrelated" files, etc.).

But there are other types of (more integrated) tests that you might want to do on the staging server.

Staging servers are beneficial if you have multiple testers looking at your code. Pushing a branch gives those testers a good idea of what they are looking at, and allows each of them to make sure the code is functioning as intended and to get a look at it before it hits the masses.

Sometimes the best way to test your code is to let a bunch of people try to break it, preferably people who are on your side.

And as @Jeff Ward said, the staging server will not always mirror your machine. I have always found that the more you test, the less can go wrong.

Running all unit tests and integration tests on your staging server is a great way to ensure that the last check-in did not break anything. Of course, programmers should not commit code that has not been tested properly. However, we all sometimes forget that the code we write can affect other code, and thus forget to run the full test suite.
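As a rough illustration of the kind of integration check that is cheap to run against a freshly deployed staging server, here is a sketch using Node's built-in test runner; the /api/health endpoint and STAGING_URL variable are hypothetical stand-ins for whatever your deployment actually exposes.

```javascript
// Integration-style check with no mocks: it hits the deployed stack,
// so it can catch an incomplete or misconfigured deployment.
const test = require('node:test');
const assert = require('node:assert');

const BASE = process.env.STAGING_URL || 'http://localhost:3000';

test('health endpoint responds on the deployed stack', async () => {
  const res = await fetch(`${BASE}/api/health`);
  assert.strictEqual(res.status, 200);
});
```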

Licensed under: CC-BY-SA with attribution