Question

As the time needed to run the complete PHPUnit suite grows, our team has started wondering whether it is possible to run unit tests in parallel. I recently read an article about Paraunit, and Sebastian Bergmann wrote that he will add parallelism to PHPUnit 3.7.

But the problem remains with integration tests, or, more generally, tests that interact with the DB. For the sake of consistency, the test DB has to be reset and fixtures loaded after each test. But in parallel runs there is a problem with race conditions, because all processes use the same DB.

So to be able to run integration tests in parallel, we have to assign each process its own database. I would like to ask whether anyone has thoughts on how this problem can be solved. Maybe there are already implemented solutions to this problem in another xUnit implementation.

My team uses MongoDB, so one solution would be to programmatically create a config file for each PHPUnit process, with a generated DB name (for that process), and in the setUp() method we could clone the main test DB into this temporary one. But before we start implementing this approach, I would like to ask for your ideas on the topic.
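A minimal sketch of the config-generation step described above (the file names, the MONGO_TEST_DB variable, and the naming scheme are all illustrative assumptions, not existing tooling):

```shell
# Write a PHPUnit config template with a placeholder for the DB name.
cat > phpunit.template.xml <<'EOF'
<phpunit bootstrap="tests/bootstrap.php">
    <php>
        <env name="MONGO_TEST_DB" value="@DB_NAME@"/>
    </php>
</phpunit>
EOF

# Generate one config per worker; setUp() would read MONGO_TEST_DB and
# clone the main test DB into that name before each test.
WORKERS=4
for i in $(seq 1 "$WORKERS"); do
    sed "s/@DB_NAME@/testdb_$$_$i/" phpunit.template.xml > "phpunit.worker$i.xml"
done
ls phpunit.worker*.xml
```

Each worker is then started with its own config, e.g. `phpunit -c phpunit.worker1.xml`.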


Solution

This is a good question: preparing for parallel unit tests is going to require learning some new Best Practices, and I suspect some of them are going to slow our tests down.

At the highest level, the advice is: avoid testing with a database wherever possible. Abstract all interactions with your database, and then mock that class. But you've already noted your question is about integration tests, where this is not possible.

When using PDO, I generally use sqlite::memory:. Each test gets its own database; it is anonymous and automatically cleaned up when the test ends. (But I noted some problems with this when your real application is not using SQLite: see "Suggestions to avoid DB deps when using an in-memory sqlite DB to speed up unit tests".)
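A quick illustration of the idea with the sqlite3 CLI (the PHP equivalent is `new PDO('sqlite::memory:')`): a `:memory:` database is private to its connection and vanishes when the connection closes, so parallel tests can never see each other's data.

```shell
# Each invocation gets a fresh, private in-memory database;
# nothing persists between runs and nothing needs cleaning up.
ROWS=$(sqlite3 :memory: 'CREATE TABLE t(id INTEGER); INSERT INTO t VALUES (1); SELECT COUNT(*) FROM t;')
echo "$ROWS"
```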

When using a database that does not have an in-memory option, create the database with a random name. If the parallelization is at the PHPUnit process level (quite coarse), you could use the process pid, but that has no real advantage over a random name. (I know PHP is single-threaded, but perhaps in the future we will have a custom PHPUnit module that uses threads to run tests in parallel; we might as well be ready for that.)
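Both naming schemes can be sketched in a couple of lines of shell (the `test_` prefix is arbitrary, and the MySQL commands in the comments are illustrative):

```shell
# pid-based name: unique per process, fine for process-level parallelism.
PID_NAME="test_$$"

# random name: unique even across threads or recycled pids.
RAND_NAME="test_$(head -c 4 /dev/urandom | od -An -tx1 | tr -d ' \n')"
echo "$PID_NAME $RAND_NAME"

# A MySQL-backed runner would then wrap each worker with something like:
#   mysql -e "CREATE DATABASE $RAND_NAME"   # before the tests
#   mysql -e "DROP DATABASE $RAND_NAME"     # after the tests
```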

If you have the xUnit Test Patterns book, chapter 13 is about testing databases (relatively short). Chapters 8 and 9 on transient vs. persistent fixtures are useful too. And, of course, most of the book is on abstraction layers to make mocking easier :-)

OTHER TIPS

There is also this awesome library, fastest, for executing tests in parallel. It is optimized for functional/integration tests, providing an easy way to work with N databases in parallel.

Our old suite ran in 30 minutes; it now runs in 7 minutes with 4 processors.

Features

  • Functional tests can use one database per processor, selected via an environment variable.
  • Tests are randomized by default.
  • It is not coupled to PHPUnit; you can run any command.
  • It is written in PHP with no dependencies.
  • As input you can use a phpunit.xml.dist file or a pipe.
  • It includes a Behat extension to easily pipe scenarios into fastest.
  • Increase verbosity with the -v option.

Usage

find tests/ -name "*Test.php" | ./bin/fastest "bin/phpunit -c app {};"
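The "database per processor" feature works through the ENV_TEST_CHANNEL environment variable that fastest exports to each worker (a number from 1 to N); a sketch of deriving a per-worker database name from it (the `myapp_test_` prefix is an illustrative assumption):

```shell
# Normally fastest sets ENV_TEST_CHANNEL itself; it is hard-coded here
# only so the sketch runs standalone.
ENV_TEST_CHANNEL=2
DB_NAME="myapp_test_${ENV_TEST_CHANNEL}"
echo "$DB_NAME"
```

Your test bootstrap reads the same variable, so worker 1 connects to myapp_test_1, worker 2 to myapp_test_2, and so on, with no shared state between them.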


You can avoid integration test conflicts in two ways:

  • running only those tests in parallel which use different tables of your database, so they don't conflict
  • creating a new database for the conflicting tests

Of course, you can combine these two solutions. I don't know of any PHPUnit test runner which supports either of these approaches, so I think you'd have to write your own test runner to speed up the process... By the way, you can still group your integration tests and run only a few of them at once during development...
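The first approach can be sketched with plain shell job control; here phpunit is stubbed with a function so the example runs anywhere (the group names and the MONGO_TEST_DB variable are illustrative assumptions):

```shell
# Stub standing in for the real phpunit binary.
phpunit() { echo "ran $2 against $MONGO_TEST_DB"; }

# Two groups that touch disjoint tables run concurrently, each against
# its own database, so they cannot conflict.
MONGO_TEST_DB=testdb_orders  phpunit --group orders  > orders.log  &
MONGO_TEST_DB=testdb_billing phpunit --group billing > billing.log &
wait    # block until both background runs finish
cat orders.log billing.log
```

`--group` is standard PHPUnit; with a real binary you would also check the two exit codes rather than just waiting.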

Be aware that the same conflicts can cause concurrency issues under heavy load in PHP. For example, if you lock two files in reverse order in two separate controller actions, your application can end up in a deadlock... I am seeking a way to test concurrency issues in PHP, but no luck so far. I don't currently have time to write my own solution, and I am not sure I could manage it; it's pretty hard stuff... :S

In case your application is coupled to a specific vendor, e.g. PostgreSQL, you can create separate stacks with Docker and docker-compose. Then group tests together by purpose, e.g. model tests, controller tests, etc.

For each group, deploy a specific stack in your pipeline using docker-compose and run the tests via Docker. The idea is to have separate environments with separate databases, hence avoiding the conflict.
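A minimal docker-compose sketch of one such stack (service names, image tags, and credentials are illustrative assumptions, not taken from any real project):

```yaml
# docker-compose.test.yml - one disposable PostgreSQL per stack. Running
# "docker-compose -p model_tests up" and "docker-compose -p controller_tests up"
# gives each test group its own isolated database.
version: "3"
services:
  db:
    image: postgres:13
    environment:
      POSTGRES_PASSWORD: test
      POSTGRES_DB: app_test
  app:
    build: .
    depends_on:
      - db
    environment:
      DATABASE_URL: "postgresql://postgres:test@db:5432/app_test"
    command: vendor/bin/phpunit
```

The `-p` project flag is what keeps the stacks separate: each project name gets its own network, containers, and volumes, so the groups cannot touch each other's data.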

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow