Question

I am using "manage.py test" along with a JSON fixture I created using 'dumpdata'.

My problem is that several of the tables in the fixture are very large (for example, one contains the names of all cities in the US), which makes running a test incredibly slow.

Seeing as several of these tables are never modified by the program (e.g., the city names will never change), it doesn't make much sense to create and tear down these tables on every test run.

Is there a better way to be testing this code using this kind of data?
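For what it's worth, the setup looks roughly like this (the class and fixture names are illustrative, not my actual code):

from django.test import TestCase

class CitySearchTest(TestCase):
    # Django reloads this entire fixture (every US city) for the tests,
    # which is where most of the time goes.
    fixtures = ['cities.json']

    def test_search(self):
        pass  # the actual assertions are not the slow part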


Solution

This was my solution:

from django.test import TestCase

class xxxx(TestCase):
    def setUp(self):
        # Connect directly to the test database and replay a dump of
        # INSERT statements before each test.
        import _mysql
        db = _mysql.connect('xxxx', 'xxxx', 'xxxx', "test_xxxxxxx")
        db.query(open('sql/xxxxxx.sql').read())

The SQL file was a sequence of INSERT statements that I exported using phpMyAdmin. Executing raw SQL statements is much faster than importing a JSON or YAML fixture. This is surely not the most elegant solution, but it worked.
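If you would rather not depend on the _mysql module directly, the same idea can be sketched with Django's own database connection on a reasonably recent Django; the file path and the naive split on ';' are assumptions that suit a dump of plain INSERT statements:

from django.db import connection
from django.test import TestCase

class xxxx(TestCase):
    def setUp(self):
        # Replay the dump through Django's configured test database so the
        # code is not tied to the MySQL driver. Assumes one statement per ';'.
        with open('sql/xxxxxx.sql') as f:
            statements = f.read().split(';')
        with connection.cursor() as cursor:
            for statement in statements:
                if statement.strip():
                    cursor.execute(statement)

Splitting on ';' would break on semicolons inside string literals, but it is good enough for a simple export of INSERT rows.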

According to the third answer in Loading SQL dump before running Django tests, you just need to drop this SQL file into the 'sql' directory inside the app directory. This worked for my production database when running 'manage.py syncdb', but for some reason the data was never actually imported into the test database by 'manage.py test', even though the line 'Installing custom SQL for xxxx.xxxx model' appeared in the output. So I wrote my own code inside setUp().

OTHER TIPS

You should check out the nose framework. It looks like it gives you more control over when test fixtures are loaded and torn down:

"nose supports fixtures at the package, module, class, and test case level, so expensive initialization can be done as infrequently as possible. See Fixtures for more."

Furthermore, it looks like there are Django plugins for nose; a quick search turns up several (django-nose, for example).

Hope this helps.

Licensed under: CC-BY-SA with attribution