Question

I am protecting against duplicate item creation with a Redis lock. (I know there are other ways to do this, but it is a simpler test case of a more complicated issue I have, and I would like to understand why redis/python-redis is failing.)

import redis

from redistest.models import Foo  # app name inferred from the table `redistest_foo`

def redisTester():
    for i in range(300):
        for j in range(300):
            lockKey = "foo_%d_%d" % (i, j)
            lock = redis.Redis().lock(lockKey, timeout=60, sleep=1)

            lock.acquire()
            try:
                bf = Foo.objects.get(a=i, b=j)
            except Foo.DoesNotExist:
                bf = Foo(a=i, b=j)
                bf.save()
            finally:
                lock.release()  # release even if get/save raises

with this model:

class Foo(models.Model):
    a = models.IntegerField(db_index=True)
    b = models.IntegerField(db_index=True)

I launch two instances of this script from the command line, and somehow some records are duplicated:

SELECT COUNT(*) , a, b FROM `redistest_foo` GROUP BY `a`, `b` HAVING COUNT(*) > 1;
+----------+-----+-----+
| COUNT(*) | a   | b   |
+----------+-----+-----+
|        2 |   2 | 184 |
|        2 |   5 |  92 |
|        2 |   8 |   3 |
|        2 |  10 | 219 |
|        2 |  13 | 127 |
|        2 |  16 |   7 |
|        2 |  18 | 196 |
|        2 |  21 |  85 |
|        2 |  23 | 288 |
...

It probably means that the lock does not work correctly, or that the ORM's save is not blocking.

What am I missing? Is this related to the way the Django ORM works?


The solution

It turns out that it is related to MySQL transaction management.

InnoDB's default transaction isolation level is REPEATABLE READ: each transaction reads from a consistent snapshot taken at its first read, so a row committed by one process is invisible to a transaction the other process has already opened. The Redis lock serializes the writes correctly, but the `get` still runs against the stale snapshot, misses the freshly committed row, and inserts a duplicate. Setting the isolation level to READ COMMITTED makes every read see the latest committed data.
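The interleaving above can be illustrated without a database. This is a toy in-memory model (plain Python, not MySQL) of the two isolation levels: under REPEATABLE READ a transaction keeps reading the snapshot pinned at its first read, so a row committed later by the other process stays invisible.

```python
# Toy illustration of why REPEATABLE READ defeats the Redis lock:
# a transaction reads from the snapshot taken at its first read, so a
# row committed by another process after that point stays invisible.

committed = set()  # rows visible to fresh reads ("the table")

class Txn:
    def __init__(self, isolation):
        self.isolation = isolation
        self.snapshot = None

    def exists(self, row):
        if self.isolation == "REPEATABLE READ":
            if self.snapshot is None:       # snapshot pinned at first read
                self.snapshot = set(committed)
            return row in self.snapshot
        return row in committed             # READ COMMITTED: always fresh

    def insert(self, row):
        committed.add(row)

# Process A opens a transaction and performs some earlier read,
# pinning its snapshot before (2, 184) exists.
a = Txn("REPEATABLE READ")
a.exists((0, 0))

# Process B, holding the Redis lock, inserts and commits the row.
b = Txn("REPEATABLE READ")
b.insert((2, 184))

# A now acquires the lock, but its stale snapshot still misses the row,
# so its get() fails and it inserts a duplicate.
print(a.exists((2, 184)))                      # False -> duplicate insert
print(Txn("READ COMMITTED").exists((2, 184)))  # True  -> duplicate avoided
```

Real InnoDB behaves analogously for consistent (non-locking) reads, which is exactly what the ORM's `get` issues.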

See also: Python's MySqlDB not getting updated row
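One way to apply the fix in a Django project is per-connection from settings.py. A minimal sketch, assuming the MySQL backend; the database name and credentials are placeholders, and only the `OPTIONS` entry matters:

```python
# settings.py -- open every MySQL connection in READ COMMITTED mode.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.mysql",
        "NAME": "redistest",   # placeholder database name
        "USER": "dbuser",      # placeholder credentials
        "PASSWORD": "dbpassword",
        "OPTIONS": {
            # Statement executed once on each new connection.
            "init_command": "SET SESSION TRANSACTION ISOLATION LEVEL READ COMMITTED",
        },
    }
}
```

On Django 2.0+ the MySQL backend also accepts `"isolation_level": "read committed"` in `OPTIONS`, which achieves the same thing without the raw SQL statement.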

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow