Question

This is a use case in member enrollment via a web application/web service. We have a complex algorithm for checking whether a member is a duplicate, which looks at multiple tables such as phone and address. The algorithm varies based on the member's country, so this restriction cannot be implemented with a primary key or unique key constraint.

So the checks live in Java code. But if two duplicate requests arrive concurrently, both Java threads see that the member doesn't exist and both insert the record, resulting in duplicates. How can I prevent such duplicate inserts?

I can prevent conflicting updates by using row-level locks or Hibernate's optimistic concurrency. Table-level locks could prevent such inserts, but that limits application performance because it also blocks updates. Another option I can think of is to create a lock table with a record having id='memberInsert', and force every insert, via JDBC, to first obtain a row-level lock on this record.
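In JDBC terms, the lock-table idea might look roughly like the sketch below (LOCK_TABLE, MEMBER and the column names are made up, and the real country-specific checks would replace isDuplicate):

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    public class MemberInsertDao {

        /** Inserts a member unless the duplicate check finds an existing match. */
        public void insertIfNotDuplicate(Connection con, long id, String name, String phone)
                throws SQLException {
            con.setAutoCommit(false);
            try {
                // Serialize all member inserts by locking the single marker row.
                // A concurrent transaction blocks here until this one commits or rolls back.
                try (PreparedStatement lock = con.prepareStatement(
                        "SELECT ID FROM LOCK_TABLE WHERE ID = ? FOR UPDATE")) {
                    lock.setString(1, "memberInsert");
                    try (ResultSet rs = lock.executeQuery()) {
                        rs.next(); // fetch the row so the lock is definitely held
                    }
                }

                // With the lock held, the country-specific duplicate check is race-free.
                if (isDuplicate(con, phone)) {
                    con.rollback();
                    return;
                }

                try (PreparedStatement insert = con.prepareStatement(
                        "INSERT INTO MEMBER (ID, NAME, PHONE) VALUES (?, ?, ?)")) {
                    insert.setLong(1, id);
                    insert.setString(2, name);
                    insert.setString(3, phone);
                    insert.executeUpdate();
                }
                con.commit();
            } catch (SQLException e) {
                con.rollback();
                throw e;
            }
        }

        private boolean isDuplicate(Connection con, String phone) throws SQLException {
            // The existing country-specific checks against phone, address, etc. go here.
            return false;
        }
    }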

Thanks Suneel


Solution

If this check is going to live anywhere, I'd expect it to be in a database write trigger, not in the Java code. Otherwise some other application, or some other area of this application, could do something badly and bypass the check.

Offloading this to the database gives you two advantages: 1) it prevents the race condition you mention, and 2) it protects the integrity of the data by not allowing an errant application to modify records and put them into an illegal state.

OTHER TIPS

Can't you hash the outcome of the algorithm or something and simply use that as a unique primary key?
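For example, if the country-specific rules can be reduced to a deterministic key, something along these lines could populate a column that carries a UNIQUE constraint (a rough sketch; the chosen fields and the normalization are assumptions):

    import java.nio.charset.StandardCharsets;
    import java.security.MessageDigest;
    import java.security.NoSuchAlgorithmException;
    import java.util.Locale;

    public final class DedupKey {

        /**
         * Builds a deterministic key from the fields the duplicate-check algorithm
         * considers for the member's country. Stored in a column with a UNIQUE
         * constraint, two concurrent inserts of the "same" member make the second
         * insert fail at the database instead of creating a duplicate row.
         */
        public static String of(String country, String phone, String address) {
            String normalized = String.join("|",
                    normalize(country),
                    normalize(phone),
                    normalize(address));
            return sha256Hex(normalized);
        }

        private static String normalize(String value) {
            // Trim, lower-case and strip whitespace so cosmetic differences
            // do not produce different hashes.
            return value == null ? "" : value.trim()
                    .toLowerCase(Locale.ROOT)
                    .replaceAll("\\s+", "");
        }

        private static String sha256Hex(String input) {
            try {
                MessageDigest md = MessageDigest.getInstance("SHA-256");
                byte[] digest = md.digest(input.getBytes(StandardCharsets.UTF_8));
                StringBuilder hex = new StringBuilder(digest.length * 2);
                for (byte b : digest) {
                    hex.append(String.format("%02x", b));
                }
                return hex.toString();
            } catch (NoSuchAlgorithmException e) {
                throw new IllegalStateException("SHA-256 not available", e);
            }
        }
    }

This only works if the duplicate rule amounts to an equality test on data you can derive at insert time; fuzzy matching won't reduce to a single hash.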

As long as the database is not aware of your uniqueness requirements, it cannot help you, and then you probably have no choice other than table-level locking.
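In JDBC that could be as blunt as the sketch below (Oracle/PostgreSQL-style syntax; it serializes every write to the MEMBER table, which is exactly the performance cost the question is worried about):

    import java.sql.Connection;
    import java.sql.SQLException;
    import java.sql.Statement;

    public final class MemberTableLock {

        /**
         * Takes an exclusive lock on the MEMBER table for the current transaction,
         * so the duplicate check plus insert can run without a race. The lock is
         * released on commit or rollback, and it blocks every other writer until then.
         */
        public static void lockMemberTable(Connection con) throws SQLException {
            con.setAutoCommit(false);
            try (Statement stmt = con.createStatement()) {
                stmt.execute("LOCK TABLE MEMBER IN EXCLUSIVE MODE");
            }
        }
    }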

Licensed under: CC-BY-SA with attribution