Question

I want to insert a lot of rows into a table that generates its keys automatically. JDBC doesn't guarantee that retrieving the generated keys will work for a batch update:

It is implementation-defined as to whether getGeneratedKeys will return generated values after invoking the executeBatch method. (JDBC 4.0 Specification 2006, p. 135).
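For concreteness, the batch variant I have in mind looks roughly like this. It is only a sketch: the items table, its name column, and the key type are made up for illustration, and whether the driver actually fills the key ResultSet after executeBatch() is exactly the open question.

```java
import java.sql.*;
import java.util.ArrayList;
import java.util.List;

public class BatchInsert {
    // Batch-inserts all names, then tries to read the generated keys.
    // The keys ResultSet may be empty on drivers that don't support
    // generated keys for batch updates.
    static List<Long> insertBatch(Connection con, List<String> names) throws SQLException {
        List<Long> keys = new ArrayList<>();
        String sql = "INSERT INTO items (name) VALUES (?)";
        try (PreparedStatement ps = con.prepareStatement(sql, Statement.RETURN_GENERATED_KEYS)) {
            for (String name : names) {
                ps.setString(1, name);
                ps.addBatch();
            }
            ps.executeBatch();
            try (ResultSet rs = ps.getGeneratedKeys()) {
                while (rs.next()) {
                    keys.add(rs.getLong(1));
                }
            }
        }
        return keys;
    }
}
```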

I have two choices: insert every row separately and get the key each time, or do a batch insert and then query for the keys afterwards.
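The first option, one statement per row, could look like this (a sketch against the same hypothetical items(name) table, sitting next to the insertBatch method above):

```java
// Inserts one row at a time and reads each generated key immediately.
// One round trip per row, but getGeneratedKeys() after executeUpdate()
// is widely supported.
static List<Long> insertOneByOne(Connection con, List<String> names) throws SQLException {
    List<Long> keys = new ArrayList<>();
    String sql = "INSERT INTO items (name) VALUES (?)";
    try (PreparedStatement ps = con.prepareStatement(sql, Statement.RETURN_GENERATED_KEYS)) {
        for (String name : names) {
            ps.setString(1, name);
            ps.executeUpdate();
            try (ResultSet rs = ps.getGeneratedKeys()) {
                if (rs.next()) {
                    keys.add(rs.getLong(1));
                }
            }
        }
    }
    return keys;
}
```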

To avoid round trips when querying multiple keys, I could use this approach here: have e.g. 5 prepared multi-row statements of different sizes and break my "batch query" into the smallest possible set of those statements.
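A rough sketch of that idea, assuming the database accepts multi-row INSERT syntax; the size ladder (50/10/5/1) is arbitrary, and in practice the prepared statements would be cached rather than re-prepared per chunk:

```java
// Illustrative size ladder; real sizes would be tuned and the statements cached.
static final int[] SIZES = {50, 10, 5, 1};

// Builds "INSERT INTO items (name) VALUES (?), (?), ..." with `rows` value groups.
static String multiRowInsert(int rows) {
    StringBuilder sb = new StringBuilder("INSERT INTO items (name) VALUES (?)");
    for (int i = 1; i < rows; i++) {
        sb.append(", (?)");
    }
    return sb.toString();
}

// Splits the input greedily into the largest chunks the ladder allows.
static void insertChunked(Connection con, List<String> names) throws SQLException {
    int pos = 0;
    while (pos < names.size()) {
        int remaining = names.size() - pos;
        int size = 1;
        for (int s : SIZES) {
            if (s <= remaining) { size = s; break; }
        }
        try (PreparedStatement ps =
                 con.prepareStatement(multiRowInsert(size), Statement.RETURN_GENERATED_KEYS)) {
            for (int i = 0; i < size; i++) {
                ps.setString(i + 1, names.get(pos + i));
            }
            ps.executeUpdate();
            // read ps.getGeneratedKeys() here if the driver returns keys for multi-row inserts
        }
        pos += size;
    }
}
```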

Which method is faster?


Solution

It depends.

I have no numbers on whether executeBatch() is actually faster than executing a PreparedStatement repeatedly, so my suggestion is this:

  1. Try whether getGeneratedKeys() works for your JDBC driver. If it doesn't work, batched inserts won't work for you anyway.

  2. Write a test case for both and measure the performance. That gives you real data to base your decision on. Everything else is just guesswork (and wrong 90% of the time).
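For step 2, a minimal timing harness could be as simple as the sketch below. The two insert methods are the hypothetical sketches from the question; warm-up, transactions, and repeated runs are omitted for brevity.

```java
// Very rough timing comparison; run it several times and on realistic data
// before drawing any conclusions.
static void compare(Connection con, List<String> names) throws SQLException {
    long t0 = System.nanoTime();
    insertOneByOne(con, names);
    long t1 = System.nanoTime();
    insertBatch(con, names);
    long t2 = System.nanoTime();
    System.out.printf("one-by-one: %d ms, batch: %d ms%n",
            (t1 - t0) / 1_000_000, (t2 - t1) / 1_000_000);
}
```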

[EDIT] If you want to keep your app database-agnostic, add the test to your app and run it on a temp/test table. If the test succeeds, choose an implementation that uses batched updates; otherwise fall back to brute force.
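A capability check along those lines might look like this (a sketch only: key_probe is a hypothetical scratch table with an auto-generated id and a single val column, and any cleanup is left out):

```java
// Returns true if the driver hands back generated keys after executeBatch().
static boolean supportsBatchGeneratedKeys(Connection con) {
    String sql = "INSERT INTO key_probe (val) VALUES (?)";
    try (PreparedStatement ps = con.prepareStatement(sql, Statement.RETURN_GENERATED_KEYS)) {
        ps.setString(1, "a");
        ps.addBatch();
        ps.setString(1, "b");
        ps.addBatch();
        ps.executeBatch();
        try (ResultSet rs = ps.getGeneratedKeys()) {
            int count = 0;
            while (rs.next()) {
                count++;
            }
            return count == 2; // both keys came back
        }
    } catch (SQLException e) {
        return false; // treat any failure as "not supported"
    }
}
```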

Licensed under: CC-BY-SA with attribution