Question

I'm still fairly new to Hibernate. I am uploading an SQL script and auditing each statement into a database, so every statement will be saved as a string in the database. However, a file could contain 50,000+ statements. I've been reading up on Hibernate batching, but I'm wondering what the best way to design and implement this would be.

So far, the file uploads fine: I create a List from the statements in the script, then save each object through Hibernate individually. Obviously not great for performance!

I am wondering if I should still build one gigantic List of 50,000+ objects from the script on the controller side and pass it on to the DAO, or whether I should parse the file, say, 100 rows at a time, create a List of 100 objects, pass each list through service -> DAO, and repeat until end of file.
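The chunked approach you describe is the usual way to do this: walk the statements in fixed-size chunks and, inside the DAO, `flush()` and `clear()` the session after each chunk so the persistence context never holds 50,000 entities (pairing this with the real Hibernate setting `hibernate.jdbc.batch_size` enables JDBC batching). A minimal sketch, where `saveInChunks`, the `saver` callback, and `AuditEntry` are illustrative names, not from any framework:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

public class StatementBatcher {

    // Walk the full statement list in fixed-size chunks and hand each chunk
    // to a saver callback (e.g. a DAO method). Returns the number of chunks.
    public static int saveInChunks(List<String> statements, int chunkSize,
                                   Consumer<List<String>> saver) {
        int chunks = 0;
        for (int i = 0; i < statements.size(); i += chunkSize) {
            int end = Math.min(i + chunkSize, statements.size());
            saver.accept(statements.subList(i, end));
            chunks++;
        }
        return chunks;
    }

    public static void main(String[] args) {
        List<String> statements = new ArrayList<>();
        for (int i = 0; i < 250; i++) {
            statements.add("INSERT INTO audit_table VALUES (" + i + ")");
        }
        int chunks = saveInChunks(statements, 100, batch -> {
            // In a Hibernate DAO, each chunk would be persisted like this
            // (sketch, assuming an AuditEntry entity wrapping one statement):
            //   for (String sql : batch) { session.persist(new AuditEntry(sql)); }
            //   session.flush();   // push the batch of inserts to the database
            //   session.clear();   // detach entities so the session stays small
        });
        System.out.println(chunks); // 250 statements / 100 per chunk -> 3
    }
}
```

This keeps memory flat regardless of file size, which matters more here than whether the controller ever sees the full list.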

How would experts handle this design?

Thanks!


Solution

Take a look at spring-batch: a job composed of two steps (file upload, then data read/write) will solve your problem.
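The second step would typically be a chunk-oriented step: the reader emits one statement per item, and the writer persists each chunk of 100 in a single transaction. A rough sketch using the Spring Batch 4.x builder API, where the job/step/bean names and the `hibernateWriter` bean are illustrative assumptions:

```java
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableBatchProcessing
public class SqlAuditJobConfig {

    @Bean
    public Job auditJob(JobBuilderFactory jobs, StepBuilderFactory steps,
                        ItemReader<String> statementReader,
                        ItemWriter<String> hibernateWriter) {
        Step auditStep = steps.get("auditStep")
                .<String, String>chunk(100)    // read/write 100 statements per transaction
                .reader(statementReader)       // e.g. a FlatFileItemReader over the script
                .writer(hibernateWriter)       // persists each chunk via Hibernate
                .build();
        return jobs.get("auditJob").start(auditStep).build();
    }
}
```

Restart and retry semantics come for free with this setup, which is useful if a 50,000-statement import fails halfway through.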

Licensed under: CC-BY-SA with attribution