Question

I'm using JDBC to get a large amount of data. The call completes successfully, but when resultSet.next() is called, I get the following error:

java.lang.OutOfMemoryError: allocLargeObjectOrArray - Object size: 15414016, Num elements: 7706998

I've attempted to increase the JVM memory size, but this does not fix the problem. I'm not sure this problem can even be addressed, because I'm not using JDBC to access a database; rather, the system is accessing a BEA AquaLogic service through JDBC.

Has anyone run into this error?


Solution

Be aware that until the first resultSet.next() call, the results may not actually have been read from the database yet, or may still be sitting in some intermediate caching structure.

You should try to limit your SELECT to return a sane number of results, and, if you need all the data, repeat the call until there are no results left (a sketch of that approach follows below).
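
For example, here is a minimal sketch of chunked (keyset) fetching. It assumes a plain SQL backend with a hypothetical events table and a numeric id key; the LIMIT syntax is dialect-specific, the connection URL is a placeholder, and the AquaLogic layer may offer its own paging mechanism instead:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    public class ChunkedFetch {
        private static final int CHUNK = 10_000;

        public static void main(String[] args) throws SQLException {
            long lastId = 0; // resume point for keyset paging
            try (Connection con = DriverManager.getConnection("jdbc:your-url-here")) {
                boolean more = true;
                while (more) {
                    more = false;
                    // "events", "id" and "payload" are invented names; LIMIT is dialect-specific.
                    try (PreparedStatement ps = con.prepareStatement(
                            "SELECT id, payload FROM events WHERE id > ? ORDER BY id LIMIT " + CHUNK)) {
                        ps.setLong(1, lastId);
                        try (ResultSet rs = ps.executeQuery()) {
                            while (rs.next()) { // at most CHUNK rows are in memory at once
                                more = true;
                                lastId = rs.getLong("id");
                                process(rs.getString("payload"));
                            }
                        }
                    }
                }
            }
        }

        private static void process(String payload) {
            // application logic for one row goes here
        }
    }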

Increasing the JVM memory size won't help unless you can be sure that there is an absolute limit on the amount of data which will be returned by your JDBC call.

Furthermore, accessing any service through JDBC essentially boils down to using JDBC :)

Another (unlikely) possibility could be that there is a bug in the JDBC driver you're using. Try a different implementation if possible and check whether the problem persists.

OTHER TIPS

First, figure out whether you really need that much data in memory at once. RDBMSs are good at aggregating/sorting/filtering large data sets, and you should take advantage of that where possible (see the sketch after the next paragraph).

If not (and you really, really do need that much data in working memory for some reason), and bumping up the JVM's memory arguments doesn't raise the bar enough, look into an in-memory distributed caching solution such as Coherence (commercial) or Terracotta (open source).
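
To make the first point concrete, here is a minimal sketch of pushing aggregation into SQL rather than summing rows in Java; the table and column names (orders, customer_id, amount) are invented for illustration:

    import java.sql.Connection;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class AggregateInDb {
        // Returns one row per customer instead of one row per order,
        // so the JVM never has to hold the raw data set in memory.
        static void printTotals(Connection con) throws SQLException {
            String sql = "SELECT customer_id, SUM(amount) AS total "
                       + "FROM orders GROUP BY customer_id";
            try (Statement st = con.createStatement();
                 ResultSet rs = st.executeQuery(sql)) {
                while (rs.next()) {
                    System.out.println(rs.getLong("customer_id")
                            + " -> " + rs.getDouble("total"));
                }
            }
        }
    }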

You can try calling the setFetchSize(int rows) method on your statement.
But the fetch size is only a hint, which means the driver is free to ignore it.
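
A short sketch of that hint in context; setFetchSize is standard JDBC, but whether rows are actually streamed in batches depends on the driver, and the events table is again a made-up name:

    import java.sql.Connection;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class FetchSizeHint {
        static void stream(Connection con) throws SQLException {
            try (Statement st = con.createStatement()) {
                st.setFetchSize(500); // hint: fetch ~500 rows per round trip
                try (ResultSet rs = st.executeQuery("SELECT id, payload FROM events")) {
                    while (rs.next()) {
                        // handle one row, then let it go out of scope
                    }
                }
            }
        }
    }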

Try increasing the memory size to 1.2 GB, e.g. -Xmx1200m, or something just under the physical memory of your machine. You may find it is reading more data at once than you think.

How many rows are you returning from the database? Like kosi2801, I would suggest fetching only a subset of the data: start with a reasonable number and then increase it to find the threshold.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow