Question

I'm thinking about using an embedded db4o database for storing the calculation results of a simulation tool. The simulation results can be quite large (up to several GB for a single run), so the practical size of the database will probably be in the range of 10 GB to 100 GB.

As far as I understand, db4o stores the whole database in a single file. While db4o reportedly supports databases up to 254 GB, I'm still worried that I might run into problems with file systems that dislike large files (the application I'm working on will run on a wide variety of architectures, so I cannot really predict which file systems will be in place). So, is there any best practice that helps me avoid huge files while still keeping the benefits of an embedded database?

Edit: I just found this post dealing with (really) large amounts of data in db4o in general. However, it does not go into detail about how they achieve this without stressing the file system...

Was it helpful?

Solution

The way to handle very large amounts of data without creating one huge file is to use multiple containers. You should split your data model across multiple containers (i.e., files).

Each file will then contain part of your model. By querying the right container, you can retrieve your objects; any joins (if needed) must be done by hand.
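As a rough sketch of that idea: below, each simulation run gets its own container file, and a "join" between two containers is done by hand by matching on a shared key. The class names, file-naming scheme, and record types are hypothetical; with the real db4o API each container would be an `ObjectContainer` opened via `Db4oEmbedded.openFile(...)`, and the in-memory lists below stand in for container queries so the sketch runs without the db4o jar.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: partition simulation data across several small
// db4o container files instead of one huge database file.
public class MultiContainerSketch {

    // One container file per simulation run keeps each file small.
    // With db4o: Db4oEmbedded.openFile(config, containerFileFor(runId))
    static String containerFileFor(String runId) {
        return "results-" + runId + ".db4o";
    }

    // Stand-ins for objects stored in two separate containers.
    record RunMeta(String runId, String description) {}
    record ResultChunk(String runId, double value) {}

    // A hand-written "join": db4o cannot join across containers, so we
    // query each container separately and match on the shared key (runId).
    static List<Double> resultsForRun(String runId,
                                      List<RunMeta> metaContainer,
                                      List<ResultChunk> resultContainer) {
        List<Double> out = new ArrayList<>();
        boolean known = metaContainer.stream()
                .anyMatch(m -> m.runId().equals(runId));
        if (!known) {
            return out; // no such run in the metadata container
        }
        for (ResultChunk c : resultContainer) {
            if (c.runId().equals(runId)) {
                out.add(c.value());
            }
        }
        return out;
    }
}
```

Since each run lives in its own file, no single file grows beyond one run's worth of data, which sidesteps file-system size limits at the cost of doing cross-container joins yourself.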

If your model has so few classes that splitting it doesn't make sense, then I'm not sure db4o is what you need. A simple file with a very compact serialization format might fit your needs better.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow