Question

I am currently working on a data management system that needs to perform calculations over huge amounts of data: think of an Excel sheet with about 150 million cells of data.

We use a SQL Server database to store the data, but calculation performance is sub-optimal for a number of reasons. While considering alternatives, I thought about giving in-memory object structures a try.

Here are the basic requirements:

  1. We need a structure that supports multiple users making updates on the same "sheet" of my Excel analogy. If the performance of a single update is high enough, serializing the updates of all users is acceptable

  2. We absolutely cannot accept data loss

  3. We need indexes on the object key, but also some indexes on partial keys, so the in-memory structure must support non-unique keys and allow efficient searches on a partial key that return a collection of matching records (see the sketch after this list)

  4. Must support .NET
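
To make requirement 3 concrete, here is a minimal sketch of the kind of structure I mean. It is written in Java purely for illustration and every name in it is invented; in .NET the natural equivalents would be a Dictionary<TKey, List<T>> or an ILookup<TKey, T> produced by LINQ's ToLookup.

```java
import java.util.*;

// Illustrative only: a non-unique index that maps a partial key
// (e.g. the "row" part of a composite row/column key) to every
// matching record. All type and member names here are invented.
class PartialKeyIndex<K, V> {
    private final Map<K, List<V>> buckets = new HashMap<>();

    // Add a record under a (possibly non-unique) partial key.
    void add(K partialKey, V record) {
        buckets.computeIfAbsent(partialKey, k -> new ArrayList<>()).add(record);
    }

    // Return all records matching the partial key (empty list if none).
    List<V> find(K partialKey) {
        return buckets.getOrDefault(partialKey, Collections.emptyList());
    }
}

public class Demo {
    public static void main(String[] args) {
        PartialKeyIndex<String, String> byRow = new PartialKeyIndex<>();
        byRow.add("row42", "row42/colA = 3.14");
        byRow.add("row42", "row42/colB = 2.72");
        System.out.println(byRow.find("row42")); // both cells of row42
    }
}
```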

Given these conditions, any suggestions please?

Thank you,

Kemal


Solution

What about a NoSQL database instead of a relational one? Something like MongoDB or RavenDB?

MongoDB keeps its working set in memory and memory-maps its data files, and I believe RavenDB can be configured to run purely in-memory.

There are various flavours of NoSQL databases too. Some are geared for 'read-heavy' apps and some for 'write-heavy' apps.

You might also want to look at CQRS if you'd benefit from pre-calculating common searches or calculations.
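
To illustrate the CQRS idea (this is just a sketch with invented names, collapsed into one class for brevity): updates go through the command side, which also keeps a denormalised read model up to date, so common queries don't have to re-scan millions of cells.

```java
import java.util.*;
import java.util.concurrent.*;

// Hypothetical sketch of CQRS-style separation: commands mutate the
// write model and refresh a pre-computed read model alongside it,
// so common queries (here: a per-row total) are answered instantly.
class SheetWriteModel {
    private final Map<String, Double> cells = new ConcurrentHashMap<>();     // "row/col" -> value
    private final Map<String, Double> rowTotals = new ConcurrentHashMap<>(); // read model

    // Command side: apply an update and keep the read model in sync.
    void setCell(String row, String col, double value) {
        Double old = cells.put(row + "/" + col, value);
        rowTotals.merge(row, value - (old == null ? 0.0 : old), Double::sum);
    }

    // Query side: pre-calculated answer, no scan over 150 million cells.
    double rowTotal(String row) {
        return rowTotals.getOrDefault(row, 0.0);
    }
}

public class CqrsDemo {
    public static void main(String[] args) {
        SheetWriteModel sheet = new SheetWriteModel();
        sheet.setCell("row42", "colA", 3.0);
        sheet.setCell("row42", "colB", 4.5);
        System.out.println(sheet.rowTotal("row42")); // 7.5
    }
}
```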

I would expect the 'no data loss' requirement to be your biggest deciding factor though.
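
Whichever store you choose, the usual way to combine in-memory speed with 'no data loss' is a write-ahead log: append and flush each update to disk before applying it in memory, and replay the log on restart. Here is a bare-bones sketch of that pattern, with invented names.

```java
import java.io.*;
import java.nio.file.*;
import java.util.*;

// Bare-bones write-ahead log sketch (names invented): every update is
// appended and flushed to disk before the in-memory map is touched, so a
// crash can lose at most an unacknowledged update, never an acknowledged one.
class DurableSheet {
    private final Map<String, Double> cells = new HashMap<>();
    private final FileOutputStream log;

    DurableSheet(Path logFile) throws IOException {
        log = new FileOutputStream(logFile.toFile(), true); // append mode
        // (A real implementation would replay the existing log here.)
    }

    synchronized void setCell(String key, double value) throws IOException {
        String entry = key + "=" + value + "\n";
        log.write(entry.getBytes("UTF-8"));
        log.getChannel().force(true);   // flush to disk before acknowledging
        cells.put(key, value);          // only then update the in-memory state
    }

    synchronized Double get(String key) {
        return cells.get(key);
    }
}

public class WalDemo {
    public static void main(String[] args) throws IOException {
        DurableSheet sheet = new DurableSheet(Paths.get("sheet.log"));
        sheet.setCell("row42/colA", 3.14);
        System.out.println(sheet.get("row42/colA")); // 3.14
    }
}
```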

OTHER TIPS

Raima's RDM is another NoSQL database that can run in-memory, on-disk, or as a hybrid of the two.

Regarding data loss, there are extensions available to provide mirroring and replication.

Here is a link to the complete technical specs of their database: RDM Embedded 10.1 Architecture and Features

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow