Question

I'm using serialization for the "save" feature in my application. But when the data gets too big (15+ MB) I start getting OutOfMemoryException errors.

I've got a lot of objects, each connected to other small objects; I think this is what's consuming so much processing power and memory.

My code is based on this (it's almost the same):

http://www.codeproject.com/KB/vb/TreeViewDataAccess.aspx

Edit :

  1. I don't use custom serialization; it's all done with [Serializable] attributes, excluding some fields.

  2. I serialize a lot of objects and custom classes, including Dictionary, structs and a bunch of other stuff.

  3. I serialize it into a file.

  4. I use XmlSerializer

P.S. I've got 4 GB physical memory.

Solution

Thanks to the answers, the problem turned out to be with XmlSerializer, and I've gotten rid of it. Binary serialization works just fine with the data I've got.

Solution

15 MB shouldn't give you an OOM.

If the data is tree-like (rather than a full graph), you might consider a serializer like protobuf-net; as well as using Google's very efficient (both speed and memory) binary "protocol buffers" format, it benefits from not having to do reference tracking (required for graphs) - which means it only has to worry about data once (twice if it has to get buffered).

However, this requires different markup to your classes (or at least, an "opt in") - and it won't handle full graphs. But it is there, and free...
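As a rough illustration of the "opt in" markup mentioned above, here is a minimal sketch of what a protobuf-net contract might look like; the `SaveNode` type, member names and file name are hypothetical, not from the question:

```csharp
using System.Collections.Generic;
using System.IO;
using ProtoBuf;

// protobuf-net uses explicit contract attributes rather than [Serializable];
// the numbers identify each member in the wire format.
[ProtoContract]
public class SaveNode
{
    [ProtoMember(1)] public string Name { get; set; }
    [ProtoMember(2)] public List<SaveNode> Children { get; set; }
}

// Serializing straight to the file stream avoids building the
// whole payload in memory first.
using (var file = File.Create("save.bin"))
{
    Serializer.Serialize(file, rootNode);
}
```

Note that this only works for tree-shaped data: if two parents reference the same child, it gets written twice rather than shared.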

OTHER TIPS

I had exactly the same problem. The reason is that .NET serialization does not scale.

I solved the problem by using Simon Hewitt's excellent open source library, see Optimizing Serialization in .NET - part 2.

Besides dramatically reducing memory usage, it is also much faster. As in the article, I got a speed-up of about 20 times.

Actually, XmlSerializer ignores the SerializableAttribute attributes. They're used only by the formatter classes (BinaryFormatter, SoapFormatter).

I wouldn't serialize using the XmlSerializer, and especially not a combination of XmlSerializer and BinaryFormatter.

I would simply try to serialize everything using the BinaryFormatter.
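A minimal sketch of that approach, serializing directly to a file stream so no large in-memory buffer is needed (the `SaveData` type and file name are placeholders, not from the question):

```csharp
using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

[Serializable]
public class SaveData
{
    public string Title;
    [NonSerialized] public int CachedValue; // excluded fields are skipped
}

var formatter = new BinaryFormatter();

// Save: write straight to the file stream.
using (var stream = File.Create("save.dat"))
{
    formatter.Serialize(stream, saveData);
}

// Load: deserialize from the same stream and cast back.
using (var stream = File.OpenRead("save.dat"))
{
    var data = (SaveData)formatter.Deserialize(stream);
}
```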

You could write your own serialization routines and see if you can gain any performance benefits by hand-tailoring your serialization process. For more details, see the MSDN page on Custom Serialization.
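A sketch of what hand-tailored serialization can look like via `ISerializable`; the `Node` type and its members are invented for illustration. The point is that you write out only what is needed to restore the object, instead of letting the formatter walk every field:

```csharp
using System;
using System.Runtime.Serialization;

[Serializable]
public class Node : ISerializable
{
    public string Name;
    [NonSerialized] public Node Parent; // rebuilt after load, never stored

    public Node(string name) { Name = name; }

    // Called during serialization: store only the essentials.
    public void GetObjectData(SerializationInfo info, StreamingContext context)
    {
        info.AddValue("name", Name);
    }

    // Special constructor called during deserialization.
    protected Node(SerializationInfo info, StreamingContext context)
    {
        Name = info.GetString("name");
    }
}
```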

Maybe you can give us a little more detail about how the serialization is done. Do you use custom serialization, or just the built-in [Serializable] attribute?

I think a good way to handle this is to write your own serialization logic and serialize only what you need. It shouldn't come anywhere near 4 GB that way; in any case, it also depends on how much memory is assigned to your application.

With all the approaches mentioned here, the ease of dumping big objects to disk and recovering them is lost. Also, these only support plain data types, so you can't dump reference types as easily as you can with BinaryFormatter.

Also, doing compression with gzip or 7-Zip before binary formatting of large objects actually moved the size above 16 MB to something like 32 MB.
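If you do want to combine compression with binary serialization, one way is to wrap the file stream in a `GZipStream` so the bytes are compressed as they are written, with no second in-memory copy. A sketch (the `saveData` object and file name are placeholders):

```csharp
using System.IO;
using System.IO.Compression;
using System.Runtime.Serialization.Formatters.Binary;

// The formatter writes into the gzip stream, which compresses
// on the fly and forwards the result to the file.
using (var file = File.Create("save.dat.gz"))
using (var gzip = new GZipStream(file, CompressionMode.Compress))
{
    new BinaryFormatter().Serialize(gzip, saveData);
}
```

Reading back is symmetric: open the file, wrap it in a `GZipStream` with `CompressionMode.Decompress`, and call `Deserialize` on that stream.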

You could download the JSON.NET library, which in my project handles serialization and deserialization of more than 100 MB of data.

For serialization, you can do something like this:

If you have an object, use a TextWriter:

using (TextWriter textWriter = File.CreateText("LocalJsonFile.json"))
{
    var serializer = new JsonSerializer();
    serializer.Serialize(textWriter, yourObject);
}

If you want the result as a string, use a StringWriter:

StringBuilder sb = new StringBuilder();
StringWriter sw = new StringWriter(sb);

using (JsonWriter textWriter = new JsonTextWriter(sw))
{
    var serializer = new JsonSerializer();
    serializer.Serialize(textWriter, yourObject);
}

This may work for you.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow