Question

OK, so we are serializing/deserializing something, but when you work with data that matters to people, you need to be sure that objects are serialized/deserialized 100% correctly. For example, take the case where your application dies or is forcibly closed during serialization: in most cases you end up with a corrupted object, which is not acceptable for me. I understand that I could manually back up the file, write an "uncommitted" flag before starting serialization, add a "committed" flag at the end of the process, and fall back to the backup during deserialization if the file is still marked uncommitted. That is not good, because I would have to back up a very large file after every change. Is there some built-in fail-safe logic in the popular serializers in the .NET Framework, or in protobuf maybe? I can't find any info about fail-safe serialization. Or can you tell me about a good pattern for making serialization fail-safe?


Solution

Serialization has nothing to do with this. Your actual question is: how do I write a bunch of data atomically to disk? Here are a few options:

  1. If the data is small enough to fit into a single disk sector (512 bytes or 4 KB), use FILE_FLAG_WRITE_THROUGH | FILE_FLAG_NO_BUFFERING to atomically write that sector in a single write call.
  2. Use transactional NTFS to atomically write any amount of data.
  3. Write to a .tmp file and atomically rename it to the target file name (see the sketch after this list).
  4. Use some other means of achieving atomicity, such as a database (maybe even SQLite, ESENT, ...) or the Windows Common Log File System (CLFS).
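
Option 3 is usually the simplest to implement in managed code. Below is a minimal sketch, assuming the serialized payload is already in memory as a byte array; the helper name SaveAtomically and the .tmp/.bak naming are illustrative choices, not part of any framework API. File.Replace swaps the temporary file into place and keeps the previous version as a backup, so a crash at any point leaves either the old file or the new file intact.

```csharp
using System.IO;

static class AtomicFile
{
    // Hypothetical helper: writes the payload to a temp file, flushes it to
    // disk, then swaps it into place so readers only ever see a complete file.
    public static void SaveAtomically(string path, byte[] payload)
    {
        string tempPath = path + ".tmp";
        string backupPath = path + ".bak";

        // Write the new contents to a temporary file first; a crash here
        // leaves the original file untouched.
        using (var stream = new FileStream(tempPath, FileMode.Create,
                                           FileAccess.Write, FileShare.None))
        {
            stream.Write(payload, 0, payload.Length);
            stream.Flush(true); // flush intermediate buffers to the physical disk
        }

        if (File.Exists(path))
        {
            // Swap the temp file into place; the previous version is kept
            // as backupPath instead of being deleted outright.
            File.Replace(tempPath, path, backupPath);
        }
        else
        {
            File.Move(tempPath, path); // first save: a plain move suffices
        }
    }
}
```

On load, if the target file is missing or fails to deserialize but the .bak file is present, falling back to the backup recovers the last successfully committed state without having to copy the whole file on every change.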