Question

The theory says that we can split a file into N fragments and later recover it from only P of those fragments, where P < N. (This is what's called an erasure code.)
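For intuition, here is a minimal self-contained Python sketch of that idea, using polynomial interpolation over a prime field. This is the same principle Reed-Solomon erasure codes are built on, but it is illustrative only: real libraries work over GF(2**8) so every symbol fits in a byte (with P = 257 a parity value can be 256), and they are far more efficient.

import random

P = 257  # smallest prime above 255: each byte is one field symbol

def _eval_at(points, x):
    # Evaluate the unique polynomial passing through `points`
    # at `x`, via Lagrange interpolation over GF(P).
    total = 0
    for i, (xi, yi) in enumerate(points):
        num = den = 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * (x - xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

def encode(data, n):
    # Let f be the unique degree-(k-1) polynomial with f(i) = data[i].
    # The fragments are (x, f(x)) for x = 0..n-1: the first k of them
    # are the data itself ("systematic" layout), the rest are parity.
    points = list(enumerate(data))
    return [(x, _eval_at(points, x)) for x in range(n)]

def decode(fragments, k):
    # Any k fragments determine f, so read the data off as f(0..k-1).
    return bytes(_eval_at(fragments[:k], x) for x in range(k))

# Any 4 of the 6 fragments recover the 4 data bytes.
fragments = encode(b"data", n=6)
random.shuffle(fragments)
assert decode(fragments, k=4) == b"data"

Because the code is systematic, nothing needs decoding at all when the original fragments are still available; parity is only touched on loss.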

I'm trying to build something like Symform, or really just a subset of it.

Each block is shredded into 64 fragments, with 32 parity fragments added for redundancy before the pieces are stored in the cloud.
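In coding terms that is a (96, 64) erasure code: k = 64 data fragments plus m = 32 parity fragments make n = 96 fragments in total, and any 64 of the 96 are enough to rebuild the block. Up to 32 fragments (a third of them) can be lost, at a storage overhead of 96/64 = 1.5x.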

Is there an open-source solution that I can reuse? Or maybe a link that describes the algorithm in more detail?


Solution

This is a very nice C and Python solution. I'm still looking for something similar in Java.

"Generate redundant blocks of information such that if some of the blocks are lost then the original data can be recovered from the remaining blocks. This package includes command-line tools, C API, Python API, and Haskell API."
