Question

I am looking to store many (500 million - 900 million) small (2kB-9kB) files.

Most general-purpose filesystems seem unfit for this, as they either cannot handle the sheer number of files, slow down as the file count grows, or have exceedingly large block sizes.

This seems to be a common problem; however, all the solutions I could find seem to end up simply accepting a hit to storage efficiency by storing each small file in an allocation unit roughly the same size as the file itself.

Thus:

Are there any filesystems specifically designed to handle hundreds of millions of small files?

or

Is there a production level solution for archiving the small files on the fly and writing one large file to disk?


Solution

Our SolFS supports page sizes as small as 512 bytes and lets you create a virtual file system inside a single file, thus combining all of your files into one storage file. Performance, though, depends on how the files are organized (hierarchically or in one folder) and is in general specific to your usage scenario.
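To make the "many small files combined into one storage file" idea concrete, here is a minimal, hypothetical sketch of the general technique (an append-only pack file with an in-memory index of offsets). It is not SolFS and not production-ready; a real system would also need a persistent index, fsync-based durability, deletion/compaction, and concurrency control.

```python
import io


class PackFile:
    """Toy pack file: stores many small payloads inside one backing file.

    Illustrative only -- the index lives in memory and is lost when the
    process exits; real implementations persist it alongside the data.
    """

    def __init__(self, stream):
        self.stream = stream   # any seekable binary file-like object
        self.index = {}        # name -> (offset, length)

    def add(self, name, data):
        # Append the payload at the end and record where it landed.
        self.stream.seek(0, io.SEEK_END)
        offset = self.stream.tell()
        self.stream.write(data)
        self.index[name] = (offset, len(data))

    def read(self, name):
        # Seek straight to the recorded offset and read exactly one payload.
        offset, length = self.index[name]
        self.stream.seek(offset)
        return self.stream.read(length)


# Usage: pack two "files" into one backing store (in-memory here;
# on disk this would be a single large file holding millions of entries).
store = PackFile(io.BytesIO())
store.add("a.txt", b"hello")
store.add("b.txt", b"world")
print(store.read("a.txt"))  # b'hello'
```

The key property is that per-file overhead drops to a few index bytes instead of a full filesystem block or inode, which is exactly the storage-efficiency hit the question is trying to avoid.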

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow