Question

Risk factors for file fragmentation include mostly-full disks and repeated file appends. What are other risk factors for file fragmentation? And how would one write a program in a common language (C++, C#, VB, or VB.NET) that works with files and creates new files with the goal of increasing file fragmentation?

WinXP / NTFS is the target

Edit: Would something like this be a good approach? With the drive's free space at the start = FreeMB_atStart:

  • Create files of, say, 10 MB each until 90% of the remaining free space is filled
  • Delete every 3rd created file
  • Create a file of size FreeMB_atStart * .92 / 3

Solution

This should achieve at least some level of fragmentation on most file systems:

  1. Write numerous small files,
  2. Delete some of them at random,
  3. Write a large file, byte by byte.

Writing it byte by byte is important: otherwise an intelligent file system can buffer the whole file and write it out to a single contiguous run of free space.

OTHER TIPS

Another possibility would be to write several files simultaneously, byte by byte. This would probably have an even stronger effect.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow