Question

D2010 Pro, Win Vista 32bit:

I need to parse a directory and its subfolders for files and read the file information (name, type, size, date modified ... the Windows Explorer columns) into memory. So far I have considered using an in-memory table for this task (seems to be the obvious choice), but I'm still hesitant because of the overhead. A TDictionary of objects might also be an option, but I'm not sure. Most important is speed and the ability to perform a sort on columns - ascending or descending.

Would really appreciate your thoughts and suggestions.

Thanks -Phil

Solution

Using an in-memory database is one option, but it seems a rather heavy-duty solution for what is really quite a simple problem. Another option to consider is a TList<TFileDetails>, where TFileDetails is a record containing the various file details.
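As a rough sketch (the record fields and routine names here are just placeholders; pick whichever Explorer columns you actually need), filling such a list with a recursive FindFirst/FindNext scan might look something like this:

    uses
      SysUtils, Generics.Collections;

    type
      TFileDetails = record
        Name: string;        // full path and file name
        Size: Int64;         // size in bytes
        Modified: TDateTime; // last-modified date
      end;

    // Recursively scan Path and its subfolders, adding one record per file.
    procedure ScanFolder(const Path: string; Files: TList<TFileDetails>);
    var
      SR: TSearchRec;
      Details: TFileDetails;
    begin
      if FindFirst(IncludeTrailingPathDelimiter(Path) + '*', faAnyFile, SR) = 0 then
      try
        repeat
          if (SR.Name = '.') or (SR.Name = '..') then
            Continue;
          if (SR.Attr and faDirectory) <> 0 then
            ScanFolder(IncludeTrailingPathDelimiter(Path) + SR.Name, Files)
          else
          begin
            Details.Name := IncludeTrailingPathDelimiter(Path) + SR.Name;
            Details.Size := SR.Size;
            Details.Modified := FileDateToDateTime(SR.Time);
            Files.Add(Details);
          end;
        until FindNext(SR) <> 0;
      finally
        FindClose(SR);
      end;
    end;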

As for sorting, the most efficient approach would be to maintain an index for each column that you want to sort by. An index here is simply an array of integers that represents the order of the records when sorted by a particular column. For example, an index array of [1, 2, 0] means that the first item in sorted order is record 1, the second is record 2 and the third is record 0. Doing this means that you only need to sort each column if and when you need to, and only do so once.
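A sketch of that idea, reusing the hypothetical TFileDetails record from above: the records themselves never move, only the integer index is sorted, here by the Size field:

    uses
      SysUtils, Generics.Collections, Generics.Defaults;

    // Build an index for the Size column: Result[i] holds the position in
    // Files of the i-th record when ordered by size. Files itself is untouched.
    function BuildSizeIndex(Files: TList<TFileDetails>): TArray<Integer>;
    var
      i: Integer;
    begin
      SetLength(Result, Files.Count);
      for i := 0 to Files.Count - 1 do
        Result[i] := i;
      TArray.Sort<Integer>(Result,
        TComparer<Integer>.Construct(
          function(const A, B: Integer): Integer
          begin
            if Files[A].Size < Files[B].Size then
              Result := -1
            else if Files[A].Size > Files[B].Size then
              Result := 1
            else
              Result := 0;
          end));
    end;

To display the list in descending order you can simply walk the same index array backwards rather than building a second one.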

That said, you only ever need to maintain separate index arrays if you have a lot of files. You may find that performance is acceptable if you simply re-sort on demand whenever you need to. I'm sure that's what Explorer does.
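For the sort-on-demand route, TList<T> has a Sort overload that takes a comparer, so something along these lines should do (again using the assumed TFileDetails fields; the Sign variable flips ascending/descending):

    uses
      SysUtils, Generics.Collections, Generics.Defaults;

    // Re-sort the whole list in place, e.g. when the user clicks the Name column.
    procedure SortByName(Files: TList<TFileDetails>; Descending: Boolean);
    var
      Sign: Integer;
    begin
      if Descending then
        Sign := -1
      else
        Sign := 1;
      Files.Sort(TComparer<TFileDetails>.Construct(
        function(const A, B: TFileDetails): Integer
        begin
          Result := Sign * CompareText(A.Name, B.Name);
        end));
    end;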

I can't give you advice on which option is right for your problem because I don't know your performance constraints. But to summarise, here are the main options in order of increasing complexity:

  1. Use TList<TFileDetails> and re-sort the list on demand.
  2. Use TList<TFileDetails> and build and retain index arrays on demand.
  3. Use an in-memory database.