Question

My web application needs to store and retrieve data files created by its users (these files are log files that originate from a monitoring device the user operates). A typical file is less than 10 KB. Each file has a creation-date attribute. Usually a user will upload, and later ask to retrieve, several files created on adjacent days at one time.

My question is: should I design my file-handling code to concatenate several user files with adjacent dates and store them together as one file to optimize server performance? In other words, should I be worried about reducing the number of file fetches? Finally, is there a limit, on Linux, to the number of files that can be placed inside a folder?

Thanks, Avi


Solution

For fast retrieval, the folder structure matters on any OS (Windows, Linux, etc.): when a lookup happens, the OS has to search through the directory's file list, so a single large flat folder becomes a bottleneck.

So you can use a folder hierarchy like 2012/12/20, where 2012 is the year, 12 the month, and 20 the day.

This will make retrieval fast, and if you can also index the file metadata (for example, in your application's database), so much the better.
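As a minimal sketch of that layout, assuming a Python back end and a hypothetical STORAGE_ROOT directory, storing each uploaded file under a year/month/day path derived from its creation date could look like this:

```python
import os
from datetime import date

STORAGE_ROOT = "/var/data/monitor-logs"  # hypothetical root; adjust for your server

def path_for(created: date, filename: str) -> str:
    """Build a year/month/day sharded path, e.g. <root>/2012/12/20/device.log."""
    return os.path.join(
        STORAGE_ROOT,
        f"{created.year:04d}",
        f"{created.month:02d}",
        f"{created.day:02d}",
        filename,
    )

def store(created: date, filename: str, data: bytes) -> str:
    """Write one uploaded log file into its date-based folder, creating it if needed."""
    path = path_for(created, filename)
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "wb") as f:
        f.write(data)
    return path

# Example: store a ~10 KB log uploaded for 20 Dec 2012
print(store(date(2012, 12, 20), "device-42.log", b"...log bytes..."))
```

Fetching all of a user's files for a date range then reduces to listing a handful of small day directories instead of scanning one huge folder.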

Linux, like most operating systems, puts no hard limit on the number of files in a folder on modern filesystems such as ext4; the practical ceiling is the filesystem's total inode count, and very large directories can still slow down listing, which is another argument for the date-based layout above.
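If you want to check the real constraint on your own server, the number to watch is the free inode count. As a small sketch (the mount point below is an assumption), Python's standard os.statvfs exposes it on Linux:

```python
import os

def inode_headroom(path: str) -> tuple[int, int]:
    """Return (free_inodes, total_inodes) for the filesystem containing `path`."""
    st = os.statvfs(path)
    return st.f_ffree, st.f_files

# "/var/data" is a hypothetical mount point; use your actual storage volume
free, total = inode_headroom("/var/data")
print(f"{free:,} of {total:,} inodes free")
```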
