Sorry for the undescriptive title, wasn't too sure what to title this :-)

I have written an API that loads required JavaScript libraries from a directory on my server. The directories have a specific format; the only thing that can differ is the file name format within each library's directory.

The Directory Format

_js/_source/_library_name_here/file_name_here.js

e.g.

/_js/_source/_fancybox/jQuery.lightbox-0.5.js

The array (currently hardcoded)

At the moment I have the separate libraries stored in an array (hardcoded) like so:

$js_libraries = array(
    'fancybox'       => '/_js/_source/_fancybox/jQuery.lightbox-0.5.js',
    'something_else' => '/_js/_source/_something_else/jQuery.something.js'
);

Potential Alternative

This API would be a lot more dynamic if the $js_libraries array were built automatically by scanning the '_source' directory and populating the array from its contents. I would do this with something like (not tested, just an example!):

function gather_files($directory){
    $libraries = array();
    foreach(scandir($directory) as $entry){
        if($entry === '.' || $entry === '..'){
            continue;
        }
        $path = $directory.'/'.$entry;
        if(is_dir($path)){
            // Recurse into the library's sub-directory and merge its files in.
            $libraries = array_merge($libraries, gather_files($path));
        } elseif(substr($entry, -3) === '.js'){
            // Key by the containing directory name, minus its leading underscore.
            $libraries[ltrim(basename($directory), '_')] = $path;
        }
    }
    return $libraries;
}

$js_libraries = gather_files(dirname(__FILE__));

The above is just a sketch; I only wanted to demonstrate what I mean by building the array automatically from the contents of the directory.

My Question

Quite simply, the key requirement of this API is speed: it returns packed/minified (by PHP, on the fly) JavaScript files to an HTML page, so it cannot introduce any lag, as that would delay the initiation of the page. What I would like to know is: will the automatic method be noticeably slower when there are a lot of libraries in this directory? Should I just stick with the hardcoded array?


Solution

Always benchmark

For actual speed, listing deep filesystem hierarchies is probably more IO-bound than anything else, but please don't believe me, and don't believe your own intuition: always benchmark.
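As a minimal sketch of such a benchmark (the $source path and the $runs count are assumptions; run this on your real server, since the numbers depend heavily on the OS filesystem cache):

```php
<?php
$source = '/_js/_source'; // hypothetical path, point at your real _source dir
$runs   = 1000;

// Time repeated directory scans.
$start = microtime(true);
for ($i = 0; $i < $runs; $i++) {
    $entries = is_dir($source) ? scandir($source) : array();
}
$scan_time = microtime(true) - $start;

// Time repeated lookups in a hardcoded array, for comparison.
$js_libraries = array(
    'fancybox' => '/_js/_source/_fancybox/jQuery.lightbox-0.5.js'
);
$start = microtime(true);
for ($i = 0; $i < $runs; $i++) {
    $file = $js_libraries['fancybox'];
}
$array_time = microtime(true) - $start;

printf("scandir: %.4fs  array: %.4fs\n", $scan_time, $array_time);
```

Compare warm-cache and cold-cache runs before drawing any conclusion.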

As for optimization: speed is not the only criterion. You could aim for low memory usage instead, and use the GlobIterator class, or combine RecursiveDirectoryIterator, a custom subclass of FilterIterator, and RecursiveIteratorIterator to get an object that can be used in foreach and returns the desired files one at a time.
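A sketch of both approaches, assuming the directory layout from the question ($source is a hypothetical path, and RegexIterator, itself a FilterIterator subclass, stands in for the custom subclass):

```php
<?php
$source = '/_js/_source'; // hypothetical absolute path on the server

// Option 1: GlobIterator matches a pattern lazily, one file at a time.
foreach (new GlobIterator($source . '/_*/*.js') as $fileInfo) {
    echo $fileInfo->getPathname(), "\n"; // $fileInfo is an SplFileInfo
}

// Option 2: recurse the whole tree, filtering down to .js files.
if (is_dir($source)) {
    $files = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($source, FilesystemIterator::SKIP_DOTS)
    );
    foreach (new RegexIterator($files, '/\.js$/') as $file) {
        echo $file->getPathname(), "\n";
    }
}
```

Neither builds the full file list in memory, which matters once the directory holds many libraries.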

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow