Question

A real head scratcher this one - any help would be gratefully received.

I have been using the ZipArchive class to extract CSV files from a zip.

Oddly, it will only extract 40 files properly: files with an index of 40 or greater appear as empty files, while files 0-39 extract perfectly.

This is the case regardless of the combination of files or their sizes. I have tried removing the 39th and 40th files from the zip and the problem just moves. No matter what combination of files I use, it extracts 40 files properly and then just dies.

Thanks to this forum, I have tried using shell_exec with exactly the same outcome. I have also tried extracting the files one at a time, using a zip containing only the CSV files, and using zips with multiple different file types. Only 40 are ever extracted.
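
For the record, the shell_exec attempt was along these lines (a rough sketch rather than the exact code; it assumes the unzip binary is installed on the host, and the paths are the same placeholders used in the PHP snippets below):

    // Rough sketch only: extract the *.csv entries by shelling out to unzip
    // (assumes the unzip binary exists on the host; placeholder paths)
    $cmd = 'unzip -o ' . escapeshellarg('Directory/zipname.zip')
         . ' "*.csv" -d ' . escapeshellarg('Directory/') . ' 2>&1';
    echo shell_exec($cmd); // unzip's own messages, including any write errors, show up here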

This is such a suspiciously round number that it must surely be a setting somewhere that I cannot find, or otherwise a bug.

For what it is worth, the unzipping code is below:

    $zip = new ZipArchive;
    // Note: open() returns TRUE on success or an integer error code on failure,
    // so the comparison needs to be strict (===)
    if ($zip->open('Directory/zipname.zip') === TRUE) {
        for ($i = 0; $i < $zip->numFiles; $i++) {
            $filename = $zip->getNameIndex($i);
            // Only extract entries with a .csv extension
            if (substr(strrchr($filename, '.'), 1, 3) == "csv") {
                $zip->extractTo('Directory/', $filename);
            }
        }
        $zip->close();
    }
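
(One thing worth noting for anyone hitting the same wall: extractTo() returns FALSE when it fails, so a version of the loop that checks the return value will at least report which entry goes wrong. A minimal sketch, using the same placeholder paths and pathinfo() for the extension check:)

    // Sketch: the same loop as above, but reporting any extraction that fails
    $zip = new ZipArchive;
    if ($zip->open('Directory/zipname.zip') === TRUE) {
        for ($i = 0; $i < $zip->numFiles; $i++) {
            $filename = $zip->getNameIndex($i);
            if (strtolower(pathinfo($filename, PATHINFO_EXTENSION)) === 'csv') {
                if (!$zip->extractTo('Directory/', $filename)) {
                    // getStatusString() returns the archive's status message and may hint at the cause
                    echo "Failed to extract $filename: " . $zip->getStatusString() . "\n";
                }
            }
        }
        $zip->close();
    }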

I have also tried the following, which uses a different method, with the same results :-(

    $zip2 = new ZipArchive;
    if ($zip2->open('Directory/zipname.zip') === TRUE) {
        for ($i = 0; $i < $zip2->numFiles; $i++) {
            $filename = $zip2->getNameIndex($i);
            if (substr(strrchr($filename, '.'), 1, 3) == "csv") {
                // Read the entry into memory and write it out by hand,
                // each entry under its own name in the target directory
                $content = $zip2->getFromIndex($i);
                $thefile = fopen('Directory/' . basename($filename), "w");
                fwrite($thefile, $content);
                fclose($thefile);
            }
        }
        $zip2->close();
    }
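
The manual-write variant can be checked in a similar way: fwrite() returns the number of bytes actually written, so a failed or short write is easy to spot. A minimal sketch, assuming $thefile, $content and $filename as in the loop above:

    // Sketch: detect a failed or short write on the entry just opened
    $written = fwrite($thefile, $content);
    if ($written === false || $written < strlen($content)) {
        echo "Short write for $filename: " . (int) $written . " of " . strlen($content) . " bytes\n";
    }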

Solution

FINALLY found the answer. Thanks to all who tried to help.

For others suffering in the same way, the problem was solved by increasing the server disk allocation. I was on a rather old hosting plan that had served well until the advent of a new national database increased the amount of data to be stored tenfold. A measly 100 MB allowance meant that the server would only do so much before spitting the dummy.

Interestingly, a similar problem occurred when trying other file operations - the server seemed to be limited to 40 file operations per script, regardless of the size of each file.
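
If you want to catch this in code rather than by trial and error, checking the return values of extractTo() and fwrite() (as sketched in the question) is the most reliable signal, since a per-account quota on shared hosting is not necessarily visible to PHP. Where the target directory sits on its own volume, something like the following sketch can at least warn when space is getting low (the 10 MB threshold and the path are just placeholders):

    // Sketch: warn before extracting if the target volume is nearly full
    $free = disk_free_space('Directory/');
    if ($free !== false && $free < 10 * 1024 * 1024) {
        echo 'Warning: only ' . round($free / 1048576, 1) . " MB free on the target volume\n";
    }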

License: CC BY-SA with attribution