Question

I have a problem importing a large CSV file with 400,000 rows into a database. The final import is done by a foreign import script which does some magic in a black box and which I cannot influence. This script is causing the memory_limit problems, but as I said, I have to use it. My problem now is to run the whole import. What I've tried is the following:

loop with while over the original CSV
read 1,000 rows from the CSV
create a new file with these rows
give this new CSV to the doImport() function of the importer

but the foreign script still causes the memory_limit problems. Here is some pseudocode:

$csvLib = new foo();
$foreignImporter = new bar();

// Read the source CSV in chunks of 1,000 rows (pseudocode condition).
while ($thousandRows) {
    // Write the current chunk to a temporary CSV file.
    $tmpFile = $csvLib->writeCSV($thousandRows);
    // Hand the chunk file to the black-box importer.
    $foreignImporter->doImport($tmpFile);
}

In sheer desperation I also tried instantiating the csvLib and foreignImporter inside the loop, setting them to null and unset()ting them after each iteration, but that didn't change anything.
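For reference, a minimal sketch of that attempt, using the placeholder classes from the pseudocode above (the chunk-reading condition is still pseudocode):

while ($thousandRows) {
    // Fresh objects for every chunk ...
    $csvLib = new foo();
    $foreignImporter = new bar();

    $tmpFile = $csvLib->writeCSV($thousandRows);
    $foreignImporter->doImport($tmpFile);

    // ... dropped again right away, hoping the leaked memory gets freed.
    $csvLib = null;
    $foreignImporter = null;
    unset($csvLib, $foreignImporter, $tmpFile);
}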

Any ideas how to run doImport() in small chunks so that it doesn't break? I increased the memory_limit to 2G on my local machine and it got the first 100,000 rows imported, but that is not an option at all.
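For completeness, raising the limit is just a one-liner (in php.ini or at the top of the script), but as described above it only delays the crash:

ini_set('memory_limit', '2G');   // or memory_limit = 2G in php.ini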

No correct solution

Other hints

OK, I found a solution for my problem. The memory-leaking foreign part is outsourced into its own script. My part of the script reads the CSV, loops over the rows, and every 1,000 rows writes a tmp CSV and calls the foreign part in its own script with this tmp CSV. Instead of $foreignImporter->doImport($tmpFile); I do passthru('script.php'). Because each chunk is imported by a fresh PHP process, whatever memory the importer leaks is released when that process exits. That's all. Easy... if you know :)
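A minimal sketch of that setup, assuming a hypothetical source file big.csv, a chunk size of 1,000 and a child script named import_chunk.php (all three names are illustrative, not from the original):

// parent.php - reads the big CSV and spawns a fresh PHP process per chunk,
// so whatever the foreign importer leaks is freed when each child exits.
$source  = fopen('big.csv', 'r');
$rows    = [];
$chunkNo = 0;

while (($row = fgetcsv($source)) !== false) {
    $rows[] = $row;
    if (count($rows) === 1000) {
        runChunk($rows, ++$chunkNo);
        $rows = [];
    }
}
if ($rows) {
    runChunk($rows, ++$chunkNo);   // leftover rows (< 1,000)
}
fclose($source);

function runChunk(array $rows, int $chunkNo): void
{
    // Write the chunk to a temporary CSV file.
    $tmpFile = tempnam(sys_get_temp_dir(), 'chunk_');
    $fh = fopen($tmpFile, 'w');
    foreach ($rows as $row) {
        fputcsv($fh, $row);
    }
    fclose($fh);

    // Run the foreign importer in a separate process and wait for it.
    passthru('php import_chunk.php ' . escapeshellarg($tmpFile), $exitCode);
    if ($exitCode !== 0) {
        fwrite(STDERR, "Chunk $chunkNo failed with exit code $exitCode\n");
    }
    unlink($tmpFile);
}

// import_chunk.php - thin wrapper around the black-box importer.
$foreignImporter = new bar();          // the foreign importer from the question
$foreignImporter->doImport($argv[1]);  // path of the tmp CSV passed by parent.php

The only important part is that doImport() runs in its own short-lived process; how the chunks are produced and how errors are reported is up to you.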
