Question

I have a PHP script that runs every hour on my server (a scheduled job). Users can upload Excel files with data; the script compares the data in each Excel file against the database and does something with it (not really relevant: it inserts rows into the database if matching rows don't exist). To read an Excel file I call:

require_once 'Excel/reader.php'; // include path depends on how the library is installed
$data = new Spreadsheet_Excel_Reader();
$data->read($file);
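For context, with the classic Spreadsheet_Excel_Reader the parsed workbook ends up in the `$data->sheets` array. A minimal iteration sketch, assuming that library's usual `numRows`/`numCols`/`cells` layout (1-based indices), looks like this:

```php
<?php
// Sketch only: assumes the classic Spreadsheet_Excel_Reader API,
// where sheet 0 exposes 'numRows', 'numCols' and 1-based 'cells'.
$data = new Spreadsheet_Excel_Reader();
$data->read($file);

$sheet = $data->sheets[0];
for ($row = 1; $row <= $sheet['numRows']; $row++) {
    for ($col = 1; $col <= $sheet['numCols']; $col++) {
        $cell = isset($sheet['cells'][$row][$col]) ? $sheet['cells'][$row][$col] : null;
        // ... compare $cell against the database here ...
    }
}
```

Note that the whole sheet is held in memory as one nested array, which matters for the memory problem described below.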

Users can upload several XLS files, stored on the server as File_1.xls, File_2.xls, File_3.xls, and so on. I read the first file, fetch all the data, do all the comparisons and all the inserts, then move on to the next. This can take a while; sometimes a file has over 6000 rows in Excel. I've noticed that if the job runs too long, for example it processes File_1 and File_2 but File_2 is a big one with 6000+ rows, I get an Abort in my error log. I think the Excel reader is at fault. Any ideas?


Solution

As I understand it, your server does not have enough memory (RAM) to hold multiple file handles and large arrays of row data at the same time. Either increase the memory available to the script (or the server), or use a different algorithm that reads the XLS data in smaller pieces and releases each file's data before doing the next compare-and-insert pass.
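A minimal sketch of both options; the `memory_limit` value, the `File_*.xls` glob pattern, and the per-file cleanup are illustrative assumptions, not values from the question:

```php
<?php
// Option 1: raise PHP's memory limit for this long-running script.
// '256M' is an arbitrary example value; tune it to your server.
ini_set('memory_limit', '256M');

// Option 2: process one file at a time and release the parsed data
// before opening the next, so only one workbook sits in RAM at once.
foreach (glob('File_*.xls') as $file) {
    $data = new Spreadsheet_Excel_Reader();
    $data->read($file);

    // ... compare rows against the database and insert missing ones ...

    unset($data);        // drop the parsed workbook before the next iteration
    gc_collect_cycles(); // optional: reclaim cyclic references between files
}
```

If even a single 6000+ row file doesn't fit in memory, the real fix is a reader that streams rows instead of loading the whole sheet into one array.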

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow