Question

Originally, my question was about phpMyAdmin's SQL section not working properly. As suggested in the comments, I realized that the amount of input was simply too large for it to handle. However, that didn't give me a workable way to deal with files that (in my case) contain 35 thousand lines of records in CSV format:

...
20120509,126,1590.6,0
20120509,127,1590.7,1
20120509,129,1590.7,6
...

The Import option in phpMyAdmin struggles just as the basic copy-paste input in the SQL section does. As before, it runs for 5 minutes until the maximum execution time is reached, and then it stops. What is interesting, though, is that it inserts around 6-7 thousand records into the table before failing. So the input actually goes through and is almost successful. I also tried halving the amount of data in the file, but nothing changed.

There is clearly something wrong. It is pretty annoying to have to massage the data in a PHP script when a simple data import does not work.


Solution

Increase your PHP upload size limits.

Do you know where your php.ini file is?

First of all, try putting this file into your web root:

phpinfo.php

( see http://php.net/manual/en/function.phpinfo.php )

containing:

<?php
phpinfo();
?>

Then navigate to http://www.yoursite.com/phpinfo.php

Look for the "Loaded Configuration File" entry; it shows the full path to your php.ini.

To upload large files you need to raise max_execution_time, post_max_size, and upload_max_filesize. Note that upload_max_filesize should be no larger than post_max_size, or the latter becomes the effective limit.
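
For a 35-thousand-line CSV, values along these lines in php.ini should be more than enough (the exact numbers here are illustrative, not requirements):

upload_max_filesize = 64M
post_max_size = 64M
max_execution_time = 300

Restart your web server (or PHP-FPM) after editing php.ini so the new values take effect.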

Also, do you know where your error.log file is? It would hopefully give you a clue as to what is going wrong.

EDIT:

Here is the query I use for the file import:

$query = "LOAD DATA LOCAL INFILE '$file_name' INTO TABLE `$table_name`
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
    LINES TERMINATED BY '$nl'";

Where $file_name is the temporary file name from the PHP superglobal $_FILES, $table_name is a table already prepared for the import, and $nl is a variable holding the CSV line endings (defaulting to Windows line endings, though I have an option to select Linux line endings).
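
As a minimal sketch of how that query might be wired up (the connection credentials, the csv_import table name, the csv upload field name, and the Unix line ending are all assumptions for illustration; LOAD DATA LOCAL INFILE must also be permitted by the server's local_infile setting):

<?php
// Sketch only: load an uploaded CSV via LOAD DATA LOCAL INFILE.
$mysqli = mysqli_init();
// Allow the client library to send a local file to the server.
$mysqli->options(MYSQLI_OPT_LOCAL_INFILE, true);
$mysqli->real_connect('localhost', 'user', 'password', 'database');

$file_name  = $mysqli->real_escape_string($_FILES['csv']['tmp_name']);
$table_name = 'csv_import'; // fixed table name, not user input
$nl         = "\n";         // assuming Unix line endings here

$query = "LOAD DATA LOCAL INFILE '$file_name' INTO TABLE `$table_name`
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
    LINES TERMINATED BY '$nl'";

if ($mysqli->query($query)) {
    echo $mysqli->affected_rows . " rows imported\n";
} else {
    echo "Import failed: " . $mysqli->error . "\n";
}
?>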

The other thing is that the table ($table_name) in my script is prepared in advance by first scanning the CSV to determine column types. Once it has determined appropriate column types, it creates the MySQL table to receive the data.

I suggest you create the MySQL table definition first, to match what's in the file (data types, character lengths, etc.). Then try the above query and see how fast it runs. I don't know how much of a factor the table definition is in the speed.
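
For the sample rows in the question, a matching definition might look like this (the column names here are guesses; only the types are inferred from the data):

CREATE TABLE `csv_import` (
    `record_date` INT UNSIGNED NOT NULL,  -- e.g. 20120509
    `sequence`    INT NOT NULL,           -- e.g. 126
    `price`       DECIMAL(10,1) NOT NULL, -- e.g. 1590.6
    `flag`        INT NOT NULL            -- e.g. 0
);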

Also, I have no indexes defined in the table until AFTER the data is loaded. Indexes slow down data loading.
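
So the order is: create the bare table, run the LOAD DATA query, then add the indexes, for example (the index name and column are again assumptions):

ALTER TABLE `csv_import` ADD INDEX `idx_record_date` (`record_date`);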

Licensed under: CC-BY-SA with attribution