Loop through the file and insert each row using a prepared statement. Prepared statements should also be quicker, since the database doesn't have to recompile every SQL string you send it; that difference becomes noticeable when you're inserting thousands and thousands of lines.
<?php
// assume $db is a PDO connection
$stmt = $db->prepare('INSERT INTO table (col1, col2, col3) VALUES (?, ?, ?)');

// read the file contents into an array of lines
$lines = file('file.csv');

// insert each line
foreach ($lines as $line) {
    // see the manual to specify $delimiter, $enclosure, or $escape
    $cols = str_getcsv($line);
    $stmt->execute($cols);
}
That'll work, but since we're using file(), the script can consume a lot of memory if your CSV file is huge. To make better use of resources, read one line at a time so only a single line is in memory at once:
<?php
// assume $db is a PDO connection
$stmt = $db->prepare('INSERT INTO table (col1, col2, col3) VALUES (?, ?, ?)');

$handle = fopen('test.csv', 'r');
while (($cols = fgetcsv($handle)) !== false) {
    $stmt->execute($cols);
}
fclose($handle);