Question

In many of my scripts I use SQLite for reporting, and I first need to load my big table data (millions of CSV rows). In the past I have found that .import was quicker than inserting line by line (even inside transactions).

Nowadays my scripts implement a method that makes a system call to sqlite3 db '.import ...'. I wonder whether it is possible to call .import from DBD::SQLite, or whether it would be better to keep calling it via system.

PS: The reason for wanting to call .import from inside DBD::SQLite is to remove the sqlite3 CLI dependency when my software is installed elsewhere.


Solution

.import is an SQLite-specific command, so you won't find a DBI method for it that is independent of the database driver; while any given database engine almost certainly has equivalent functionality, each implements it differently (e.g. SQLite's .import vs. MySQL's LOAD DATA INFILE, etc.).

If you're looking for true engine independence, you'll need to import your data by means of INSERT queries, which in the simplest case can be relied upon to work more or less equivalently everywhere. However, if the difference in execution time is significant enough, it may be worth your while to write an engine-agnostic interface to the import functionality: a wrapper around each engine's specific import command, with the code determining at runtime (from the currently active database driver, or some other method) which wrapper to invoke.
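A minimal sketch of that idea, assuming a hypothetical `bulk_import_csv` helper (the name and the MySQL branch are illustrative, not a real API): dispatch on the driver name, and fall back to prepared INSERTs inside one transaction, which is the portable path and already much faster than autocommitted row-by-row inserts.

```perl
use strict;
use warnings;
use DBI;

# Engine-agnostic bulk CSV import (sketch; names are illustrative).
sub bulk_import_csv {
    my ($dbh, $table, $csv_file) = @_;
    my $driver = $dbh->{Driver}{Name};    # e.g. 'SQLite', 'mysql'

    if ($driver eq 'mysql') {
        # MySQL has a server-side bulk loader; quote the filename safely.
        my $qfile = $dbh->quote($csv_file);
        $dbh->do("LOAD DATA LOCAL INFILE $qfile INTO TABLE $table "
               . "FIELDS TERMINATED BY ','");
        return;
    }

    # Portable fallback: one transaction, one prepared statement.
    open my $fh, '<', $csv_file or die "open $csv_file: $!";
    my $sth;
    $dbh->begin_work;
    while (my $line = <$fh>) {
        chomp $line;
        next unless length $line;
        my @fields = split /,/, $line;    # naive CSV; use Text::CSV for real data
        $sth //= $dbh->prepare(
            "INSERT INTO $table VALUES (" . join(',', ('?') x @fields) . ')'
        );
        $sth->execute(@fields);
    }
    $dbh->commit;
    close $fh;
}
```

The single transaction is what makes the fallback tolerable: without it, SQLite syncs to disk on every row.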

OTHER TIPS

If you are not opposed to "shelling out" (note that a non-zero exit status is reported in $?, not $!):
perl -e 'system(qq(sqlite3 foo.db ".import file.dat table")) == 0 or die "sqlite3 failed: $?"'

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow