Question

I am using the PostgreSQL import tool to populate my development database with data from CSV files.

Is there a way to 'record' the SQL the import tool runs for the import? For example, I go through and select the filename, the encoding, whether there are headers, the delimiter, which fields to populate, etc.

As there is a bit of trial and error involved, once I have a successful import I want to copy that SQL and save it in Evernote so I can run it the next time I need to repopulate the database (truncate then reimport, etc.). I could then also possibly use that SQL in a Ruby rake task so I can run it from the terminal.

Any ideas?

Solution

All that PgAdmin-III's "import tool" does is run the COPY command. If you enable log_statement = 'all' in postgresql.conf and examine the server logs, you'll see it.

You can instead use the psql \copy command from a script to achieve the same thing.
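For example, one way to script that from Ruby is simply to shell out to psql and let \copy stream the file from the client side. This is only a minimal sketch: the database name, table, columns, and CSV filename are placeholders, not anything from the question.

```ruby
# Minimal sketch: run a client-side \copy through psql from Ruby.
# "myapp_development", "users", its columns, and users.csv are all
# illustrative placeholders.
csv_path = File.expand_path("users.csv")

copy_cmd = "\\copy users (name, email) " \
           "FROM '#{csv_path}' WITH (FORMAT csv, HEADER, DELIMITER ',')"

# Passing the arguments separately avoids shell-quoting problems.
ok = system("psql", "-d", "myapp_development", "-c", copy_cmd)
raise "psql \\copy failed" unless ok
```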

That COPY SQL won't work when run through other clients, though: COPY is a special command that switches the connection into a line-by-line input mode, which things like the Ruby Pg gem don't support through their normal query interface. The Pg gem does support COPY via its copy_data method, so you can drive it from Ruby code, but then you have to open the CSV files yourself in Ruby, etc. It's pretty simple, and that's how I'd do it if I wanted to use a Rake task.
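As a rough sketch of what that looks like with copy_data (the connection details, table, columns, and CSV filename are assumptions for illustration, not from the question):

```ruby
require "pg"

# Illustrative connection details and table/column names.
conn = PG.connect(dbname: "myapp_development")

File.open("users.csv", "r") do |file|
  # copy_data issues the COPY ... FROM STDIN and streams whatever we feed it.
  conn.copy_data("COPY users (name, email) FROM STDIN WITH (FORMAT csv, HEADER)") do
    file.each_line { |line| conn.put_copy_data(line) }
  end
end

conn.close
```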

Another option is to use server-side COPY, where you pass PostgreSQL the path of the file to load directly. This is fastest, but requires the PostgreSQL server to have permission to read the file, and the file to be on the same computer as the server. So it's not very flexible.
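If you do want to go that way, a minimal sketch from Ruby might look like this; the path, database, table, and columns are assumptions, and the file must live on the server's own filesystem:

```ruby
require "pg"

conn = PG.connect(dbname: "myapp_development")

# The path is resolved by the *server*, so it must exist on the server's
# filesystem and be readable by the postgres user.
conn.exec(<<~SQL)
  COPY users (name, email)
  FROM '/var/lib/postgresql/import/users.csv'
  WITH (FORMAT csv, HEADER)
SQL

conn.close
```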

I recommend using the Pg gem with copy_data in your Rake script.
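Putting the pieces together, a truncate-then-reimport Rake task might look roughly like this (the task name, table, and file are again just placeholders):

```ruby
require "pg"

namespace :db do
  desc "Truncate users and reload it from users.csv"
  task :reimport_users do
    conn = PG.connect(dbname: "myapp_development")
    conn.exec("TRUNCATE users")

    File.open("users.csv", "r") do |file|
      conn.copy_data("COPY users (name, email) FROM STDIN WITH (FORMAT csv, HEADER)") do
        file.each_line { |line| conn.put_copy_data(line) }
      end
    end

    conn.close
  end
end
```

You could then run it from the terminal with rake db:reimport_users.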

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow