Question

I have a database table on a development server that is now fully populated after I ran an import routine against a CSV file containing 1.4 million rows.

I ran the Database Publishing Wizard on the table, and now I have a 286MB SQL script on my local machine. The problem is, I can't figure out how to run it. If I load it into SQL Server Management Studio Express, I get an alert window that says "The operation could not be completed".

Any ideas on how I can get this SQL script to run?

Solution

Running something that large inside a single transaction is not a good idea. SSMS also loads the entire script into the editor before executing it, which is likely why a 286MB file fails with that alert. I'd therefore recommend breaking the file up into smaller, more manageable chunks.

Another option is to look at some of the other ways to import CSV data directly.
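For example, T-SQL's BULK INSERT can load the CSV straight into the table, skipping the 286MB script entirely. A minimal sketch follows; the table name, file path, and CSV layout are assumptions, so adjust them to your schema:

-- Hypothetical table and file path; adjust to match your schema and CSV layout.
BULK INSERT dbo.MyImportTable
FROM 'C:\data\import.csv'
WITH (
    FIELDTERMINATOR = ',',   -- column delimiter in the CSV
    ROWTERMINATOR = '\n',    -- row delimiter
    FIRSTROW = 2,            -- skip the header row, if the file has one
    BATCHSIZE = 50000        -- commit every 50,000 rows instead of one huge transaction
);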

OTHER TIPS

Use the sqlcmd command-line tool to execute the file:

sqlcmd -S myServer\instanceName -i C:\myScript.sql
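sqlcmd reads the file from disk and sends it batch by batch rather than opening it in an editor, so a 286MB script is not a problem. If you also want to target a specific database, log the output, and stop on the first error, something like the following should work (the server, database, and paths are placeholders; -E uses Windows authentication):

sqlcmd -S myServer\instanceName -E -d myDatabase -i C:\myScript.sql -o C:\myScript.log -b

Here -d selects the target database, -o writes the results to a log file, and -b aborts the run and returns an error code if a batch fails.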

If you get an unexplained "script error" with large SQL files (over 100MB) that contain many INSERT statements, replace "INSERT INTO" with "GO" followed by "INSERT INTO", as shown below. This splits the script into many small batches and keeps each transaction small.
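Note that GO is only recognized as a batch separator when it appears on a line by itself, so the replacement needs to include a line break. After the edit, each INSERT sits in its own batch; the table and column names here are made up for illustration:

INSERT INTO dbo.MyImportTable (Col1, Col2) VALUES ('a', 1)
GO
INSERT INTO dbo.MyImportTable (Col1, Col2) VALUES ('b', 2)
GO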

This tool on CodePlex, Big SQL Script File Runner, will run a script file of any size, with logging and a GUI.

Adding to Gulzar Nazim's answer: if you still get a failure, try specifying the codepage of your SQL file with the -f option:

sqlcmd -S myServer\instanceName -d databaseName -i C:\myScript.sql -f 65001

I was trying to import a .dump file from SQLite (UTF-8 by default), and sqlcmd kept throwing an error after encountering the first special character. Using -f 65001 (the UTF-8 codepage) fixed it for me.

Why not just use DTS to import the CSV file directly?
