Question

I am looking for stored procedure code that will open a text file, read in several thousand lines, and add the data to a table in the database. Is there a simple way to implement this in T-SQL?

Solution

If the file is ready to load "as-is" (no data transformations or complex mappings required), you can use the BULK INSERT command:

CREATE PROC dbo.uspImportTextFile
AS
BEGIN
    -- Load the pipe-delimited file straight into the target table, skipping the header row
    BULK INSERT Tablename
    FROM 'C:\ImportFile.txt'
    WITH (FIELDTERMINATOR = '|', FIRSTROW = 2);
END
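Once the procedure exists (with Tablename and the file path replaced by your own), loading the file is a single call:

EXEC dbo.uspImportTextFile;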

http://msdn.microsoft.com/en-us/library/ms188365.aspx

OTHER TIPS

I would recommend looking at using SSIS. It's designed to do this sort of thing (especially if you need to do it on a regular basis).

Here is a good link that goes over reading a text file and inserting into the DB.

The most efficient way of inserting many records into a table is to use BULK INSERT (I believe that this is what the BCP Utility uses, and so it should be just as fast).

BULK INSERT is optimised for inserting large quantities of data and is intended to be used when the performance of a simple INSERT statement simply won't do.
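As an aside, the bcp utility mentioned above can do the same job from the command line. A rough equivalent of the BULK INSERT example, using placeholder server, database, and table names and a trusted connection, might look like:

bcp YourDatabase.dbo.Tablename in "C:\ImportFile.txt" -S YourServer -T -c -t "|" -F 2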

If BULK INSERT isn't what you are after, then you might want to take a look at the following article for a more straightforward technique:

Linked in the article is a stored procedure, uftReadFileAsTable, which seems versatile enough to achieve what you are after.

If it isn't, you can at least use the stored procedure as an example of how to read files in SQL (it uses OLE Automation with Scripting.FileSystemObject).
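For reference, that OLE approach looks roughly like the sketch below. It assumes the 'Ole Automation Procedures' server option is enabled, and the file path and staging table (dbo.ImportStaging) are illustrative only:

DECLARE @fso INT, @file INT, @line VARCHAR(8000), @eof INT;

-- Create a FileSystemObject and open the file for reading (1 = ForReading)
EXEC sp_OACreate 'Scripting.FileSystemObject', @fso OUT;
EXEC sp_OAMethod @fso, 'OpenTextFile', @file OUT, 'C:\ImportFile.txt', 1;

-- Read the file line by line and insert each line into a staging table
EXEC sp_OAGetProperty @file, 'AtEndOfStream', @eof OUT;
WHILE @eof = 0
BEGIN
    EXEC sp_OAMethod @file, 'ReadLine', @line OUT;
    INSERT INTO dbo.ImportStaging (LineText) VALUES (@line);
    EXEC sp_OAGetProperty @file, 'AtEndOfStream', @eof OUT;
END

-- Release the COM objects
EXEC sp_OADestroy @file;
EXEC sp_OADestroy @fso;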

Why don't you try CLR user-defined functions? That way you can use .NET to access and handle your file.

Check out this post
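For illustration, the T-SQL side of the CLR approach could look something like the sketch below. The assembly name, path, class, method, and staging table are all hypothetical; the actual file-reading logic would live in the .NET assembly:

-- Allow CLR code to run on the server
EXEC sp_configure 'clr enabled', 1;
RECONFIGURE;
GO

-- Register a hypothetical .NET assembly that reads files
CREATE ASSEMBLY FileReaderAssembly
FROM 'C:\Assemblies\FileReader.dll'
WITH PERMISSION_SET = EXTERNAL_ACCESS;
GO

-- Expose its (hypothetical) ReadLines method as a table-valued function
CREATE FUNCTION dbo.ufnReadFileLines (@path NVARCHAR(260))
RETURNS TABLE (LineText NVARCHAR(MAX))
AS EXTERNAL NAME FileReaderAssembly.[FileReader].ReadLines;
GO

-- The file contents can then be queried and inserted like any other rowset
INSERT INTO dbo.ImportStaging (LineText)
SELECT LineText FROM dbo.ufnReadFileLines('C:\ImportFile.txt');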

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow