Question

I have a pipe-delimited file that is too large to open in Excel. I'm trying to import this file into MSSQL using the import wizard in SSMS.

Normally when I do this, I open the file in Excel and use an array formula =MAX(LEN(An:Annnn)) to get the max length of each column. Then I use that to specify the size of each field in my table.

This file is too large to open in Excel, and SQL Server doesn't check all of the data before suggesting column sizes (I think the import wizard samples a crazy small number of rows, something like 200).

Does anyone have a solution to this? (I'm not opposed to doing something in Linux, especially if it's free.)

Thanks in advance for any help.


Solution

When I import text data into a database, typically I first read the data into a staging table where all the columns are long-enough character fields (say varchar(8000)).
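For a pipe-delimited file, the staging table is just one wide varchar column per field, and the file can be bulk loaded straight into it. A minimal sketch, where the table name, column names, file path, and header-row setting are all placeholders to adjust for the actual file:

create table StagingTable (
    Column1 varchar(8000),
    Column2 varchar(8000),
    Column3 varchar(8000)
    -- . . . one wide varchar column per field in the file
);

-- Load the raw file; fieldterminator = '|' handles the pipe delimiter.
bulk insert StagingTable
from 'C:\data\bigfile.txt'
with (
    fieldterminator = '|',
    rowterminator = '\n',
    firstrow = 2    -- skip the header row, if the file has one
);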

Then, I load from the staging table into the final table:

create table RealTable (
    RealTableId int identity(1, 1) primary key,
    Column1 int,
    Column2 datetime,
    Column3 varchar(12),
    . . .
);

insert into RealTable(<all columns but id>)
    select (case when column1 not like '%[^0-9]%' then cast(column1 as int) end),
           (case when isdate(column2) = 1 then cast(column2 as datetime) end),
           . . .
    from StagingTable;   -- the wide-varchar staging table described above

I find it much easier to debug type issues inside the database rather than when inserting into the database.
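Once the raw data is sitting in the staging table, the maximum length of each column (what the Excel array formula was computing) is also just a query away. A sketch against the placeholder staging table above:

select max(len(Column1)) as MaxLenColumn1,
       max(len(Column2)) as MaxLenColumn2,
       max(len(Column3)) as MaxLenColumn3
from StagingTable;

Note that len() ignores trailing spaces; use datalength() if a byte count including trailing spaces matters.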

Licensed under: CC-BY-SA with attribution