I have very big CSV files that contain a bank's transaction log. Valid transaction IDs are enclosed in <>. The format looks like this:
012,10/1/2013,<13273288163624>JOHN doe .,<01-10-2013>20130930,13273288163624,10/1/2013,"10,580.00",
012,10/1/2013,SI TRANSFER,10/1/2013,"11,330,114.07",
012,10/1/2013,<555395056216>JOHN DOE1,<01-10-2013>,555395056216,10/1/2013,"46,852.00",
012,10/1/2013,<13273708949197>JOHN DOE3 -,<01-10-2013>20130930,13273708949197,10/1/2013,"57,687.00",
The useful property is that any line which does not contain '<' and '>' should be ignored. Is there a way to drop those lines so that the CSV contents look like this?
012,10/1/2013,<13273288163624>JOHN doe .,<01-10-2013>20130930,13273288163624,10/1/2013,"10,580.00",
012,10/1/2013,<555395056216>JOHN DOE1,<01-10-2013>,555395056216,10/1/2013,"46,852.00",
012,10/1/2013,<13273708949197>JOHN DOE3 -,<01-10-2013>20130930,13273708949197,10/1/2013,"57,687.00",
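One way to produce that filtered file is a simple streaming pass that keeps only lines containing both markers, reading one line at a time so memory use stays constant even for files over 100 MB. A minimal sketch in Python (the output file name `2_clean.csv` is just an example):

```python
def filter_valid_rows(src_path, dst_path):
    """Copy only lines containing a <...> transaction id marker.

    Streams the input line by line, so the whole file is never
    loaded into memory.
    """
    with open(src_path, "r", encoding="utf-8") as src, \
         open(dst_path, "w", encoding="utf-8") as dst:
        for line in src:
            # keep only lines with both '<' and '>' (valid transactions)
            if "<" in line and ">" in line:
                dst.write(line)

# Usage (d:\2.csv is the original file; d:\2_clean.csv is hypothetical):
# filter_valid_rows(r"d:\2.csv", r"d:\2_clean.csv")
```

The bulk insert can then point at the cleaned file instead of the original.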
I want to load the CSV into MSSQL with a bulk insert like this:
BULK
INSERT nibl
FROM 'd:\2.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
FIRSTROW=2
)
GO
Invalid transactions (the lines without < and >) have fewer columns, which causes the bulk insert to fail.
I don't want to loop over the file with fgetcsv, because the CSV can be larger than 100 MB.