I'm importing IIS log files into a SQL database, but for some reason not every entry makes it into the database. For instance, if the log file has 1,000,000 entries, LogParser says it processed 1,000,000 entries, but if I do a SELECT COUNT(*) in the database I don't have 1,000,000 rows. Has anyone seen this issue before?



Is the data being truncated on input, so that the offending row is discarded?

How you are importing (bcp, BULK INSERT, SSIS) determines whether it is a true bulk load or a row-by-row insert. The error-handling settings (ignore, break, max error count, etc.) determine what happens when a row is truncated.
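The same applies to LogParser itself: its global `-e` option sets how many parse errors are tolerated before it aborts, so silently skipped rows can go unnoticed unless you tighten it. A sketch of a stricter invocation (server, database, and table names are placeholders; flag behavior is from LogParser 2.2, so verify against `logparser -h` and `logparser -h -o:SQL` for your version):

```shell
rem Hypothetical invocation: abort on the first parse error (-e:0) instead
rem of skipping bad rows, wrap the load in one transaction so a failure
rem rolls back rather than leaving a partial import, and print statistics
rem so the processed/inserted counts can be compared.
logparser -i:IISW3C -o:SQL ^
  "SELECT * INTO iislog FROM ex*.log" ^
  -server:MYSERVER -database:WebLogs -createTable:ON ^
  -transactionRowCount:-1 -e:0 -stats:ON
```

Comparing the "Elements processed" and "Elements output" figures that `-stats:ON` prints is usually the quickest way to see whether rows are being dropped at parse time or at insert time.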

Maybe it's counting the header rows (of which there are normally more than just the ones at the top of the file) in the first count, and those then get dropped during the import to the DB? Just a guess.
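One way to check that guess: W3C-format IIS logs repeat their `#`-prefixed header block every time the log rolls over or the worker process recycles, so a raw line count overstates the number of data rows. A minimal sketch with a made-up log file (file name and contents are illustrative):

```shell
# Hypothetical sample: a W3C log whose header block repeats mid-file,
# as happens when IIS recycles or the log rolls over.
cat > sample.log <<'EOF'
#Software: Microsoft Internet Information Services 7.5
#Version: 1.0
#Date: 2012-01-01 00:00:00
#Fields: date time cs-uri-stem sc-status
2012-01-01 00:00:01 /index.htm 200
2012-01-01 00:00:02 /about.htm 200
#Date: 2012-01-01 06:00:00
#Fields: date time cs-uri-stem sc-status
2012-01-01 06:00:01 /index.htm 304
EOF

total=$(wc -l < sample.log)        # every line, header lines included
data=$(grep -cv '^#' sample.log)   # data rows only (what should reach the DB)
echo "total=$total data=$data"     # -> total=9 data=3
```

If the `data` figure matches your `SELECT COUNT(*)`, nothing is being lost; the discrepancy was just the header lines.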

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow