Question

I have a CSV file of about 80,000 rows that I am trying to process in a C# application, making the necessary changes to my SQL database. Reading the CSV into the application doesn't take long; I then process it from there.

For each record from the CSV, I either insert a new record or update an existing one in the database, based on a condition (from a SQL SELECT statement). I am just using C# inline SQL commands to do my selects, updates, and inserts. This process takes about 40 minutes to complete.

I was wondering if there are any best practices for large SQL database updates and insertions that will be able to make the process quicker.

Thanks in advance!

Solution

OTHER TIPS

Use BEGIN TRANSACTION (before the work) and COMMIT (after all the work). If you don't explicitly start a transaction, SQL Server implicitly starts and commits a transaction for each of your commands, which slows the process down considerably: 80,000 rows mean 80,000 separate transactions, when all the commands could run in a single one.
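A minimal sketch of that idea, assuming SQL Server via ADO.NET; the connection string, `MyTable`, and the `Id`/`Value` columns are placeholders, not details from the question:

```csharp
using System.Data.SqlClient;

using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    // One explicit transaction around the whole CSV loop,
    // instead of one implicit transaction per command.
    using (SqlTransaction transaction = connection.BeginTransaction())
    {
        foreach (var row in csvRows) // csvRows: the parsed CSV records
        {
            // Try the update first; fall back to an insert if no row matched.
            using (var update = new SqlCommand(
                "UPDATE MyTable SET Value = @value WHERE Id = @id",
                connection, transaction))
            {
                update.Parameters.AddWithValue("@id", row.Id);
                update.Parameters.AddWithValue("@value", row.Value);
                if (update.ExecuteNonQuery() == 0)
                {
                    using (var insert = new SqlCommand(
                        "INSERT INTO MyTable (Id, Value) VALUES (@id, @value)",
                        connection, transaction))
                    {
                        insert.Parameters.AddWithValue("@id", row.Id);
                        insert.Parameters.AddWithValue("@value", row.Value);
                        insert.ExecuteNonQuery();
                    }
                }
            }
        }
        transaction.Commit(); // one commit for all 80,000 rows
    }
}
```

Note that each command must be associated with the transaction (here via the `SqlCommand` constructor), and that if an exception occurs before `Commit()`, disposing the transaction rolls everything back.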

You should try using a BackgroundWorker... It helped me when I had to run large queries... It won't make the SQL itself faster, but it keeps the UI responsive while the work runs.

How to use BackgroundWorker
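A minimal sketch of the pattern in a WinForms context; `ProcessCsvRows` and `statusLabel` are hypothetical names standing in for the question's processing loop and a UI control:

```csharp
using System.ComponentModel;

var worker = new BackgroundWorker();

worker.DoWork += (sender, e) =>
{
    // Runs on a thread-pool thread, off the UI thread.
    ProcessCsvRows(); // the long-running CSV/SQL loop
};

worker.RunWorkerCompleted += (sender, e) =>
{
    // Raised back on the UI thread; safe to touch controls here.
    statusLabel.Text = e.Error == null ? "Import finished." : e.Error.Message;
};

worker.RunWorkerAsync();
```

Do not access UI controls from inside `DoWork`; report progress through `ReportProgress`/`ProgressChanged` (with `WorkerReportsProgress = true`) if the UI needs updates along the way.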

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow