Question

I have a large insert job to perform, say 300,000 inserts.

If I do it the legacy way, I just build a SQL string containing blocks of 100 INSERT statements and run an ExecuteCommand against the database every 100 records.

That works out to roughly 100 inserts every 3 seconds.

Now of course there are issues with single quotes and CrLfs inside the inserted values. So rather than writing code to double the single quotes and so on, being lazy, I had a go with LINQ InsertOnSubmit and one context.SubmitChanges every 100 rows.
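
In code, that approach is roughly the following (a minimal sketch; MyDataContext, MyRecords, and MyRecord stand in for the real context, table, and entity):

using (MyDataContext dc = new MyDataContext())
{
    for (int n = 0; n < 300000; n++)
    {
        dc.MyRecords.InsertOnSubmit(new MyRecord { /* ... set fields ... */ });
        if (n % 100 == 99)
            dc.SubmitChanges();   // flush a batch of 100 rows
    }
    dc.SubmitChanges();           // flush any remaining rows
}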

And that takes some 20x longer than the legacy way!

Why?

Solution

You're not using the right tool for the job. LINQ to SQL and most other ORMs (at least Entity Framework and NHibernate) are designed for OLTP scenarios; they are not meant for bulk data operations. SubmitChanges sends one INSERT statement per row, and the change tracker has to track every entity, so bulk loads through an ORM are slow by design.

You should be using SqlBulkCopy.
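
For illustration, a minimal sketch of a SqlBulkCopy call (the connection string, table name, and column list here are placeholders; adapt them to your schema):

using System.Data;             // DataTable
using System.Data.SqlClient;   // SqlBulkCopy

// Build an in-memory DataTable whose columns match the destination table.
DataTable table = new DataTable();
table.Columns.Add("ProductName", typeof(string));
table.Columns.Add("UnitPrice", typeof(decimal));
table.Columns.Add("UnitsInStock", typeof(short));

for (int n = 0; n < 300000; n++)
{
    table.Rows.Add("Product " + n, 39.99m, (short)2);
}

using (SqlBulkCopy bulk = new SqlBulkCopy("<your connection string>"))
{
    bulk.DestinationTableName = "Products";
    bulk.BatchSize = 5000;   // rows sent per round trip to the server
    bulk.ColumnMappings.Add("ProductName", "ProductName");
    bulk.ColumnMappings.Add("UnitPrice", "UnitPrice");
    bulk.ColumnMappings.Add("UnitsInStock", "UnitsInStock");
    bulk.WriteToServer(table);   // streams the rows through the bulk-load API
}

Because the values travel through SQL Server's bulk-load interface as typed data rather than as SQL text, the single-quote and CrLf escaping problem disappears as well.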

OTHER TIPS

I had the same issues, with InsertOnSubmit() taking a long time.

However, using the DataTableHelper class (downloadable from the link below) and changing just one or two lines of your code, you can easily use a bulk insert instead.

Bulk-inserts

For example:

const int RECORDS_TO_INSERT = 5000;

List<Product> recordsToBeInserted = new List<Product>();
using (NorthwindDataContext dc = new NorthwindDataContext())
{
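    // Build the Product entities in memory; nothing is sent to the database yet.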
    for (int n = 0; n < RECORDS_TO_INSERT; n++)
    {
        Product newProduct = new Product()
        {
            ProductName = "Product " + n.ToString(),
            UnitPrice = 3999,
            UnitsInStock = 2,
            UnitsOnOrder = 0,
            Discontinued = false
        };
        recordsToBeInserted.Add(newProduct);
    }
    // Insert this List<> of records into the [Products] table in our database, using a Bulk Insert
    DataTableHelper.BulkCopyToDatabase(recordsToBeInserted, "Products", dc);
}
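
(Under the hood, a helper like this typically copies the List<> into a DataTable and hands it to SqlBulkCopy, so you get the same bulk-load speed-up while still writing LINQ-style entity code.)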

Hope this helps.

Licensed under: CC-BY-SA with attribution