Question

I am facing a major problem in my application. While reading a big Excel file with more than nine hundred thousand records through the ACE OLEDB 12.0 provider and filling a DataSet with the returned data, after some time the system throws an OutOfMemoryException.

I have tried hard to find a solution, but so far without any luck.

Please help if anyone has an idea about this issue.

Code :

    MyConn = New OleDbConnection("Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" & FileSource & ";Extended Properties=""Excel 12.0;IMEX=1""")
    MyConn.Open()
    'get the table schema information to retrieve the sheet names
    Dim schemaTable As DataTable = MyConn.GetOleDbSchemaTable(OleDbSchemaGuid.Tables, New Object() {Nothing, Nothing, Nothing, "TABLE"})
    For Each dtrow As DataRow In schemaTable.Rows
        Dim iUploadStatus As Integer = 0
        sheetName = String.Empty
        sheet = dtrow("TABLE_NAME").ToString()
        'skip the duplicate sheet entries that the schema table returns for filtered ranges
        If sheet.Contains("_xlnm#_FilterDatabase") Or sheet.Contains("_FilterDatabase") Then
            Continue For
        End If
        MyCommand = New OleDbDataAdapter("select * from [" + sheet + "]", MyConn)
        MyCommand.TableMappings.Add("Table", "TestTable")
        DtSet = New DataSet
        MyCommand.Fill(DtSet)

        Using destinationConnection As New SqlConnection(Conn)
            ' open the connection
            destinationConnection.Open()
            Using bulkCopy As New SqlBulkCopy(destinationConnection)
                ' column mappings
                bulkCopy.ColumnMappings.Add(P1, ColProdNum)
                bulkCopy.ColumnMappings.Add(P2, ColProdDesc)
                bulkCopy.ColumnMappings.Add(P3, ColListPrice)
                bulkCopy.ColumnMappings.Add(P4, ColNetPrice)
                bulkCopy.BatchSize = 2000
                bulkCopy.NotifyAfter = 2000
                bulkCopy.DestinationTableName = "tabDestination"
                bulkCopy.WriteToServer(DtSet.Tables(0))
            End Using
        End Using
    Next

Solution

The corrections I made are below:

  1. I changed the reading process so data moves from Excel into a DataTable in chunks (of 100,000 records each).
  2. For each chunk read, I used SqlBulkCopy to push the data to the destination table.
  3. After each chunk of 100,000 records was uploaded, I cleared the rows of the DataTable object.
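The three steps above can be sketched as follows. This is only an outline of the chunked approach, not the exact code used: the connection strings, the sheet name `Sheet1$`, and the destination table `tabDestination` are placeholder assumptions.

```vb
Imports System.Data
Imports System.Data.OleDb
Imports System.Data.SqlClient

Module ChunkedUpload
    Sub UploadInChunks(excelConnString As String, sqlConnString As String)
        Const chunkSize As Integer = 100000
        Using srcConn As New OleDbConnection(excelConnString)
            srcConn.Open()
            Using cmd As New OleDbCommand("SELECT * FROM [Sheet1$]", srcConn)
                Using reader As OleDbDataReader = cmd.ExecuteReader()
                    ' Build a buffer table matching the source columns.
                    Dim buffer As New DataTable()
                    For i As Integer = 0 To reader.FieldCount - 1
                        buffer.Columns.Add(reader.GetName(i), reader.GetFieldType(i))
                    Next
                    Using destConn As New SqlConnection(sqlConnString)
                        destConn.Open()
                        Using bulkCopy As New SqlBulkCopy(destConn)
                            bulkCopy.DestinationTableName = "tabDestination"
                            While reader.Read()
                                Dim values(reader.FieldCount - 1) As Object
                                reader.GetValues(values)
                                buffer.Rows.Add(values)
                                ' Step 2: flush each full chunk to SQL Server.
                                If buffer.Rows.Count >= chunkSize Then
                                    bulkCopy.WriteToServer(buffer)
                                    ' Step 3: clear the rows so the DataTable
                                    ' never holds more than one chunk in memory.
                                    buffer.Rows.Clear()
                                End If
                            End While
                            ' Flush the final partial chunk, if any.
                            If buffer.Rows.Count > 0 Then
                                bulkCopy.WriteToServer(buffer)
                            End If
                        End Using
                    End Using
                End Using
            End Using
        End Using
    End Sub
End Module
```

Reading through an OleDbDataReader instead of `OleDbDataAdapter.Fill` is what makes the chunking possible: only `chunkSize` rows are materialized at a time, so memory usage stays flat regardless of how many rows the sheet contains.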

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow