Question

I'm having trouble uploading large Excel files (300 MB+) using a data reader. With this code I open the Excel file and load each row separately. Using breakpoints I noticed that one statement takes over 30 seconds, and memory usage rises steadily. Specifying the CommandBehavior parameter (e.g. SequentialAccess) of the ExecuteReader() method has no effect.

What am I doing wrong here? Are there alternative ways of processing large (Excel) files?

// Requires: using System.Data; using System.Data.OleDb;
const string inputFilePath = @"C:\largefile.xlsx";
const string connectionString = "Provider=Microsoft.ACE.OLEDB.12.0;Extended Properties=\"Excel 12.0;IMEX=1;HDR=YES;\";Data Source=" + inputFilePath;
using (var connection = new OleDbConnection(connectionString))
{
    connection.Open();
    using (var command = new OleDbCommand("largesheet$", connection) { CommandType = CommandType.TableDirect })
    using (var reader = command.ExecuteReader()) // <-- Completely loads file/sheet into memory
    {
        // reader.Read() returns false once all rows are consumed; looping on
        // HasRows alone would never terminate.
        while (reader.Read())
        {
            // process the current row
        }
    }
}
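
For reference, the SequentialAccess attempt mentioned above looks roughly like this (a sketch reusing the command object from the snippet; CommandBehavior.SequentialAccess is the System.Data enum value):

// Hints the provider to stream column data, but as noted above it makes
// no measurable difference with the ACE OLE DB provider here.
using (var reader = command.ExecuteReader(CommandBehavior.SequentialAccess))
{
    while (reader.Read())
    {
        // same row-by-row processing as above
    }
}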

Solution

Can you try to load the file into memory with this:

byte[] fileBuffer = File.ReadAllBytes(inputFilePath); // read the whole file into a byte[] (System.IO)
Stream exportData = new MemoryStream(fileBuffer);
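
If buffering a 300 MB+ workbook is not an option, a streaming approach is worth mentioning alongside this answer: the DocumentFormat.OpenXml NuGet package provides a SAX-style OpenXmlReader that walks the sheet XML forward-only instead of materializing it. A minimal sketch, assuming the large sheet is the first worksheet part in the workbook:

using System;
using System.Linq;
using DocumentFormat.OpenXml;
using DocumentFormat.OpenXml.Packaging;
using DocumentFormat.OpenXml.Spreadsheet;

class StreamingExcelRead
{
    static void Main()
    {
        const string inputFilePath = @"C:\largefile.xlsx";
        using (var document = SpreadsheetDocument.Open(inputFilePath, false))
        {
            // Assumption: the large sheet is the first worksheet part.
            WorksheetPart worksheetPart = document.WorkbookPart.WorksheetParts.First();
            using (OpenXmlReader reader = OpenXmlReader.Create(worksheetPart))
            {
                long rowCount = 0;
                while (reader.Read())
                {
                    // Each Row start element is one spreadsheet row; process it
                    // here instead of materializing the whole sheet in memory.
                    if (reader.IsStartElement && reader.ElementType == typeof(Row))
                        rowCount++;
                }
                Console.WriteLine("Rows read: " + rowCount);
            }
        }
    }
}

Because the reader never builds the full DOM, memory stays roughly flat regardless of sheet size; the trade-off is that you have to resolve shared strings and cell references yourself.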