Question

I am working on application optimization. I have optimized my query from 17+ seconds down to just 5 ms (total execution time = ~350 ms), but iterating over the SqlDataReader for 60k records takes about 900 ms. Is there any way I can reduce that 900 ms?

Here is the sample code I used...

using System;
using System.Data;
using System.Data.SqlClient;
using System.Diagnostics;

using (SqlConnection conn = new SqlConnection("data source=XXX;initial catalog=XXX;integrated security=True"))
using (SqlCommand cmd = new SqlCommand("uspOptimizedSP", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    conn.Open();

    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        int i = 0;
        Stopwatch sw = Stopwatch.StartNew();
        while (reader.Read())
        {
            ++i; // count rows only; no column values are touched
        }
        sw.Stop();
        Console.WriteLine(sw.ElapsedMilliseconds); // This results in ~900 ms
    }
}

The elapsed time varies between 800 ms and 950 ms across runs.

Other details

  • Data Set Size = 714 MB for 60k records.
  • I have used AutoMapper, but it takes 3+ seconds to convert the data from IDataReader to my POCO classes (see the sketch after this list).
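A likely fix for the AutoMapper cost is manual mapping by ordinal, which avoids per-row reflection. The sketch below is illustrative only: the Record class and its Id/Name columns are hypothetical, since the post does not show the real schema.

using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

// Hypothetical POCO; the real column list is not shown in the post.
public class Record
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class RecordMapper
{
    // Manual mapping by ordinal avoids AutoMapper's per-row reflection cost.
    public static List<Record> MapAll(SqlDataReader reader)
    {
        var results = new List<Record>(60000);        // pre-size for ~60k rows
        int idOrdinal = reader.GetOrdinal("Id");      // look up ordinals once,
        int nameOrdinal = reader.GetOrdinal("Name");  // not once per row

        while (reader.Read())
        {
            results.Add(new Record
            {
                Id = reader.GetInt32(idOrdinal),
                Name = reader.IsDBNull(nameOrdinal) ? null : reader.GetString(nameOrdinal)
            });
        }
        return results;
    }
}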

Solution

The SqlDataReader reads the data as a stream, retrieving rows from the database as they become available from the executing query. That is probably why it takes time: since you are copying 714 MB of data, roughly 12 KB per row, the cost is dominated by moving and materializing that data, not by the loop itself. You cannot compare it to iterating over an in-memory collection.
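If the stored procedure returns large columns that are not needed on every row, returning fewer or narrower columns is the most direct way to shrink that 714 MB. When a large column must be consumed, CommandBehavior.SequentialAccess lets SqlDataReader stream it in chunks rather than buffering each full row. The sketch below assumes a hypothetical result shape (an int Id at ordinal 0 and a large varbinary Payload at ordinal 1); adjust to the real schema.

using System;
using System.Data;
using System.Data.SqlClient;

using (SqlConnection conn = new SqlConnection("data source=XXX;initial catalog=XXX;integrated security=True"))
using (SqlCommand cmd = new SqlCommand("uspOptimizedSP", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    conn.Open();

    // SequentialAccess streams each row instead of buffering it whole,
    // which reduces memory pressure for wide (multi-KB) rows.
    using (SqlDataReader reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
    {
        byte[] buffer = new byte[8192];
        while (reader.Read())
        {
            int id = reader.GetInt32(0); // with SequentialAccess, read columns in ordinal order

            long offset = 0, read;
            while ((read = reader.GetBytes(1, offset, buffer, 0, buffer.Length)) > 0)
            {
                offset += read;          // process each chunk of the large column here
            }
        }
    }
}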

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow