Question

I have seen this show up in several places in code, never with an explanation, just a cryptic comment above it (declaration and execution included for context; it's just a standard procedure of running a SqlCommand):

//SqlCommand cmd = new SqlCommand();
//cmd.ExecuteReader();
//Read off the results

//Cancel the command. This improves query time.
cmd.Cancel ();

Basically, after finishing a query, it goes back and cancels it, claiming some performance boost. I suppose you might get some memory back when it frees up the data reader, but usually that's about to go out of scope anyway.

I've never bothered with it before, but it's finally shown up in some code I'm reviewing. Does canceling a SqlCommand after running it actually speed things up somehow, or is this just some weird programmer superstition?

Solution

According to MSDN, this is correct.

The Close method fills in the values for output parameters, return values and RecordsAffected, increasing the time that it takes to close a SqlDataReader that was used to process a large or complex query. When the return values and the number of records affected by a query are not significant, the time that it takes to close the SqlDataReader can be reduced by calling the Cancel method of the associated SqlCommand object before calling the Close method.

Weird!
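
To see what that looks like in code, here's a minimal sketch of the pattern the documentation describes. The stored procedure dbo.GetOrders, its @TotalCount output parameter, and the connection string are hypothetical stand-ins:

using System.Data;
using System.Data.SqlClient;

// Hypothetical connection string; adjust for your environment.
var connectionString = "Server=.;Database=MyDb;Integrated Security=true";

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("dbo.GetOrders", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.Add("@TotalCount", SqlDbType.Int).Direction = ParameterDirection.Output;

    conn.Open();
    using (var rdr = cmd.ExecuteReader())
    {
        while (rdr.Read())
        {
            // Read off the results...
        }

        // We don't need @TotalCount or RecordsAffected here, so cancel the
        // command before the reader is closed/disposed to skip that extra work.
        cmd.Cancel();
    }
}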

OTHER TIPS

Calling Cancel gives a potentially MASSIVE performance improvement if your call to ExecuteReader returns a large number of rows, and you don't read all the rows.

To illustrate, let's say a query returns a million rows, and you close the reader after reading just the first 1,000 rows. If you fail to call Cancel before closing the reader, the Close method will block while it internally enumerates through the remaining 999,000 rows.
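
Here's a hedged sketch of that scenario (dbo.BigTable and the connection string are hypothetical):

using System.Data.SqlClient;

using (var conn = new SqlConnection("Server=.;Database=MyDb;Integrated Security=true"))
using (var cmd = new SqlCommand("SELECT * FROM dbo.BigTable", conn))
{
    conn.Open();
    using (var rdr = cmd.ExecuteReader())
    {
        var rowsRead = 0;
        while (rowsRead < 1000 && rdr.Read())
        {
            // Process only the first 1,000 rows...
            rowsRead++;
        }

        // Tell SQL Server to stop sending rows. Without this, Close/Dispose
        // drains the remaining rows off the wire before returning.
        cmd.Cancel();
    }
}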

Try it and see!

Our tech team at Cinchcast did some benchmarking, and we found that adding cmd.Cancel() actually slows things down.

We have a DALC call that gets a list of episodes for a host. We ran it 1,000 times and measured the average response time for returning 10 episodes.

Returning 10 episodes per call:
Average with Cancel: 0.069s
Average without Cancel: 0.026s

So it is significantly slower with Cancel when returning 10 episodes.

So I tried again, returning 100 episodes, to see if a larger result set makes a difference.

Returning 100 episodes per call:
Average with Cancel: 0.132s
Average without Cancel: 0.122s

This time the difference was much smaller, but for our usual use cases it is still faster without the Cancel.
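
If you want to reproduce this kind of measurement, here's a rough sketch of a timing harness; the connection string, table, and query are hypothetical stand-ins for our actual DALC call:

using System;
using System.Data.SqlClient;
using System.Diagnostics;

// Hypothetical stand-in for the DALC call: run the query, read the rows,
// and optionally cancel the command before the reader is disposed.
static void RunQuery(bool cancelBeforeClose)
{
    using (var conn = new SqlConnection("Server=.;Database=MyDb;Integrated Security=true"))
    using (var cmd = new SqlCommand("SELECT TOP 10 * FROM dbo.Episodes WHERE HostId = @HostId", conn))
    {
        cmd.Parameters.AddWithValue("@HostId", 42);
        conn.Open();
        using (var rdr = cmd.ExecuteReader())
        {
            while (rdr.Read())
            {
                // Read off the results...
            }

            if (cancelBeforeClose)
                cmd.Cancel();
        }
    }
}

static double AverageSeconds(bool cancelBeforeClose, int iterations = 1000)
{
    var sw = Stopwatch.StartNew();
    for (var i = 0; i < iterations; i++)
        RunQuery(cancelBeforeClose);
    sw.Stop();
    return sw.Elapsed.TotalSeconds / iterations;
}

Console.WriteLine($"Average with Cancel:    {AverageSeconds(true):F3}s");
Console.WriteLine($"Average without Cancel: {AverageSeconds(false):F3}s");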

In your example, you open the reader, read all the rows, and then cancel the command, but you didn't show where the reader gets closed.

Make sure the Cancel happens before the Dispose/Close. For example, you wouldn't get any performance boost here (real production code, unfortunately):

using (var rdr = cmd.ExecuteReader (CommandBehavior.Default))
{
   retval = DocumentDir.DBRead (rdr);
}

// Optimization.  Allows reader to close more quickly.... NOT!
cmd.Cancel ();  // bad!

Too bad the reader is already closed by the using statement!

This is how it should read to realize the potential benefit:

using (var rdr = cmd.ExecuteReader (CommandBehavior.Default))
{
   retval = DocumentDir.DBRead (rdr);

   // Optimization.  Allows reader to close more quickly.
   cmd.Cancel ();
}

From MSDN SqlCommand.Cancel:

In some rare cases, if you call ExecuteReader, then call Close (implicitly or explicitly) before calling Cancel, and then call Cancel, the cancel command will not be sent to SQL Server and the result set can continue to stream after you call Close. To avoid this, make sure that you call Cancel before closing the reader or connection.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow