Question

Update: Looks like the query does not throw any timeout. The connection is timing out.

This is sample code for executing a query. Sometimes, while executing time-consuming queries, it throws a timeout exception.

I cannot use either of these techniques: 1) increasing the timeout, or 2) running it asynchronously with a callback. This needs to run synchronously.

Please suggest any other techniques to keep the connection alive while executing a time-consuming query.

private static void CreateCommand(string queryString,
    string connectionString)
{
    using (SqlConnection connection = new SqlConnection(
               connectionString))
    {
        SqlCommand command = new SqlCommand(queryString, connection);
        command.Connection.Open();
        command.ExecuteNonQuery();
    }
}

Solution

Since you are using ExecuteNonQuery, which does not return any rows, you can try this polling-based approach. It executes the query asynchronously (without a callback), but the application waits inside a while loop until the query completes. The example below is adapted from MSDN. This should solve the timeout problem. Please try it out.

But, I agree with others that you should think more about optimizing the query to perform under 30 seconds.

        // Note: with .NET 2.0/3.5, the connection string must include
        // "Asynchronous Processing=true" for Begin/EndExecuteNonQuery to work.
        // CommandTimeout is ignored during asynchronous execution, so the
        // command will not time out no matter how long the query runs.
        IAsyncResult result = command.BeginExecuteNonQuery();

        int count = 0;
        while (!result.IsCompleted)
        {
            Console.WriteLine("Waiting ({0})", count++);
            System.Threading.Thread.Sleep(1000);
        }
        Console.WriteLine("Command complete. Affected {0} rows.",
            command.EndExecuteNonQuery(result));

OTHER TIPS

You should first check your query to see whether it is optimized and isn't somehow running against missing indexes. Thirty seconds is a lot for most queries, even on large databases, if they are properly tuned. If you have solid proof from the query plan that the query can't execute any faster than that, then you should increase the timeout. There is no other way to keep the connection alive: terminating the connection when the query doesn't complete within the time frame is precisely what the timeout is for.

I have to agree with Terrapin.

You have a few options on how to get your time down. First, if your company employs DBAs, I'd recommend asking them for suggestions.

If that's not an option, or if you want to try some other things first here are your three major options:

  1. Break up the query into components that run under the timeout. This is probably the easiest.
  2. Change the query to optimize the access path through the database (generally: hitting an index as closely as you can)
  3. Change or add indexes to improve your query's access path.

If you are constrained from using the default approach of changing the timeout value, you will most likely have to do a lot more work. The following options come to mind:

  1. Validate with your DBAs and through another code review that you have truly optimized the query as best as possible.
  2. Work on the underlying DB structure to see if there is any gain to be had on the DB side, e.g. by creating or modifying an index (or several).
  3. Divide the work into multiple parts, even if this means running procedures with multiple return parameters that simply call one another. (This option is not elegant, and honestly, if your code REALLY is going to take this much time, I would go to management and re-discuss the 30-second timeout.)
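
As a sketch of dividing the work up, a long-running statement such as a mass delete can often be split into batches that each finish well under the timeout. The table and column names below are hypothetical, and `DELETE TOP (n)` requires SQL Server 2005 or later:

```csharp
using System;
using System.Data.SqlClient;

static class BatchedDelete
{
    // Hypothetical sketch: delete in batches of 10,000 rows so that each
    // ExecuteNonQuery call stays well under the 30-second timeout.
    public static void DeleteInBatches(string connectionString)
    {
        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            connection.Open();
            int affected;
            do
            {
                // TOP (10000) limits the work done per round trip;
                // OldRecords/CreatedOn are placeholder names.
                SqlCommand command = new SqlCommand(
                    "DELETE TOP (10000) FROM OldRecords WHERE CreatedOn < @cutoff",
                    connection);
                command.Parameters.AddWithValue("@cutoff", DateTime.Today.AddYears(-1));
                affected = command.ExecuteNonQuery();
            } while (affected > 0);   // stop once a batch deletes nothing
        }
    }
}
```

Each iteration is a separate, short command, so no single call ever approaches the timeout, and locks are held for much shorter periods.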

We recently had a similar issue on a SQL Server 2000 database.

While your query is running, run this query against the master database on the DB server and see if there are any locks you should troubleshoot:

select 
  spid,
  db_name(sp.dbid) as DBname,
  blocked as BlockedBy,
  waittime as WaitInMs,
  lastwaittype,
  waitresource,
  cpu,
  physical_io,
  memusage,
  loginame,
  login_time,
  last_batch,
  hostname,
  sql_handle
from sysprocesses sp
where (waittype > 0 and spid > 49) or spid in (select blocked from sysprocesses where blocked > 0)

SQL Server Management Studio 2008 also contains a very cool activity monitor which lets you see the health of your database during your query.

In our case, it was a NETWORKIO wait that kept the database busy: some legacy VB code didn't close its result set quickly enough.

If you are prohibited from using the features of the data access API to allow a query to last more than 30 seconds, then we need to see the SQL.

The performance gains to be made by optimizing the use of ADO.NET are slight in comparison to the gains of optimizing the SQL.

And you are already using the most efficient method of executing SQL. Other techniques would be mind-numbingly slower (although, if you did a quick retrieval of your rows and some really slow client-side processing using DataSets, you might be able to get the initial retrieval down to less than 30 seconds, but I doubt it).

If we knew you were doing inserts, then maybe you should be using bulk insert. But we don't know the content of your SQL.
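
If it does turn out to be a large insert, a minimal sketch using ADO.NET's SqlBulkCopy might look like this ("TargetTable" and the DataTable contents are placeholders):

```csharp
using System.Data;
using System.Data.SqlClient;

static class BulkLoader
{
    // Sketch only: for large INSERTs, SqlBulkCopy is usually far faster
    // than issuing row-by-row commands.
    public static void BulkInsert(string connectionString, DataTable rows)
    {
        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            connection.Open();
            using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
            {
                bulkCopy.DestinationTableName = "TargetTable";
                bulkCopy.BulkCopyTimeout = 0;   // 0 = no time limit for the copy
                bulkCopy.WriteToServer(rows);   // streams all rows to the server
            }
        }
    }
}
```

Note that BulkCopyTimeout defaults to 30 seconds just like CommandTimeout, so it needs raising (or zeroing) as well for big loads.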

This is an UGLY hack, but it might help work around the problem temporarily until you can fix the real issue:

    private static void CreateCommand(string queryString, string connectionString)
    {
        const int maxRetries = 3;
        int retries = 0;
        while (true)
        {
            try
            {
                using (SqlConnection connection = new SqlConnection(connectionString))
                {
                    SqlCommand command = new SqlCommand(queryString, connection);
                    command.Connection.Open();
                    command.ExecuteNonQuery();
                }
                break;
            }
            catch (SqlException se)
            {
                if (se.Number != -2) // -2 is ADO.NET's error code for a timeout
                    throw; // not a timeout

                if (retries >= maxRetries)
                    throw new Exception(String.Format("Timed out {0} times", retries), se);

                // or break here to swallow the error

                retries++;
            }
        }
    }
command.CommandTimeout *= 2;

That will double the default time-out, which is 30 seconds.

Or, put the value for CommandTimeout in a configuration file, so you can adjust it as needed without recompiling.
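
A minimal sketch of the configuration-file approach, assuming an appSettings key named "CommandTimeoutSeconds" (the key name is hypothetical):

```csharp
using System.Configuration;   // requires a reference to System.Configuration
using System.Data.SqlClient;

static class CommandFactory
{
    // Sketch: read the timeout from App.config so it can be tuned without
    // recompiling. Assumes an entry like:
    //   <add key="CommandTimeoutSeconds" value="120" />
    public static SqlCommand CreateWithConfiguredTimeout(
        string queryString, SqlConnection connection)
    {
        SqlCommand command = new SqlCommand(queryString, connection);
        command.CommandTimeout = int.Parse(
            ConfigurationManager.AppSettings["CommandTimeoutSeconds"]);
        return command;
    }
}
```

That way an operations person can raise the value in production without a new build.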

You should break your query up into multiple chunks that each execute within the timeout period.

If you absolutely cannot increase the timeout, your only option is to reduce the time of the query to execute within the default 30 second timeout.

I tend to dislike increasing the connection/command timeout, since to my mind that treats the symptom, not the problem.

Have you thought about breaking the query down into several smaller chunks?

Also, have you run your query through the Database Engine Tuning Advisor, in:

Management Studio > Tools > Database Engine Tuning Advisor

Lastly, could we get a look at the query itself?

cheers

Have you tried wrapping your SQL inside a stored procedure? They seem to have better memory management. I have seen timeouts like this before in plain SQL statements with internal queries using classic ADO, i.e. select * from (select ....) t inner join somethingTable, where the internal query was returning a very large number of results.

Other tips:

  1. Perform reads with the with(nolock) table hint. It's dirty and I don't recommend it, but it will tend to be faster.
  2. Look at the execution plan of the SQL you're trying to run; reduce the row scanning and reconsider the order in which you join tables.
  3. Look at adding some indexes to your tables for faster reads.
  4. I've also found that deleting rows is very expensive; you could try limiting the number of rows per call.
  5. Swapping @table variables for #temporary tables has also worked for me in the past.
  6. You may also have a saved bad execution plan (heard of it, never seen it).

Hope this helps

Update: Looks like the query does not throw any timeout. The connection is timing out.

In other words, even if you don't execute a query, the connection times out? There are two time-outs: connection and query. Everybody seems to focus on the query, but if you get connection timeouts, it's a network problem and has nothing to do with the query: the connection first has to be established before a query can be run, obviously.

It might be worth trying paging the results back.
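
For example, keyset ("seek") paging works even on older SQL Server versions and keeps every individual query small and fast. The table and column names below are placeholders:

```csharp
using System.Data.SqlClient;

static class PagedReader
{
    // Hypothetical sketch of keyset paging: each SELECT fetches at most
    // 1000 rows past the last key seen, so no single command comes close
    // to the 30-second timeout. "Orders"/"OrderId" are placeholder names.
    public static void ReadInPages(string connectionString)
    {
        int lastId = 0;
        while (true)
        {
            using (SqlConnection connection = new SqlConnection(connectionString))
            {
                connection.Open();
                SqlCommand command = new SqlCommand(
                    "SELECT TOP 1000 OrderId, Total FROM Orders " +
                    "WHERE OrderId > @lastId ORDER BY OrderId",
                    connection);
                command.Parameters.AddWithValue("@lastId", lastId);
                using (SqlDataReader reader = command.ExecuteReader())
                {
                    bool any = false;
                    while (reader.Read())
                    {
                        any = true;
                        lastId = reader.GetInt32(0);
                        // process the row here
                    }
                    if (!any) return;   // no more pages
                }
            }
        }
    }
}
```

Seeking on the key (`WHERE OrderId > @lastId`) rather than using a row offset keeps each page equally cheap no matter how deep into the result set you are.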

Just set the SqlCommand's CommandTimeout property to 0; this will cause the command to wait indefinitely until the query finishes. E.g.:

SqlCommand cmd = new SqlCommand(spName, conn);
cmd.CommandType = CommandType.StoredProcedure;
cmd.CommandTimeout = 0; // 0 = wait indefinitely
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow