Question

One of our customers has just upgraded to a new server.

For a particular stored procedure, the first execution takes over three minutes. Subsequent runs complete in under a second.

This leads me to believe that the initial three minutes are primarily taken up calculating the execution plan. Subsequent runs then just use the cached plan and run instantaneously.
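One way to test that theory is to force a fresh compile on demand. A minimal sketch, assuming a placeholder procedure name dbo.MyProcedure (sp_recompile only marks that one object for recompilation, which is less drastic than flushing the whole plan cache with DBCC FREEPROCCACHE):

-- dbo.MyProcedure is a placeholder; substitute the real procedure name
EXEC sp_recompile N'dbo.MyProcedure';  -- invalidate just this procedure's cached plan
GO
EXEC dbo.MyProcedure;  -- this run should pay the full compile cost again
GO
EXEC dbo.MyProcedure;  -- this run should reuse the newly cached plan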

On our test databases it takes about 5 seconds to calculate the plan for the same procedure.

I don't see anything terrible in the plan itself, although I don't imagine it's relevant, as the plan shows how long the query takes to run, not how long the plan takes to compile.

The server has 16 cores and 24 GB of memory, and neither CPU nor memory is under heavy load.

What is it that could be causing such a slow calculation only on a particular database?

What steps can I take to find the cause of the problem?

Edit

So I have managed to access the server and have run the query with SET SHOWPLAN_XML ON.

I can confirm that the CompileTime for the query is taking up 99% of the query execution time. The StatementOptmEarlyAbortReason is "TimeOut"; on our test server, with a copy of their database, the reason is "MemoryLimitExceeded".
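For anyone who wants to reproduce the check: SET STATISTICS TIME ON reports the parse and compile time directly, and the early-abort reason can be read out of the cached plan XML. A sketch, with a placeholder procedure name in the filter:

-- Report compile time in the Messages output
SET STATISTICS TIME ON;   -- prints "SQL Server parse and compile time"
EXEC dbo.MyProcedure;     -- placeholder name
SET STATISTICS TIME OFF;

-- Pull the early-abort reason out of the cached plan XML
WITH XMLNAMESPACES (DEFAULT 'http://schemas.microsoft.com/sqlserver/2004/07/showplan')
SELECT st.text,
       qp.query_plan.value('(//StmtSimple/@StatementOptmEarlyAbortReason)[1]',
                           'nvarchar(50)') AS early_abort_reason
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
CROSS APPLY sys.dm_exec_query_plan(cp.plan_handle) AS qp
WHERE st.text LIKE N'%MyProcedure%';   -- placeholder filter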


Solution

I hate to answer my own question, especially since I have had so much help from others to work out the solution, but here goes.

The problem was due to some dud statistics in the database. Looking at the execution plan, the optimiser was expecting 11.5 TB of data to be returned from the query. In reality it was receiving 87 KB. I now know that a huge mismatch between expected and actual rows returned is a sign that the statistics are out of date.
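If you want to spot this before it bites, you can check when each statistics object was last updated and how many modifications have accumulated since. A sketch using sys.dm_db_stats_properties (available from SQL Server 2008 R2 SP2 / 2012 SP1 onwards; dbo.MyTable is a placeholder name):

-- dbo.MyTable is a placeholder; run per suspect table
SELECT s.name AS stats_name,
       sp.last_updated,           -- when the statistics were last rebuilt
       sp.[rows],                 -- rows the statistics think the table has
       sp.rows_sampled,
       sp.modification_counter    -- changes since the last update
FROM sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE s.object_id = OBJECT_ID(N'dbo.MyTable');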

Simply running

exec sp_updatestats

runs UPDATE STATISTICS against every table in the database, refreshing any statistics that have seen modifications.
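Note that sp_updatestats uses the default sampling rate. If one table's statistics are badly skewed, a full scan on just that table may give the optimiser better numbers (table name is again a placeholder):

-- Rebuild statistics for one table by scanning every row
UPDATE STATISTICS dbo.MyTable WITH FULLSCAN;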

This has taken the query execution time down from 3 minutes to 6 seconds. Everyone's a winner!

Thanks for all the help guys. :0)
