Question

I know SQL Server updates statistics when 20% of the data in a table has changed. It keeps track of this using a modification counter, which we can see in the [modification_counter] column returned by sys.dm_db_stats_properties.

But I can see that for one of my key tables in the database, stats are getting updated well before they reach the 20% data modification threshold. I just wanted to know whether there is anything else that will force SQL Server to update stats automatically.
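For reference, a quick way to watch the counter is the sys.dm_db_stats_properties DMF; a minimal sketch (the table name dbo.MyKeyTable is a placeholder for your own table):

```sql
-- Show each statistics object on the table, when it was last updated,
-- how many rows were sampled, and the current modification counter
SELECT  s.name            AS stats_name,
        sp.last_updated,
        sp.rows,
        sp.rows_sampled,
        sp.modification_counter
FROM    sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE   s.object_id = OBJECT_ID(N'dbo.MyKeyTable');
```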

The reason I am worried is that when SQL Server updates stats automatically, it uses a low sampling rate (around 5%), which might cause bad / inaccurate execution plans to be created. (Whether that is actually the case is a totally different issue I still need confirmation on.)

So I want to know: why is SQL Server updating stats so often?


Solution

Under the classic rule, large tables require 20% of rows to change before statistics auto-update. For example, a table with 1 million rows requires 200,000 row modifications. That threshold can be too high for the query optimizer to keep generating efficient execution plans.

From SQL Server 2016 onwards, a dynamic statistics update threshold is used, which adjusts automatically according to the number of rows in the table:

Threshold = √(1000 × current table cardinality)

For example, for a table with one million rows, we can use the formula to calculate the number of modifications after which SQL Server will automatically update the statistics:

Threshold = √(1000 × 1,000,000) ≈ 31,623

SQL Server updates the statistics after approximately 31,623 modifications to the object.

Note: the database compatibility level must be 130 or above for this dynamic threshold calculation to apply (on lower compatibility levels, trace flag 2371 enables the same behavior).
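The arithmetic above can be checked directly in T-SQL; a minimal sketch for the one-million-row example:

```sql
-- Dynamic auto-update threshold for a 1,000,000-row table:
-- threshold = SQRT(1000 * row_count)
SELECT SQRT(1000.0 * 1000000) AS threshold_1m_rows;  -- ≈ 31622.8
```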

Regarding the sampling rate: yes, automatic statistics updates do not do a full scan; they use a sampled percentage of rows, which might not always be optimal. You may have to implement your own solution that performs more frequent statistics updates with a full scan. Check Ola Hallengren's maintenance scripts.
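A manual full-scan update looks like this (table and statistics names are placeholders for your own objects):

```sql
-- Refresh all statistics on the table by reading every row
UPDATE STATISTICS dbo.MyKeyTable WITH FULLSCAN;

-- Or refresh a single statistics object only
UPDATE STATISTICS dbo.MyKeyTable IX_MyKeyTable_SomeColumn WITH FULLSCAN;
```

Scheduling such updates yourself (e.g. via the maintenance scripts mentioned above) keeps plans based on accurate histograms instead of waiting for the sampled auto-update.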

Licensed under: CC-BY-SA with attribution
Not affiliated with dba.stackexchange