Question

Is it true that SQL Server does not use indexes that are fragmented?

How could this be possible, and if it is, how does the optimizer decide when to use an index and when not to?

I have searched around but have not been able to find a formula or set of rules that makes SQL Server ignore a particular index.

EDIT:

Actually, I have found the following statement in this article:

High fragmentation – If an index is fragmented over 40%, the optimizer will probably ignore the index because it's more costly to search a fragmented index than to perform a table scan.

So, it seems that the optimizer is ignoring fragmented indexes after all. Can anyone shed more light on how this is done?
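For reference, this is roughly the query I use to check fragmentation levels (the table name dbo.MyTable is just a placeholder):

-- Average fragmentation per index on one table in the current database
SELECT  i.name AS index_name,
        ips.index_type_desc,
        ips.avg_fragmentation_in_percent,
        ips.page_count
FROM    sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID('dbo.MyTable'), NULL, NULL, 'LIMITED') AS ips
JOIN    sys.indexes AS i
        ON i.object_id = ips.object_id AND i.index_id = ips.index_id
ORDER BY ips.avg_fragmentation_in_percent DESC;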


Solution

SQL Server does not consider fragmentation during the index selection process. The following Simple Talk article does a good job of explaining the workings of SQL Server's index selection mechanism:

Index Selection and the Query Optimizer
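If you want to see the cost-based choice for yourself, one simple approach is to compare the plan the optimizer picks on its own against a plan where the index is forced with a hint (the table, column, and index names below are placeholders):

-- Run both queries with "Include Actual Execution Plan" enabled
-- and compare the estimated costs and the logical reads reported.
SET STATISTICS IO ON;

SELECT *
FROM   dbo.MyTable
WHERE  SomeColumn = 42;    -- the optimizer's own choice

SELECT *
FROM   dbo.MyTable WITH (INDEX (IX_MyTable_SomeColumn))
WHERE  SomeColumn = 42;    -- index use forced by a hint

SET STATISTICS IO OFF;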

OTHER TIPS

I have never heard of SQL Server skipping indexes because they are badly fragmented. I don't think the SQL Server optimizer looks at fragmentation when deciding whether to use an index.

To determine whether an index should be used, SQL Server relies on statistics. If you think an index could be used but the server isn't using it, your statistics might be wrong or out of date.
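If you suspect stale statistics, a sketch along these lines lets you inspect and refresh them (the object and statistics names are placeholders):

-- Show the histogram and density information the optimizer works with
DBCC SHOW_STATISTICS ('dbo.MyTable', 'IX_MyTable_SomeColumn');

-- When were the statistics on the table last updated?
SELECT  s.name, STATS_DATE(s.object_id, s.stats_id) AS last_updated
FROM    sys.stats AS s
WHERE   s.object_id = OBJECT_ID('dbo.MyTable');

-- Refresh them with a full scan if they look out of date
UPDATE STATISTICS dbo.MyTable WITH FULLSCAN;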

If you want to know more about statistics: http://blog.idera.com/sql-server/understanding-sql-server-statistics/
More info about index fragmentation and what can be done about it: http://www.brentozar.com/archive/2012/08/sql-server-index-fragmentation/

Edit: I've read the article, and it says 'probably ignore'. When an index is ignored will, in my opinion, again be determined by the statistics.

Example: if the statistics indicate that only one row (out of thousands) has the searched value, then I think the index will be used no matter how fragmented it is; it only has to read something like 3 pages instead of the entire table. If the statistics indicate that 50% of the rows match the searched value, then a table scan is used. Whether an index or a table scan is used is determined by the statistics. High fragmentation will slow down a (partial) index scan and thus make the optimizer choose a table scan sooner than it would with an unfragmented index. So although fragmentation affects the quality of your index, I don't think an index is skipped just because it's badly fragmented.
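To illustrate the point, something like the following (with made-up table and column names) shows both sides; checking the actual execution plan for each query should show an index seek for the selective predicate and a scan for the non-selective one:

-- Highly selective predicate: statistics say only a handful of rows match,
-- so the optimizer will typically seek the index even if it is fragmented.
SELECT OrderID, CustomerID
FROM   dbo.Orders
WHERE  CustomerID = 12345;

-- Non-selective predicate: roughly half the rows match, so scanning the table
-- is usually cheaper than an index seek plus thousands of key lookups.
SELECT OrderID, CustomerID
FROM   dbo.Orders
WHERE  Status = 'Shipped';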

Nevertheless, don't let your indexes become fragmented :). Microsoft suggests reorganizing any index with fragmentation > 5% and rebuilding any index with fragmentation > 30% (http://support.microsoft.com/kb/2755960). But this also depends on how many inserts are done on the table.
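Following those thresholds, maintenance along these lines is common (the index and table names are placeholders; real maintenance scripts usually loop over sys.dm_db_index_physical_stats instead of hard-coding one index):

-- Between 5% and 30% fragmentation: reorganize (online, lightweight)
ALTER INDEX IX_MyTable_SomeColumn ON dbo.MyTable REORGANIZE;

-- Above 30% fragmentation: rebuild the index
ALTER INDEX IX_MyTable_SomeColumn ON dbo.MyTable REBUILD;

-- A rebuild refreshes the index statistics with a full scan, a reorganize
-- does not, so after a reorganize you may still want:
UPDATE STATISTICS dbo.MyTable IX_MyTable_SomeColumn;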

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow