Question

In our SharePoint 2013 farm we have a Search Service Application running. Because a full crawl takes a very long time, a decision was made a few months ago not to run full crawls regularly and to keep only daily incremental crawls.

I want to justify to IT management that a full crawl should run regularly. This would help me secure resources to improve crawl performance.

The most "painful" consequence I see of not running full crawls in that farm is that some items may be dropped from the index due to repeated incremental crawl errors. So I need an estimate of how many items are actually missing from the search index. Can anyone suggest how this can be done farm-wide, other than checking/querying every item in the crawl error log manually?


Solution

A Full Crawl should not be run with any regularity. It should run when you've performed an Index Reset, rebuilt the SSA, or found a catastrophic failure to index items. A Full Crawl generally won't have any more success than an Incremental with regard to indexing errors (after all, it runs the same process).

Full Crawls decrease the freshness of the index, and relying on them is simply bad practice in environments with a large index. You can get away with it in smaller environments because indexing completes quickly, but when index freshness drops because the Full Crawl takes more than a few hours, end users start getting upset (around here, anyway!).

You can use reflection to read the crawl log. See PowerShell - Search Crawl History, which outlines the required PowerShell.
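As an alternative to the reflection approach in that post, the server object model's public `Microsoft.Office.Server.Search.Administration.CrawlLog` class can query the crawl log directly. The sketch below is a minimal, untested example: it assumes it runs in the SharePoint 2013 Management Shell on a farm server with a single SSA, and the numeric filter values (error level `2` for errors, `-1` for "all" content sources and error IDs) are my reading of the `GetCrawledUrls` parameters, so verify them against your environment.

```powershell
# Sketch: estimate how many items are currently failing to index, farm-wide.
# Assumes the SharePoint 2013 Management Shell on a farm server.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$ssa = Get-SPEnterpriseSearchServiceApplication
$crawlLog = New-Object Microsoft.Office.Server.Search.Administration.CrawlLog($ssa)

# GetCrawledUrls(getCountOnly, maxRows, urlQueryString, exactMatch,
#                contentSourceID, errorLevel, errorID, startDateTime, endDateTime)
# errorLevel 2 = errors; -1 means "no filter" for contentSourceID and errorID.
$errors = $crawlLog.GetCrawledUrls($false, 1000, "", $false, -1, 2, -1,
                                   [datetime]::MinValue, [datetime]::MaxValue)

Write-Host ("Failing items returned (capped at 1000): " + $errors.Rows.Count)
$errors | Format-Table -AutoSize
```

The call returns a DataTable, so you can group or export the rows (e.g. with `Export-Csv`) to summarize errors per content source instead of inspecting items one by one.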

Licensed under: CC-BY-SA with attribution
Not affiliated with sharepoint.stackexchange