Question

So, I have been tasked with figuring out why our SharePoint Foundation 2010 environment is no longer successfully crawling our sites. Unfortunately, as I am neither the original "mastermind" behind the site nor particularly experienced with this side of SharePoint, I'm coming up short of a solution.

Here's what I have gathered. Below is an error that I have found. We only have two start addresses, but both are coming back with the following error in Event Viewer:

Event ID: 14

The start address http://sharepoint.domain.com (sample address format for you guys) cannot be crawled.

Context: Application 'Search_Service_Application', Catalog 'Portal_Content'

Details: Item not crawled due to one of the following reasons: Preventive crawl rule; Specified content source hops/depth exceeded; URL has query string parameter; Required protocol handler not found; Preventive robots directive. (0x80040d07)

So, I've tried a number of things, including verifying that the Content Access Account has the correct permissions, confirming the sites are set to allow crawling, recreating the content sources, and creating a crawl rule specifically to include these two sites. Additionally, I've attempted renaming the robots.txt file to .old, with no luck. As for the remaining reasons listed in the error, I'm not entirely certain what troubleshooting steps I can take.
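One of the reasons in that error, "Preventive robots directive," can be checked independently of SharePoint. A minimal sketch, using Python's standard-library robots.txt parser and assuming a hypothetical robots.txt body (replace it with whatever is actually served at your start address; the URL below mirrors the sample format from the question):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content -- substitute the file actually served
# at http://sharepoint.domain.com/robots.txt. A blanket "Disallow: /"
# under "User-agent: *" blocks every crawler, SharePoint's included.
ROBOTS_TXT = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# The SharePoint 2010 crawler's user-agent string contains
# "MS Search 6.0 Robot"; a wildcard rule applies to it as well.
blocked = not parser.can_fetch("MS Search 6.0 Robot",
                               "http://sharepoint.domain.com/")
print(blocked)  # True means a robots directive would prevent the crawl
```

Note that renaming robots.txt on disk only helps if the crawler actually re-requests it; it may be worth confirming with a browser (or a script like the above pointed at the live URL) that the server now returns 404 for /robots.txt before re-running a full crawl.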

This has been going on for some time now, so unfortunately I cannot tie the issue to a specific change on the server that could have triggered it. Any and all help is much appreciated!

Was it helpful?

Solution

It looks like either a crawl rule or a robots.txt directive is blocking the site.

Licensed under: CC-BY-SA with attribution
Not affiliated with sharepoint.stackexchange