Question

Short story - I have a process that creates site collections, so to test it I keep deleting and recreating the same site over and over. The process includes a check using SPSite.Exists, done to decide whether the site collection is already present or should be created.

Sometimes, soon after I delete the site collection, SPSite.Exists will still report it as available. Is this a cache-related problem?


Solution

I had this same issue and first tried an HTTP request & check-the-response method (example here) as a fix. It works, but it was somewhat slow when checking a large number of sites at once. I ended up using the cache-invalidation call described in the other answer below, wrapped like this:

public bool SPSiteReallyExists(string url)
{
    Uri uri = new Uri(url);

    // Evict any stale entry for this site from SharePoint's internal
    // site cache before asking whether it exists.
    SPSite.InvalidateCacheEntry(uri, Guid.Empty);
    return SPSite.Exists(uri);
}
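For context, here is roughly how the wrapper slots into the create-if-missing flow from the question. This is an illustrative sketch: the URL, owner values and the webApplication variable are assumptions, while SPSiteCollection.Add is the real API for creating a site collection.

// Hypothetical usage; "webApplication" is an SPWebApplication you
// already hold a reference to, and the URL/owner values are made up.
string url = "http://sharepoint/sites/test";

if (!SPSiteReallyExists(url))
{
    // The cache entry was just invalidated, so Exists reflected the
    // actual state of the farm and it is safe to (re)create the site.
    using (SPSite newSite = webApplication.Sites.Add(
        url, @"DOMAIN\owner", "owner@example.com"))
    {
        // provision lists, features, etc. on the new site collection
    }
}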

This code has been in production for many months and there have been no issues with SPSite.Exists returning true when it should return false after a site deletion - usually the deletion happens through the UI, but sometimes also through the C# API.

I have not seen any performance impact from this call, and it is called a lot, since I never use .Exists() directly and always go through this wrapper. Logically I wouldn't expect any impact, as it only invalidates the cache entry for that one site. However, the SharePoint API doesn't always follow logic...
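For reference, a minimal sketch of the HTTP-based check I abandoned, assuming it means issuing a HEAD request with default credentials and treating a 404 as "the site collection does not exist"; the method name is made up:

using System.Net;

// Hypothetical sketch of the HTTP request & check approach.
public bool SiteRespondsOverHttp(string url)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
    request.Method = "HEAD";
    request.UseDefaultCredentials = true;

    try
    {
        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        {
            // Any successful response means something answers at that URL.
            return true;
        }
    }
    catch (WebException ex)
    {
        HttpWebResponse errorResponse = ex.Response as HttpWebResponse;
        if (errorResponse != null && errorResponse.StatusCode == HttpStatusCode.NotFound)
        {
            return false; // nothing served at this URL
        }
        throw; // DNS failure, auth problem, timeout - let the caller decide
    }
}

This is why it was slow for me: each check costs a full network round trip, which adds up quickly across many sites.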

OTHER TIPS

It does indeed seem to be a cache problem. Looking at the implementation of the Exists method, we can see that it internally creates a new SPSite instance to check whether the site exists.

SPSite theSite = null;
try
{
    theSite = new SPSite(uri.OriginalString);
}
catch (FileNotFoundException)
{
    // do nothing, just leave the instance null. The rest of the code will detect this.
}

After that, some of the properties of the instance are evaluated (HostHeaderIsSiteName and some of the URL properties).
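As a rough paraphrase (not the exact decompiled source), the remainder of Exists behaves approximately like this, using the real SPSite members HostHeaderIsSiteName, HostName and Url:

// Approximate reconstruction, for illustration only.
bool exists = false;
if (theSite != null)
{
    // Host-header site collections are matched on the host name;
    // path-based ones are matched on the full URL.
    exists = theSite.HostHeaderIsSiteName
        ? string.Equals(theSite.HostName, uri.Host, StringComparison.OrdinalIgnoreCase)
        : theSite.Url.Equals(uri.OriginalString, StringComparison.OrdinalIgnoreCase);
    theSite.Dispose();
}
return exists;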

The problem is that the SPSite constructor uses an internal cache to set the object's properties. Most of the calls seem to end up in the internal SPSiteCache class and some of its methods (for example SPSiteCache.LookupHostHeaderSite), which means that if the site is still referenced in the cache we can get stale results.

After some time lost on MSDN, I found a reference to a method that clears this cache entry:

SPSite.InvalidateCacheEntry(new Uri(siteCollectionUrl), Guid.Empty);

It seems that this method simply delegates to the SPSiteCache class internally:

public static bool InvalidateCacheEntry(Uri uri, Guid siteId)
{
    return SPSiteCache.InvalidateCacheEntry(uri, siteId);
}

I call this method just before the call to Exists to ensure a valid result. That resolved the problem.

Notes:

  1. I don't know what the resource cost of this solution is. If performance is critical, verify that invalidating the cache entry doesn't introduce a performance hit.
  2. Someone suggested that the problem may be related to the Gradual Site Deletion timer job. I don't believe it is, but you never know; for all we know, the timer job may call the instruction above internally itself. In any case, by calling InvalidateCacheEntry I never needed to run the job manually - and I am creating sites pretty fast. Also consider that running a timer job on demand can be problematic in some contexts, so give the method above a try if you are experiencing the same problem.

It's not a cache problem. In SharePoint 2010 they introduced a timer job called Gradual Site Deletion to perform the actual site collection deletion, in order to manage database locks when deleting large site collections. The actual deletion usually occurs very quickly, but it can take up to a few minutes.
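If you want to rule the timer job out while testing, here is a hedged sketch of forcing it to run immediately. SPWebApplication.Lookup and SPJobDefinition.RunNow are real SharePoint 2010 APIs; the web application URL and the display-name match are assumptions (the job title can vary by locale):

using System;
using Microsoft.SharePoint.Administration;

// Hypothetical: force the Gradual Site Delete job for one web application.
SPWebApplication webApp = SPWebApplication.Lookup(new Uri("http://sharepoint"));

foreach (SPJobDefinition job in webApp.JobDefinitions)
{
    if (job.DisplayName.Contains("Gradual Site Delete"))
    {
        job.RunNow(); // queues the job for immediate execution
        break;
    }
}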

Licensed under: CC-BY-SA with attribution