Question

If I receive a request from a spider, I kick off a PhantomJS process and render dynamic HTML back to it (using an OnActionExecuting filter and setting the ActionResult).
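
For context, such a filter might look roughly like this (a minimal sketch; PhantomJsRenderer is a hypothetical helper that shells out to PhantomJS and returns the rendered markup):

using System.Web.Mvc;

public class RenderForCrawlerAttribute : ActionFilterAttribute
{
    public override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        var request = filterContext.HttpContext.Request;

        // Only intercept requests that ASP.NET identifies as crawlers.
        if (request.Browser != null && request.Browser.Crawler)
        {
            // Hypothetical helper: runs PhantomJS against the requested URL
            // and returns the fully rendered HTML.
            string html = PhantomJsRenderer.Render(request.Url.AbsoluteUri);

            // Short-circuit the action and send the pre-rendered HTML back.
            filterContext.Result = new ContentResult
            {
                Content = html,
                ContentType = "text/html"
            };
        }
    }
}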

But the OutputCache filter is also applied to this action, and it is getting in the way.

For example:

Step 1: Load the page with a normal user agent. (The output cache caches the URL.)
Step 2: Load the page with a spider user agent. (The previously cached response is sent to the spider, and my PhantomJS filter never runs.)


Solution

Use VaryByCustom to force a cache miss when the request comes from a search-engine crawler.

In your Controller/Action:

[OutputCache(VaryByCustom="Crawler")]
public ActionResult Index()
{
     // ...
     return View();
}

Then in your Global.asax:

public override string GetVaryByCustomString(HttpContext context, string arg)
{
    if (arg == "Crawler" && context.Request.Browser.Crawler)
        return Guid.NewGuid().ToString();

    return base.GetVaryByCustomString(context, arg);
}
Licensed under: CC-BY-SA with attribution