Question

I'm working on an application that redirects users to upgrade browser if they are not on our browser list.

My goal is to create an exception to detect if they are a crawler, based on their UserAgent string.

At this point, I'm getting a message along the lines of "no definition or extension method for .ToLower".

Here is my code:

    private bool IsValidCrawler(HttpRequestBase request)
    {
        bool isCrawler = true;

        switch (request.Browser.Crawler.ToLower())  
        {
            case "googlebot":
            case "bingbot":
            case "yahoo!":
            case "facebookexternalhit":
            case "facebookplatform":
                break;
        }

        return isCrawler;
    }

Can anyone point me to where I have gone wrong?

Was it helpful?

Solution

If you look at the documentation for the Crawler property (http://msdn.microsoft.com/en-us/library/system.web.configuration.httpcapabilitiesbase.crawler(v=vs.110).aspx) you'll notice it's a Boolean, not a string — which is why calling .ToLower() on it doesn't compile.

The property itself tells you whether the request comes from a known crawler, so you don't need the switch at all. For now you can simplify to the following (keeping the method signature so you don't have to change much elsewhere):

private bool IsValidCrawler(HttpRequestBase request)
{
    bool isCrawler = request.Browser.Crawler;

    return isCrawler;
}
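The built-in browser definition files can be outdated and miss newer crawlers. As a rough fallback, you could combine the Crawler property with a manual check of the raw user-agent string against the tokens from the question. This is only a sketch: the token list is illustrative, not exhaustive, and it assumes an ASP.NET `HttpRequestBase`:

```csharp
using System;
using System.Linq;
using System.Web;

public static class CrawlerDetection
{
    // Illustrative token list taken from the question's switch cases;
    // a real deployment needs a maintained list (see the parser library below).
    private static readonly string[] CrawlerTokens =
    {
        "googlebot", "bingbot", "yahoo!", "facebookexternalhit", "facebookplatform"
    };

    public static bool IsValidCrawler(HttpRequestBase request)
    {
        // Trust the framework's browser definition files first...
        if (request.Browser.Crawler)
            return true;

        // ...then fall back to a case-insensitive substring check on the UA string.
        string ua = (request.UserAgent ?? string.Empty).ToLowerInvariant();
        return CrawlerTokens.Any(token => ua.Contains(token));
    }
}
```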

OTHER TIPS

There are thousands of crawlers; the user-agent parser included in the .NET Framework can handle only a few of them, and it doesn't keep an updated list.

Install this NuGet package: it provides a semantic user-agent parser, and the library is very actively maintained.

You can initialize the parser with this code:

public static class YauaaSingleton
{
    private static UserAgentAnalyzer.UserAgentAnalyzerBuilder Builder { get; }

    private static readonly Lazy<UserAgentAnalyzer> analyzer = new Lazy<UserAgentAnalyzer> (() => Builder.Build());

    public static UserAgentAnalyzer Analyzer
    {
        get
        {
            return analyzer.Value;
        }
    }

    static YauaaSingleton()
    {
        Builder = UserAgentAnalyzer.NewBuilder();
        Builder.DropTests();
        Builder.DelayInitialization();
        Builder.WithCache(100);
        Builder.HideMatcherLoadStats();
        Builder.WithAllFields();
    }
}

Then it's very easy:

    private bool IsValidCrawler(HttpRequestBase request)
    {
        var ua = YauaaSingleton.Analyzer.Parse(request.UserAgent);
        var deviceClass = UserAgentClassifier.GetDeviceClass(ua);
        return deviceClass == DeviceClass.Robot
            || deviceClass == DeviceClass.RobotMobile
            || deviceClass == DeviceClass.RobotImitator;
    }

- Robot: a normal crawler
- RobotMobile: a crawler that emulates a mobile device
- RobotImitator: not actually a crawler, but something that imitates one

If you want, you can also use:

    var isHuman = UserAgentClassifier.IsHuman(ua);

which also handles hacked user agents and other edge cases.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow