Question

I'm using ELMAH to handle application errors, but I'm receiving a lot of errors caused by crawler access. How can I filter the errors so that only user access is logged, not robots?

Best regards Ernesto


Solution

You should check out Filtering with ELMAH. Filtering lets you write code in C# (or script it in JScript) that dismisses an exception before it is logged. In your case, I would write something like this:

void ErrorLog_Filtering(object sender, ExceptionFilterEventArgs e)
{
    // UserAgent can be null (e.g. for requests that send no User-Agent header),
    // so guard against a NullReferenceException before matching.
    var userAgent = HttpContext.Current?.Request?.UserAgent;
    if (userAgent != null &&
        userAgent.IndexOf("bot", StringComparison.OrdinalIgnoreCase) >= 0)
    {
        e.Dismiss();
    }
}

Checking for "bot" is a very simple example. Lists of bot user agents are available all over the web. Malicious bots typically don't reveal themselves through their user agent, which is why tracking down requests from them will be hard.
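To go beyond the single "bot" substring, you could match the user agent against a small list of known crawler signatures. The sketch below factors that check into a testable helper; the signature list and the `IsBot` helper name are illustrative assumptions, not part of ELMAH's API, and a real list would be much longer.

```csharp
using System;
using System.Linq;

public static class BotFilterDemo
{
    // Hypothetical sample of common crawler user-agent substrings;
    // published lists on the web contain hundreds of entries.
    static readonly string[] BotSignatures = { "bot", "crawler", "spider", "slurp" };

    // Returns true if the user agent contains any known bot signature
    // (case-insensitive). A null or empty user agent is treated as non-bot here,
    // though you may prefer to flag those as suspicious as well.
    public static bool IsBot(string userAgent)
    {
        if (string.IsNullOrEmpty(userAgent))
            return false;

        return BotSignatures.Any(sig =>
            userAgent.IndexOf(sig, StringComparison.OrdinalIgnoreCase) >= 0);
    }

    public static void Main()
    {
        Console.WriteLine(IsBot("Mozilla/5.0 (compatible; Googlebot/2.1)"));
        Console.WriteLine(IsBot("Mozilla/5.0 (Windows NT 10.0) Firefox/115.0"));
    }
}
```

Inside `ErrorLog_Filtering` you would then call `IsBot(HttpContext.Current?.Request?.UserAgent)` and dismiss the error when it returns true, keeping the filtering logic in one place.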

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow