Question

I have an issue which has arisen using Google Page Speed Online, although I am worried that there may be a bigger picture to this. I ran my site through the online tool; see the results here: https://developers.google.com/pagespeed/#url=www.exclaimer.com&mobile=false. Notice that it claims a redirect has occurred to http://www.exclaimer.com/oops.aspx?aspxerrorpath=/default.aspx

Now, the original URL I plugged in, http://www.exclaimer.com, and http://www.exclaimer.com/default.aspx both work fine in my browser. I keep a log of any pages which aren't found, and indeed /default.aspx is in there over a thousand times (the only change happened 24 hours ago). This wasn't me trying the Page Speed Online tool 1000 times, so I worry that this may be another Google service (or some other automated system) which is failing. There have been no complaints from visitors unable to access the site, which leads me to believe that ordinary users have no problems; the issue only comes from automated bots or similar.

I guess my question is: does anyone know of a way I can isolate the source of the problem? I attempted to modify my 404 logging code to capture the page from which /default.aspx was being accessed, but I haven't had much luck here, as UrlReferrer is only populated under fairly specific conditions.
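For reference, reading the referrer has to be done defensively anyway — a minimal sketch (the surrounding 404-logging handler is assumed, not shown in the post):

```csharp
// Inside the 404 logging code (hypothetical context).
// Request.UrlReferrer is only non-null when the client actually sent a
// Referer header; bots and direct visits typically do not, so guard it.
Uri referrer = Request.UrlReferrer;
string referrerText = (referrer != null) ? referrer.AbsoluteUri : "(no referrer)";
// ... log referrerText alongside the 404 entry
```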

Update

I have modified my code to log the error details, but nothing is being passed through for /default.aspx.

Exception error = Server.GetLastError();
string errorTitle = "";
string errorDetails = "";
if (error != null)
{
    // InnerException can be null; GetBaseException() unwraps safely.
    errorTitle = error.GetBaseException().Message;
    errorDetails = error.ToString();
}
Server.ClearError();
Server.ClearError();

... send to database

Solution

If the page is redirecting to the error page, then an error must be occurring when that page is accessed. What you want to do is capture what that error is, to find which part of your code is causing the problem.

My guess would be that you are assuming a certain HTTP header is sent by the client and you are not doing a null check on it. When you get a request from a robot that doesn't send, say, an Accept-Language header, you get a crash.
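As an illustration of this class of bug, here is a hedged sketch; the specific header and the fallback value are assumptions, not taken from the site's actual code:

```csharp
// Fragile: Request.UserLanguages is null when no Accept-Language header
// was sent (common for bots), so indexing it throws and the request is
// redirected to the error page.
// string language = Request.UserLanguages[0];

// Defensive: null/empty check with a fallback default culture.
string language = (Request.UserLanguages != null && Request.UserLanguages.Length > 0)
    ? Request.UserLanguages[0]
    : "en-US";
```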

In your global error handler you should log whatever exception is thrown, either to a database or by dumping it straight to a file. This is useful information at all times and should be captured for any other errors on the site, so you can track down their cause.
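A minimal sketch of such a handler in Global.asax.cs, assuming a writable App_Data folder; the log path and line format are illustrative:

```csharp
using System;
using System.IO;
using System.Web;

// In Global.asax.cs:
protected void Application_Error(object sender, EventArgs e)
{
    Exception error = Server.GetLastError();
    if (error == null) return;

    // ASP.NET wraps page errors in HttpUnhandledException;
    // GetBaseException() unwraps it to the original cause.
    Exception root = error.GetBaseException();

    string entry = string.Format("{0:u}\t{1}\t{2}{3}",
        DateTime.UtcNow, Request.RawUrl, root, Environment.NewLine);

    // Append the full exception (message + stack trace) to a log file.
    File.AppendAllText(Server.MapPath("~/App_Data/errors.log"), entry);
}
```

Logging the requested URL next to the exception should make it easy to see which requests for /default.aspx are failing and why.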

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow