Question

I have a CakePHP website, and when I look it up in Google it shows an error in the description even though the site is working fine.

The error shown:

$status = "Location: http://mywebsite.com/" header - [internal], line ?? 
Controller::header() - CORE/cake/libs/controller/controller.php, line 742
Controller::redirect() ...

I Googled (searched the internet) for the error and found that several CakePHP websites have the same problem: they work fine, but there is an error in their Google description. The meta tags are displayed correctly in the page source.

Does anybody know what's wrong?

I have set debug to zero and uploaded a sitemap and a robots.txt file, but I still suffer from the same problem; even Bing and Yahoo are showing it now.

If anybody can give me a hand, that would be really appreciated.


Solution 3

I figured it out a week ago. I used a third-party library for browser detection since I did not feel like reinventing the wheel. When I went through the code, I noticed that the person who created it handled every browser possible: he had a series of if-elseif statements but no else statement.

if (IE) {
    // do this (IE-specific handling)
} elseif (Firefox) {
    // do that (Firefox-specific handling)
}
// ... and so on for all the browsers, but with no final else

The problem occurred when a crawler visited the page: since none of the conditions matched the crawler's user-agent header, the library was outputting an error message.

To solve the problem, I just added an else condition and treated every crawler as a Firefox browser.
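For illustration, a minimal sketch of that kind of fallback; the user-agent checks and variable names here are hypothetical, not the actual library's code:

<?php
// Raw user-agent string (may be empty or unrecognized for crawlers).
$userAgent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

if (strpos($userAgent, 'MSIE') !== false) {
    $browser = 'IE';
} elseif (strpos($userAgent, 'Firefox') !== false) {
    $browser = 'Firefox';
} else {
    // Crawlers (Googlebot, Bingbot, ...) and anything else unrecognized
    // fall back to the Firefox code path instead of emitting an error.
    $browser = 'Firefox';
}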

OTHER TIPS

  1. You should always set debug to zero for sites that may be indexed, or you risk having error output rendered in your pages and picked up by crawlers (though you could keep debug at 1 just for your own IP; see the sketch after this list).
  2. Now that your site has been indexed with an error in the meta description, first check whether the error still exists and fix it. It may already have been solved but still be sitting in Google's cache. After that, have Google reindex your site (have a look at the Google Web Master Programme; it's quite helpful). Also consider some basic SEO such as a sitemap.xml, if you haven't already.
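A minimal sketch of that first tip for CakePHP 1.x, placed in app/config/core.php; the IP address 203.0.113.5 is a placeholder for your own:

<?php
// app/config/core.php
// Production default: debug 0, so errors are logged instead of rendered.
// Your own IP (placeholder below) still gets debug output for development.
if (isset($_SERVER['REMOTE_ADDR']) && $_SERVER['REMOTE_ADDR'] === '203.0.113.5') {
    Configure::write('debug', 1);
} else {
    Configure::write('debug', 0);
}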

What happens when you set your browser's UserAgent string to the same as GoogleBot's and browse your site with cookies disabled? If you have any server-side logic that depends on cookie values or the UserAgent, it is very likely that you are not seeing an error but Google's crawler is.
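One rough way to simulate that from a script; this is only an approximation of Googlebot (the URL is a placeholder), but it is often enough to make the hidden error appear:

<?php
// Request the page with Googlebot's user-agent and no cookies.
$ch = curl_init('http://mywebsite.com/');
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$html = curl_exec($ch);
curl_close($ch);

// Check whether CakePHP warnings leaked into the markup.
if ($html !== false && strpos($html, 'Controller::redirect()') !== false) {
    echo "Error output found in the page\n";
} else {
    echo "No obvious error output\n";
}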

Is the search result linking to a page that should only be accessible to logged-in users? If your Cake app is trying to look up data based on a nonexistent logged-in user identity, this would cause problems.
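If that is the case, a defensive check in the affected action avoids querying with a missing user id; this is a hypothetical sketch for CakePHP 1.x with AuthComponent, not code from the site in question:

<?php
// In a controller action that assumes someone is logged in.
function dashboard() {
    $userId = $this->Auth->user('id'); // null when nobody is logged in
    if (!$userId) {
        // Send crawlers and anonymous visitors somewhere harmless
        // instead of looking up data for a nonexistent user.
        $this->redirect('/users/login');
        return;
    }
    $this->set('user', $this->User->findById($userId));
}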

Oh, and SET DEBUG TO ZERO!

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow