Question

Hello, on my site I have a core folder containing sensitive data, and in it I have an .htaccess file with 'Deny From All'. My question is: can Google or some other crawler access that directory even though they know I don't want anyone in there?


Solution

Technically, no, they cannot access it. But by placing the folder there (under a public directory) you expose yourself to a higher risk, for example if you make a mistake and overwrite your .htaccess, or the server gets updated and your rules become ineffective.
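As a concrete illustration of that last point (a sketch, not your exact file): 'Deny From All' is the Apache 2.2 syntax, and on Apache 2.4 those directives only keep working through the mod_access_compat module, so a server upgrade is exactly the kind of change this answer warns about. A version-tolerant .htaccess could cover both:

    <IfModule !mod_authz_core.c>
        # Apache 2.2 access control
        Order deny,allow
        Deny from all
    </IfModule>
    <IfModule mod_authz_core.c>
        # Apache 2.4 equivalent
        Require all denied
    </IfModule>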

You'd be better off moving that folder outside of your public_html (or equivalent). What I usually do is create a private_files directory right beside the public_html folder; any files I need there can still be referenced from (secure!) server-side scripts.
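A minimal sketch of that layout (the user, directory, and file names are hypothetical, and the script is assumed to be PHP):

    /home/youruser/
        public_html/          <- web root: browsers and crawlers can reach this
            index.php
        private_files/        <- outside the web root: not reachable over HTTP
            db-credentials.php

    <?php
    // inside public_html/index.php: load the private file by filesystem path
    require __DIR__ . '/../private_files/db-credentials.php';

Since the web server never serves anything from private_files directly, no .htaccess rule is needed to keep crawlers out of it.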

OTHER TIPS

No, they can't. Bots can choose whether to honor instructions in robots.txt and the like, but they cannot bypass what the server enforces. In other words, the server will refuse to serve that content, and there's nothing (well, supposedly) a bot can do about it. Good luck :P
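If you want to see this for yourself (example.com and the path are placeholders), request the protected directory directly; with Deny From All in place, Apache should answer with a 403 Forbidden status, and that is all a crawler gets as well:

    curl -I https://example.com/core/
    HTTP/1.1 403 Forbidden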

Licensed under: CC-BY-SA with attribution