Question

Open the logs of a dozen Drupal sites and you will see a pattern among the HTTP requests that received a 404 response: paths belonging to non-Drupal packages, such as /postnuke/article.php, /exchange/logon.asp, /awstats.pl, /wp-content/, /mailman/, /phpBB/page_header.php, and so on.

Nearly all of these are bots scanning for exploits. I know there are several modules that block these requests based on IP address, HTTP headers, or some other key piece of data.

I've been mulling over the idea of writing a module that approaches this differently.

Instead of trying to block the request, just cache the response. For sites with a reverse-proxy cache such as Varnish in front, set the max-age to something ridiculously long (a year). The module would simply register menu entries (routes) for common packages that a Drupal site is very unlikely to have installed. I would include an option to exclude a specific package, so if you really did want to run Drupal and phpBB in the same web root, you could... though it might require you to install https://www.drupal.org/project/bad_judgement first :)
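To make the mechanism concrete, here is a minimal sketch in plain Python/WSGI rather than an actual Drupal module (the prefix list, the one-year value, and all names are illustrative assumptions; in Drupal the same effect would come from registered routes returning a cacheable 404):

```python
# Sketch: serve cacheable 404s for paths belonging to packages this
# site does not run, so a reverse proxy (e.g. Varnish) can absorb
# every repeat request for up to a year.

ONE_YEAR = 31536000  # seconds

# Hypothetical list of path prefixes for packages known to be absent.
DECOY_PREFIXES = [
    "/postnuke/",
    "/exchange/",
    "/wp-content/",
    "/mailman/",
    "/phpBB/",
]

def application(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    if any(path.startswith(p) for p in DECOY_PREFIXES):
        # The long max-age lets the proxy cache the 404, keeping
        # later scans for the same path away from PHP/MySQL entirely.
        start_response("404 Not Found", [
            ("Content-Type", "text/plain"),
            ("Cache-Control", f"public, max-age={ONE_YEAR}"),
        ])
        return [b"Not found"]
    # Anything else falls through to normal handling.
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"Normal Drupal handling would happen here"]
```

The key point is the Cache-Control header on the 404: the application answers the first scan, and the proxy answers every one after that.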

I realize that this type of configuration can also be done at the .htaccess level, but maintaining rules for every package the bots are probing is beyond the skill set of many site builders.
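For comparison, an .htaccess sketch of that approach might look like this (the prefix list and the one-year lifetime are assumptions; mod_rewrite and mod_headers must be enabled, and the `<If>` block requires Apache 2.4):

```apache
# Return a 404 immediately for paths belonging to packages this site
# does not run, before Drupal's front controller is ever invoked.
<IfModule mod_rewrite.c>
  RewriteEngine On
  RewriteRule ^(postnuke|exchange|wp-content|mailman|phpBB)/ - [R=404,L]
</IfModule>

# Mark those 404s as cacheable for a year so a reverse proxy keeps them.
<IfModule mod_headers.c>
  <If "%{REQUEST_URI} =~ m#^/(postnuke|exchange|wp-content|mailman|phpBB)/#">
    Header always set Cache-Control "public, max-age=31536000"
  </If>
</IfModule>
```

This works, but every new scanner target means another manual edit, which is exactly the maintenance burden a module could take on.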

Does something like this already exist?

Am I missing some obvious reason this wouldn't work? It seems like it would improve performance simply by never letting a second request for the same path hit the PHP/MySQL layer for another year (or until the Varnish cache is cleared).

Licensed under: CC-BY-SA with attribution