Question

I have a web site that emails me a report about every unexpected server-side error.

Quite often (about once every 1-2 weeks) somebody runs automated tools that bombard the web site with a ton of different URLs:

  • sometimes they (hackers?) assume my site hosts phpMyAdmin and try to access vulnerable (I believe) PHP pages...
  • sometimes they try to access pages that don't exist but belong to popular CMSs
  • last time they tried to inject an invalid ViewState...

It is clearly not search engine spiders, as 100% of the requests that generated errors were requests to invalid pages.

So far they haven't done much harm; the only consequence is that I have to delete a ton of server error emails (200-300)... But at some point they could actually find something.

I'm really tired of this and am looking for a solution that will block such 'spiders'.

Is there anything ready to use? Any tool, DLLs, etc.? Or should I implement something myself?

In the second case: could you please recommend an approach to implement? Should I limit the number of requests per IP (let's say no more than 5 requests per second and no more than 20 per minute)?

P.S. Right now my web site is written using ASP.NET 4.0.


Solution 2

There are a couple of things you can consider...

You can use one of the available Web Application Firewalls. A WAF usually has a set of rules and an analytic engine that detect suspicious activity and react accordingly. For example, in your case it could automatically block attempts to scan your site, since it recognizes them as an attack pattern.

A simpler (but not 100% reliable) approach is to check the Referer URL (see the Referer description on Wikipedia) and reject the request if it did not originate from one of your own pages (you would probably create an HttpModule for that purpose); a sketch of that idea is shown below.
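A minimal sketch of such an HttpModule, assuming you only want to enforce the Referer check on POST requests (so that normal direct visits, which carry no Referer, are not blocked). The class name and the 403 response are my choices, not anything prescribed by ASP.NET; you would register the module in web.config under <system.webServer>/<modules>.

```csharp
// Sketch: reject POSTs whose Referer is missing or points to a foreign host.
using System;
using System.Web;

public class RefererCheckModule : IHttpModule
{
    public void Init(HttpApplication context)
    {
        context.BeginRequest += OnBeginRequest;
    }

    private static void OnBeginRequest(object sender, EventArgs e)
    {
        var app = (HttpApplication)sender;
        var request = app.Context.Request;

        // Only postbacks/form submissions are expected to arrive
        // with a Referer pointing at one of our own pages.
        if (!string.Equals(request.HttpMethod, "POST", StringComparison.OrdinalIgnoreCase))
            return;

        Uri referrer = request.UrlReferrer;
        bool fromOwnSite = referrer != null &&
            string.Equals(referrer.Host, request.Url.Host, StringComparison.OrdinalIgnoreCase);

        if (!fromOwnSite)
        {
            // Reject quietly instead of letting the request reach the page
            // and trigger another error email.
            app.Context.Response.StatusCode = 403;
            app.Context.Response.SuppressContent = true;
            app.CompleteRequest();
        }
    }

    public void Dispose() { }
}
```

Keep in mind the Referer header is trivially forged, so treat this as noise reduction, not security.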

And of course you want to be sure that your site addresses all known security issues from the OWASP Top 10 list (OWASP Top 10). You can find a very comprehensive description of how to do this for ASP.NET here (OWASP Top 10 for .NET book in PDF); I also recommend reading the blog of the author of the aforementioned book: http://www.troyhunt.com/

OTHER TIPS

Such bots are not likely to find any vulnerabilities in your system if you just keep the server and software updated. They are generally looking for low-hanging fruit, i.e. systems that have not been patched against known vulnerabilities.

You could make a bot trap to minimise such traffic. As soon as someone tries to access one of those non-existent pages that you know about, you could block all further requests from that IP address (with the same user-agent string) for a while; a sketch of this idea follows.
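A minimal sketch of such a bot trap as an HttpModule. The trap paths listed are hypothetical examples; replace them with the probe URLs you actually see in your error emails. The block list is kept in memory, so it resets on app-pool recycles and does not carry over across a web farm.

```csharp
// Sketch: block IP + user-agent combinations that hit known "trap" URLs.
using System;
using System.Collections.Concurrent;
using System.Web;

public class BotTrapModule : IHttpModule
{
    // Hypothetical probe paths scanners commonly request.
    private static readonly string[] TrapPaths =
    {
        "/phpmyadmin", "/wp-login.php", "/administrator"
    };

    private static readonly TimeSpan BlockDuration = TimeSpan.FromHours(1);

    // Key: "IP|User-Agent", value: time the block expires.
    private static readonly ConcurrentDictionary<string, DateTime> Blocked =
        new ConcurrentDictionary<string, DateTime>();

    public void Init(HttpApplication context)
    {
        context.BeginRequest += OnBeginRequest;
    }

    private static void OnBeginRequest(object sender, EventArgs e)
    {
        var app = (HttpApplication)sender;
        var request = app.Context.Request;
        string key = request.UserHostAddress + "|" + (request.UserAgent ?? "");

        // Still inside an active block?
        DateTime expires;
        if (Blocked.TryGetValue(key, out expires))
        {
            if (expires > DateTime.UtcNow)
            {
                Reject(app);
                return;
            }
            Blocked.TryRemove(key, out expires);
        }

        // Did the request hit one of the trap URLs? If so, start a block.
        string path = request.Path;
        foreach (string trap in TrapPaths)
        {
            if (path.StartsWith(trap, StringComparison.OrdinalIgnoreCase))
            {
                Blocked[key] = DateTime.UtcNow.Add(BlockDuration);
                Reject(app);
                return;
            }
        }
    }

    private static void Reject(HttpApplication app)
    {
        // Return a plain 404 so the scanner learns nothing useful.
        app.Context.Response.StatusCode = 404;
        app.Context.Response.SuppressContent = true;
        app.CompleteRequest();
    }

    public void Dispose() { }
}
```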

There's nothing you can do (reliably) to prevent vulnerability scanning; the only thing you can really do is make sure you stay on top of any vulnerabilities and prevent their exploitation.

If your site is only used by a select few from fixed locations, you could maybe use an IP restriction.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow