What is the procedure for stopping robots and malicious scanners that slow down a site?

StackOverflow https://stackoverflow.com/questions/2542242

23-09-2019

Question

What should I do to prevent users from running scanners or auto-posting robots against my site that would slow down site processing?

Is it sufficient to timestamp each post a user makes and enforce a posting delay? How long should the interval be?

What else can I do besides the above and CAPTCHAs on form posts?

Thanks.


Solution

A time interval is a good idea and is used on Stack Overflow. Different operations should have different time limits depending on:

  1. How often ordinary users are likely to want to use that feature.
  2. How intensive the operation is.

If you have an operation that requires a lot of processing time, you might want to set the limit on that operation higher than for a relatively simple operation.
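For illustration, here is a minimal Python sketch of per-operation time limits. The operation names, intervals, and in-memory bookkeeping are assumptions for the example; on a real site the timestamps would live in your database or cache.

    import time

    # Minimum seconds between uses of each operation. The names and values
    # below are illustrative only.
    MIN_INTERVAL = {
        "post_comment": 15,        # cheap, frequent operation: short limit
        "run_search": 5,
        "regenerate_report": 120,  # expensive operation: longer limit
    }

    # Last time each (user, operation) pair was used. In production this
    # belongs in the database or a cache, not a module-level dict.
    last_used = {}

    def allow(user_id, operation):
        """Return True if the user may perform the operation right now."""
        now = time.time()
        key = (user_id, operation)
        earliest = last_used.get(key, 0) + MIN_INTERVAL.get(operation, 0)
        if now < earliest:
            return False  # still inside the delay window; reject or queue
        last_used[key] = now
        return True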

Stack Overflow combines time limits with CAPTCHAs for editing posts: if you edit too frequently, you have to pass a CAPTCHA test.
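The same bookkeeping can drive that escalation: instead of rejecting outright, count recent actions and only demand a CAPTCHA once a threshold is crossed. A rough sketch, with an arbitrary window and threshold:

    import time
    from collections import defaultdict, deque

    EDIT_WINDOW = 300    # look at the last 5 minutes (arbitrary choice)
    EDIT_THRESHOLD = 10  # more edits than this triggers a CAPTCHA

    recent_edits = defaultdict(deque)  # user_id -> timestamps of recent edits

    def needs_captcha(user_id):
        """Return True if the user has edited often enough to require a CAPTCHA."""
        now = time.time()
        edits = recent_edits[user_id]
        # Drop timestamps that have fallen out of the window.
        while edits and edits[0] < now - EDIT_WINDOW:
            edits.popleft()
        edits.append(now)
        return len(edits) > EDIT_THRESHOLD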

OTHER TIPS

I googled this a year or so ago and found a list of known "bad user agents", which I added to my .htaccess to block them from accessing my blog. This small change had a significant impact on my bandwidth usage.
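For reference, a minimal .htaccess sketch of that approach using mod_rewrite (assuming it is enabled); the user-agent patterns are only examples, not the list the answer refers to:

    # Block requests whose User-Agent matches known scraper/bot signatures.
    # The patterns below are illustrative examples only.
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} (libwww-perl|HTTrack|WebCopier) [NC]
    RewriteRule .* - [F,L]

The [F] flag returns 403 Forbidden for matching requests, and [NC] makes the match case-insensitive.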

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow