Question

I have a Perl-based CGI/FastCGI web service and want to rate-limit clients by IP address, to stop aggressive clients from generating too much load.

I have looked around for some code and found Algorithm::TokenBucket on CPAN, but that is aimed at throttling a client's own outgoing requests; it has no persistence and no per-user configuration, so it is not really useful for server-side rate limiting.

I am looking for suggestions for something that already exists; otherwise I'll need to roll my own, based on some simple persistence such as a DB_File hash tied per IP address and a batch job that does the token management.
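For reference, a minimal sketch of that roll-your-own approach might look like the following. The refill rate, burst size, and database path are all made-up values, and a real FastCGI deployment would also need file locking around the tied hash, since several worker processes may update it at once:

#!/usr/bin/perl
use strict;
use warnings;
use DB_File;
use Fcntl qw(O_CREAT O_RDWR);

# Assumed policy: refill half a token per second, burst of 10.
my $RATE  = 0.5;
my $BURST = 10;

# Per-IP bucket state ("tokens:last_timestamp") tied to a DB_File hash.
tie my %bucket, 'DB_File', '/var/tmp/ratelimit.db',
    O_CREAT | O_RDWR, 0644, $DB_HASH
    or die "cannot tie rate-limit db: $!";

# Returns true if the request from $ip may proceed, false to reject it.
sub allow_request {
    my ($ip) = @_;
    my $now = time;
    my ($tokens, $last) = split /:/, ($bucket{$ip} || "$BURST:$now");

    # Refill for the time elapsed since the last hit, capped at the burst size.
    $tokens += ($now - $last) * $RATE;
    $tokens = $BURST if $tokens > $BURST;

    my $ok = $tokens >= 1;
    $tokens -= 1 if $ok;
    $bucket{$ip} = "$tokens:$now";
    return $ok;
}

Note that with this lazy-refill scheme the batch job mostly disappears: each hit refills its own bucket, and a periodic sweep would only be needed to prune stale IPs from the file.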


Solution

I've used Cache::FastMmap for rate limiting by tracking hits per IP address. It's a cache, so data will expire over time, but if you set the cache size and expire time appropriately, that shouldn't be an issue.

The IP address is the hash key and the hash value is an array of timestamps. I have a second data structure (also backed by Cache::FastMmap) which is a hash of banned IP addresses, updated according to the data in the first structure.
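A condensed sketch of that arrangement might look like this; the window length, hit limit, and share-file paths are all assumed values, and get_and_set is used so that concurrent FastCGI workers update an IP's timestamp list atomically:

use strict;
use warnings;
use Cache::FastMmap;

# Assumed policy: ban an IP that makes more than 30 hits in 60 seconds.
my $WINDOW   = 60;
my $MAX_HITS = 30;

my $hits = Cache::FastMmap->new(
    share_file  => '/tmp/ratelimit-hits.mmap',
    expire_time => '5m',
);
my $banned = Cache::FastMmap->new(
    share_file  => '/tmp/ratelimit-banned.mmap',
    expire_time => '10m',    # bans lapse when the cache entry expires
);

# Returns true if the request from $ip should be rejected.
sub over_limit {
    my ($ip) = @_;
    return 1 if $banned->get($ip);

    my $now = time;
    # Atomically append this hit and discard timestamps outside the window.
    my $stamps = $hits->get_and_set($ip, sub {
        my (undef, $old) = @_;
        my @recent = grep { $_ > $now - $WINDOW } @{ $old || [] };
        return [ @recent, $now ];
    });

    if (@$stamps > $MAX_HITS) {
        $banned->set($ip, 1);
        return 1;
    }
    return 0;
}

Using get_and_set keeps the read-modify-write of the timestamp list atomic across worker processes, which a plain get/set pair would not.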

OTHER TIPS

I know it's not what you asked, but have you considered handling this elsewhere in the stack, where it's already been done for you? Clearly I don't know your deployment stack, but if it's Apache you could use mod_evasive. Alternatively, if you're on Linux you could let iptables do the job with something like:

# Allow at most 12 concurrent connections per IP
/sbin/iptables -A INPUT -p tcp --dport 80 -m connlimit --connlimit-above 12 -j REJECT --reject-with tcp-reset

More complicated rules are certainly possible.
