Header redirects based on user-agent detection can get out of control quickly. Quite a few of the sites I've inherited over the years have massive .htaccess files that slow down every request. For future-proofing, start with an array of patterns to check against (essentially a blacklist). I usually store mine in a database (a database-backed variant is sketched after the example below). Here's an example that handles both user agents and IP addresses with the same function.
<?php
// User agent blacklist patterns (PCRE); '!^$!' catches an empty user agent
$uab = array(
    '!^$!',
    '!CaptiveNetworkSupport!',
    '!curl!i',
);
// Header to send for each matching user agent pattern, matched by index
$uah = array(
    'HTTP/1.0 400 Bad Request',
    'HTTP/1.0 400 Bad Request',
    'HTTP/1.0 403 Forbidden',
);
// IP address blacklist patterns
$ipb = array(
    '!^1\.2\.3!',
    '!^8\.8\.8\.8!',
    '!^199\.21!',
    '!^0\.0\.0\.0$!',
);
// Header to send for each matching IP pattern, matched by index
$iph = array(
    'Location: /no-bots.php',
    'Location: /warning.php',
    'HTTP/1.0 403 Forbidden',
    'HTTP/1.0 403 Forbidden',
);

// Check $means against each pattern; on the first match, send the
// corresponding header and stop the request.
function blackList($blacklist, $handler, $means){
    foreach ($blacklist as $key => $bl) {
        if (preg_match($bl, $means)) {
            header($handler[$key]);
            exit();
        }
    }
}

// HTTP_USER_AGENT may be missing entirely; fall back to an empty
// string so the '!^$!' pattern still catches it
if (!empty($_SERVER['HTTP_USER_AGENT'])) {
    $user_agent = $_SERVER['HTTP_USER_AGENT'];
} else {
    $user_agent = '';
}
blackList($uab, $uah, $user_agent);

// REMOTE_ADDR should always be set, but fall back to a blacklisted
// placeholder just in case
if (!empty($_SERVER['REMOTE_ADDR'])) {
    $ip_address = $_SERVER['REMOTE_ADDR'];
} else {
    $ip_address = '0.0.0.0';
}
blackList($ipb, $iph, $ip_address);
?>
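Since I mentioned keeping the lists in a database, here's a minimal sketch of that variant. It assumes a hypothetical blacklist table with pattern, handler, and type columns and reuses the blackList() function above; the DSN, credentials, and schema are placeholders you'd adapt to your own setup.

<?php
// Sketch: load pattern => handler pairs from a hypothetical `blacklist`
// table instead of hard-coding the arrays. Adjust DSN/credentials/schema.
$pdo = new PDO('mysql:host=localhost;dbname=cms', 'user', 'pass');

function loadBlacklist(PDO $pdo, $type){
    // $type distinguishes the lists, e.g. 'ua' or 'ip'
    $stmt = $pdo->prepare('SELECT pattern, handler FROM blacklist WHERE type = ?');
    $stmt->execute(array($type));
    $patterns = array();
    $handlers = array();
    foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
        $patterns[] = $row['pattern'];
        $handlers[] = $row['handler'];
    }
    return array($patterns, $handlers);
}

// Same parallel-array shape as above, so blackList() works unchanged
list($uab, $uah) = loadBlacklist($pdo, 'ua');
blackList($uab, $uah, $user_agent);
?>

The upside of the database approach is that a ban can be added or lifted from the backend without touching code or .htaccess.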
In my custom CMS I have a ban page that lets a backend user decide how an offender gets treated. It's useful for blocking all sorts of abuse and works best in conjunction with iptables.
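If you want the PHP side to feed iptables, one approach (the log path and the cron step are assumptions on my part, not a prescribed setup) is to record the offender's IP somewhere a privileged job can pick it up:

<?php
// Sketch: append the offending IP to a log that a root cron job reads and
// turns into firewall rules (e.g. `iptables -A INPUT -s <ip> -j DROP`).
// The log path is an assumption; the web server user needs write access.
function logOffender($ip_address){
    $line = date('c') . ' ' . $ip_address . "\n";
    file_put_contents('/var/log/cms-banned-ips.log', $line, FILE_APPEND | LOCK_EX);
}
?>

Dropping repeat offenders at the firewall means PHP never has to run for them at all, which is the whole point of getting this logic out of .htaccess.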