Question

I added a three-line snippet at the top of the index page to eliminate the bad requests that occur on iMac devices. The code is given below:

<?php
if (preg_match ("/CaptiveNetworkSupport/", $_SERVER["HTTP_USER_AGENT"])) {
header ("HTTP/1.0 400 Bad Request");
exit ();
}
?>

But now I see lots of "PHP Notice: Undefined index: HTTP_USER_AGENT in " entries in the error_log. How can I get rid of these notices?

Thanks!


Solution

Test whether the index exists before calling preg_match:

<?php
    if (isset($_SERVER["HTTP_USER_AGENT"]) && preg_match("/CaptiveNetworkSupport/", $_SERVER["HTTP_USER_AGENT"])) {
        header("HTTP/1.0 400 Bad Request");
        exit();
    }
?>

Other tips

You could make sure it's set first:

<?php
    if (isset($_SERVER["HTTP_USER_AGENT"]) && preg_match ("/CaptiveNetworkSupport/", $_SERVER["HTTP_USER_AGENT"])) {
        header ("HTTP/1.0 400 Bad Request");
        exit ();
    }
?>

Header redirects based on user-agent detection can get out of control quickly. Quite a few of the sites I've inherited over the years have massive .htaccess files that slow down server connections. For future-proofing, you might start with an array of patterns to check against (a sort of blacklist). I usually store mine in a database. Here's an example that handles both user agents and IP addresses with the same function.

<?php
// User agent blacklist: patterns to block
$uab = array(
    '!^$!',                       // empty user agent
    '!CaptiveNetworkSupport!',
    '!curl!i'
);

// Responses for blocked user agents, index-aligned with $uab
$uah = array(
    'HTTP/1.0 400 Bad Request',
    'HTTP/1.0 400 Bad Request',
    'HTTP/1.0 403 Forbidden'
);

// IP address blacklist
$ipb = array(
    '!^1\.2\.3!',
    '!^8\.8\.8\.8!',
    '!^199\.21!',
    '!^0\.0\.0\.0$!'
);

// Responses for blocked IP addresses, index-aligned with $ipb
$iph = array(
    'Location: /no-bots.php',
    'Location: /warning.php',
    'HTTP/1.0 403 Forbidden',
    'HTTP/1.0 403 Forbidden'
);

// Send the header paired with the first pattern that matches $means
function blackList($blacklist, $handler, $means) {
    foreach ($blacklist as $key => $bl) {
        if (preg_match($bl, $means)) {
            header($handler[$key]);
            exit();
        }
    }
}

$user_agent = !empty($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
blackList($uab, $uah, $user_agent);

// REMOTE_ADDR should always be set, but guard anyway
$ip_address = !empty($_SERVER['REMOTE_ADDR']) ? $_SERVER['REMOTE_ADDR'] : '0.0.0.0';
blackList($ipb, $iph, $ip_address);
?>

In my custom CMS I have a ban page that gives a backend user a way to decide how an offender is treated. It's useful for blocking all sorts of things and works best in conjunction with iptables.
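For reference, a firewall-level ban of the kind mentioned above might look like this with iptables (the address here is a documentation-range placeholder, not from the original post):

```shell
# Drop all traffic from a banned address (requires root)
iptables -A INPUT -s 203.0.113.7 -j DROP

# List the INPUT chain to verify the rule was added
iptables -L INPUT -n --line-numbers
```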

Use isset($_SERVER['HTTP_USER_AGENT']) or array_key_exists('HTTP_USER_AGENT', $_SERVER) to check that the key HTTP_USER_AGENT even exists.

When you refer to a key that doesn't exist, you get a notice. You can suppress notices globally in the configuration, but that is very bad practice.

You can also suppress all errors for a single expression with the @ operator, e.g. @$_SERVER['HTTP_USER_AGENT'].

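Putting the pieces together, a minimal sketch of the guarded check (using the same regex and header as the original snippet; on PHP 7+ the null coalescing operator `??` is a tidy alternative to isset):

```php
<?php
// PHP 7+: default to '' with no notice if the key is missing
$agent = $_SERVER['HTTP_USER_AGENT'] ?? '';

// Equivalent on older PHP versions:
// $agent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

if (preg_match('/CaptiveNetworkSupport/', $agent)) {
    header('HTTP/1.0 400 Bad Request');
    exit();
}
```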

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow