Question

I have link detection on my site that turns links entered by users into anchors, but I want to avoid converting junk links that don't resolve to a real domain, so I built the following:

public function tLink($s){
    $domain = preg_replace('/(http|ftp)+(s)?:(\/\/)((\w|\.)+)(\/)?(\S+)?/i', '\4', $s);
    getmxrr($domain,$result);
    if(!empty($result)){
        return preg_replace('/(http|ftp)+(s)?:(\/\/)((\w|\.)+)(\/)?(\S+)?/i', '<a href="\0" title="\0">\4</a>', $s);
    }
    return $s;
}

But this makes my pages load really slowly — they now take anywhere from 2 to 5 seconds (they used to load instantly). Is there a better method I can use?

Solution

You can cache the output of getmxrr so you're not repeating the same lookup for domains you have already validated.

Assuming you have memcached installed and configured, you can replace your lookup with this function:

function domain_found($domain) {
    $memcache_obj = new Memcache;
    $memcache_obj->connect('localhost', 11211);

    // Return the cached verdict if we already have one for this domain.
    $var = $memcache_obj->get($domain);
    if ($var === 'found') return true;
    if ($var === 'notfound') return false;

    // Cache miss: do the DNS lookup once and remember the result.
    // (Note: Memcache has no put() method; set() is the correct call.)
    getmxrr($domain, $result);
    if (empty($result)) {
        $memcache_obj->set($domain, 'notfound');
        return false;
    } else {
        $memcache_obj->set($domain, 'found');
        return true;
    }
}

You can of course wrap this in a class if you like, and tune other values such as cache expiry times. This serves as proof-of-concept code.
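To tie this back into the original tLink(), a sketch like the following avoids running the URL regex twice by using preg_replace_callback. The $isValid callable parameter is a hypothetical addition (not in the original code) so you can plug in domain_found() — or a stub during testing — as the domain check:

```php
<?php
// Sketch: one regex pass over the text; the domain check is injected as a
// callable (e.g. the memcache-backed domain_found() above).
function tLinkCached(string $s, callable $isValid): string {
    return preg_replace_callback(
        '/(https?|ftps?):\/\/([\w.-]+)(\/\S*)?/i',
        function (array $m) use ($isValid) {
            // $m[0] is the full URL, $m[2] is the domain part.
            if (!$isValid($m[2])) {
                return $m[0]; // unknown domain: leave it as plain text
            }
            return '<a href="' . $m[0] . '" title="' . $m[0] . '">' . $m[2] . '</a>';
        },
        $s
    );
}
```

In production you would call it as `tLinkCached($s, 'domain_found')`; in a test you can pass `fn($d) => true` to skip the network lookup entirely.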

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow