Problem

Let's suppose we have 10K requests per second hitting our PHP script.

Every request checks the cache in memcached (or any other cache storage). If the cache entry is found, everything is fine and the cached value is returned. If it is not found, we run a slow SQL query to fill the cache. This is the most common and simple caching scheme:

$result = $this->loadFromCache($key);
if (empty($result)) {
    $result = $this->makeSlowSqlQuery();
    $this->writeToCache($key, $result);
}
//do something with $result;
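
The loadFromCache / writeToCache helpers above are assumed rather than shown; a minimal sketch of what they might wrap, using the php-memcached extension (the connection setup and 5-minute TTL are just illustrative):

private Memcached $memcached; // assumed to be set up in the constructor

private function loadFromCache(string $key)
{
    $value = $this->memcached->get($key);
    // Memcached::get() returns false on a cache miss
    return $value === false ? null : $value;
}

private function writeToCache(string $key, $value, int $ttl = 300): void
{
    $this->memcached->set($key, $value, $ttl);
}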

This scheme works well until the request volume gets too high. Once it does, a large number of requests will find nothing in the cache and will all try to refill it, so all of them start executing the slow SQL query, which causes a heavy load spike. What is the solution?

As a possible solution I see the following scenario: the first request that finds the cache invalid creates some trigger saying that a cache refill has already started, and every other request either waits for the new cache value or uses the older (previous) version.
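
For the "use the older version" variant, one common trick is to store a soft expiry timestamp next to the payload, so a stale entry can still be served while a single request refreshes it. A rough sketch of that idea (the 60-second soft TTL and the tryAcquireRefreshLock / releaseRefreshLock helpers are hypothetical):

$entry = $this->loadFromCache($key); // ['data' => ..., 'softExpire' => timestamp]
if (empty($entry)) {
    // nothing cached at all: no choice but to run the slow query
    $entry = ['data' => $this->makeSlowSqlQuery(), 'softExpire' => time() + 60];
    $this->writeToCache($key, $entry);
} elseif ($entry['softExpire'] < time()) {
    // stale entry: one request refreshes, everyone else keeps serving the old data
    if ($this->tryAcquireRefreshLock($key)) {
        $entry['data'] = $this->makeSlowSqlQuery();
        $entry['softExpire'] = time() + 60;
        $this->writeToCache($key, $entry);
        $this->releaseRefreshLock($key);
    }
}
$result = $entry['data'];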

How do you solve similar problems?

Solution

What you essentially want is a lock pattern:

$lockPrefix = "!lock__";
$result = $this->loadFromCache($key);
if (empty($result)) {
     $sleepLimit = 2000; // 2s timeout
     $sleepCount = 0;
     $cacheBlocked = 0;
     while ($this->loadFromCache($lockPrefix . $key) == 1) {
         // signal that something else is updating the cache
         $cacheBlocked = 1;
         // sleep for 1ms
         usleep(1000);
         // timeout logic...
         $sleepCount++
         if ($sleepCount == $sleepLimit) {
             die("Cache read timeout.");
         }
     }
     if ($cacheBlocked == 1) {
         // something else updated the cache while we were waiting
         // so we can just read that result now
         $result = $this->loadFromCache($key);
     } else {
         $this->writeToCache($lockPrefix . $key, 1); // lock
         $result = $this->makeSlowSqlQuery();
         $this->writeToCache($key, $result);
         $this->writeToCache($lockPrefix . $key, 0); // release
     }
}

The idea is that the cache is global, so it can be used to hold a lock across requests. You're essentially creating a mutex around the cache entry, with a bit of added logic to ensure that only one slow query is kicked off.
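
One caveat: with plain read-then-write calls, two requests can both see the lock key missing and both start the slow query. If the backend is memcached, Memcached::add() is atomic and fails when the key already exists, so it can make the lock acquisition race-free. A minimal sketch, assuming a connected Memcached instance and illustrative TTLs (the lock TTL keeps a crashed worker from blocking everyone forever):

$lockKey = '!lock__' . $key;

$result = $memcached->get($key);
if ($result === false) {
    if ($memcached->add($lockKey, 1, 30)) {
        // add() succeeds for exactly one request: that one does the refill
        $result = $this->makeSlowSqlQuery();
        $memcached->set($key, $result, 300);
        $memcached->delete($lockKey);
    } else {
        // someone else is refilling: poll briefly for the fresh value
        for ($i = 0; $i < 2000 && $result === false; $i++) {
            usleep(1000); // 1ms
            $result = $memcached->get($key);
        }
    }
}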
