Question

I am a newbie with PHP and therefore this is more of a conceptual question or maybe even a question about 'best practices'.

Often, I see websites with stats drawn from their database. For example, let's say it is a sales lead website. It may have stats at the top of the page like:

NEW SALES LEADS YESTERDAY: 123

NEW SALES LEADS THIS MONTH: 556

NEW SALES LEADS THIS YEAR: 3870

Obviously, these figures should not be calculated every time the page is displayed, right? That could put a large burden on the server. How do people cache this type of data? Any best practices? I thought about writing a cron job that would calculate the figures once a day and insert them into a database table. What are your ideas? Thank you!


Solution

You can calculate the value once and store it in XCache. There doesn't seem to be a need for a cron job here: the query runs once, and the result is stored in XCache. The important thing is to set the expiration time of the stored value according to your use case. For example, for daily stats like the ones above, an expiration time of a few hours is fine; for data that changes every minute, set it to a few minutes instead.

Something like this:

if (xcache_isset("newSalesLeadYest")) {
    // Cached value is still fresh; use it directly.
    $newSalesLeadYest = xcache_get("newSalesLeadYest");
} else {
    // Cache miss: run the query and store the result for X seconds.
    $newSalesLeadYest = runQueryToFetchStat();
    xcache_set("newSalesLeadYest", $newSalesLeadYest, X);
}

OTHER TIPS

What you need is a caching strategy. Some factors to help you decide:

  • How frequently does the data change?
  • How important is it that the values are current - is it OK if they are 1 minute, 1 hour, or 1 day old?
  • How expensive, time-wise, is it to load fresh data?
  • How much traffic are you getting? Tens, hundreds, millions of requests?

There are a few ways you can achieve the result.

  • You can use something like memcached to persist the data so it isn't regenerated on every request (see the sketch after this list).
  • You can use HTTP caching and load the data client side using JavaScript from an API.
  • You can have a background worker (e.g. run by cron) which generates the latest figures and persists them to a lookup table in the database.
  • You could improve the queries and indexes so that fetching live data is fast enough to do on every request.
  • You could alter your database schema so that you have more static data.
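
For example, here is a minimal sketch of the memcached option, assuming the Memcached PECL extension is installed, a memcached daemon on 127.0.0.1:11211, and a hypothetical fetchLeadStatsFromDb() helper that runs the three COUNT queries:

$cache = new Memcached();
$cache->addServer('127.0.0.1', 11211);

$stats = $cache->get('lead_stats');
if ($stats === false) {
    // Cache miss (or expired entry): recompute the stats and keep them for an hour.
    $stats = fetchLeadStatsFromDb();
    $cache->set('lead_stats', $stats, 3600);
}

Unlike an in-process cache such as XCache, memcached runs as a separate daemon, so several web servers can share the same cached values.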

From the three examples you gave, three simple counts should not be expensive enough to warrant a complex caching system. If you can paste the SQL queries, we can help optimise them.
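
For illustration only, here is a guess at what those queries might look like, assuming a MySQL `leads` table with an indexed `created_at` datetime column and an existing PDO connection in $pdo; adjust the names to your actual schema:

$yesterday = $pdo->query(
    "SELECT COUNT(*) FROM leads
     WHERE created_at >= CURDATE() - INTERVAL 1 DAY
       AND created_at <  CURDATE()"
)->fetchColumn();

$thisMonth = $pdo->query(
    "SELECT COUNT(*) FROM leads
     WHERE created_at >= DATE_FORMAT(CURDATE(), '%Y-%m-01')"
)->fetchColumn();

$thisYear = $pdo->query(
    "SELECT COUNT(*) FROM leads
     WHERE created_at >= MAKEDATE(YEAR(CURDATE()), 1)"
)->fetchColumn();

With an index on created_at, each of these is a small range scan and will usually be cheap enough to run on every request.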

The data sounds like it will only get updated once per day, so a simple nightly cron "flatten" query would be a nice fit.
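
As a rough sketch of that approach, assuming the same hypothetical `leads` table plus a `daily_stats` lookup table you create for the purpose (all names and credentials below are placeholders):

<?php
// stats_flatten.php - nightly job: count yesterday's leads and store the result.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'db_user', 'db_password');

$count = $pdo->query(
    "SELECT COUNT(*) FROM leads
     WHERE created_at >= CURDATE() - INTERVAL 1 DAY
       AND created_at <  CURDATE()"
)->fetchColumn();

$stmt = $pdo->prepare(
    "INSERT INTO daily_stats (stat_date, new_leads)
     VALUES (CURDATE() - INTERVAL 1 DAY, ?)"
);
$stmt->execute([$count]);

Run it shortly after midnight with a crontab entry such as 5 0 * * * php /path/to/stats_flatten.php; the page then reads (and sums) the pre-computed rows instead of counting the leads table on every hit.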

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow