Question

I am of the firm belief that throwing hardware at software problems isn't the best policy. So when I noticed several memory issues on one of our servers (currently running with 2 GB), I tracked them down to the use of System.Web.HttpRuntime.Cache. For a couple of sites this made sense, but with 50 sites all using System.Web.HttpRuntime.Cache, the walls started coming down.

Without the option of an external caching server, I am considering modifying the code to use either static classes or singletons for global data retention (the other option is to make additional DB requests).
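For illustration, the kind of static holder I have in mind is roughly this (a minimal sketch; WebsiteConfiguration, WebsiteConfigurationStore, and LoadWebsiteConfiguration are placeholder names, not our actual code):

    // Minimal sketch of the static-holder idea (all names are placeholders).
    // LoadWebsiteConfiguration() stands in for whatever DB call currently
    // populates the cached object.
    public static class WebsiteConfigurationStore
    {
        private static readonly object Sync = new object();
        private static WebsiteConfiguration _config;

        public static WebsiteConfiguration Current
        {
            get
            {
                if (_config == null)
                {
                    lock (Sync)   // double-checked locking so the load runs once
                    {
                        if (_config == null)
                            _config = LoadWebsiteConfiguration();
                    }
                }
                return _config;
            }
        }

        // Placeholder for the real data-access call.
        private static WebsiteConfiguration LoadWebsiteConfiguration()
        {
            // e.g. query the DB and map the result
            throw new NotImplementedException();
        }
    }

Note that this holds the data until the app pool recycles, much like a NotRemovable cache entry would.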

I'm not entirely sure this will change anything, since the data is still held in memory either way, and we may just need to put more memory in the server.

Is there significantly more overhead in using System.Web.HttpRuntime.Cache than in a singleton or static class, and what are some recommended approaches to this problem?

-- Update --

In monitoring the Current File Cache Memory Usage counter, I noticed this number spike as I hit several sites on the same application pool. It jumped to a little over 1,000,000 (bytes, I am assuming). I notice that it eventually begins to decrease as the number of Active Flushed Entries rises and then falls.

How can I flush this out quicker, as problems seem to start when this number is high across multiple app pools?

Instead of just ripping out the caching (which, as suggested, is probably not the best idea), would simply setting a shorter expiry time on the cached objects give better results?
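One thing I am now looking at: ASP.NET lets you cap the cache's memory use in web.config, which should make it start trimming sooner. A sketch (the limits below are illustrative, not tuned for our servers):

    <!-- web.config: tighten the ASP.NET cache limits so trimming starts
         earlier. privateBytesLimit is in bytes; privateBytesPollTime
         controls how often memory usage is checked. -->
    <system.web>
      <caching>
        <cache privateBytesLimit="157286400"
               percentagePhysicalMemoryUsedLimit="50"
               privateBytesPollTime="00:01:00" />
      </caching>
    </system.web>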

Solution

What's it going to cost to change the code vs buy more memory?

I'd say a Windows web server with just 2 GB of RAM is under-specced.

You need to look at how long items stay in the cache, how frequently they are used, and so on. If you expire them too soon, you potentially push the bottleneck further down the stack, e.g. onto the filesystem or the DB, both of which are slower than RAM.

I'd add more RAM as a first step, since it's the cheapest option, and then track down the performance issue to see whether there are optimisations to be had.

OTHER TIPS

Looking through the code on these sites, I found the following:

    HttpRuntime.Cache.Insert(
        /* key */                "WebsiteConfiguration",
        /* value */              website,
        /* dependencies */       null,
        /* absoluteExpiration */ Cache.NoAbsoluteExpiration,
        /* slidingExpiration */  Cache.NoSlidingExpiration,
        /* priority */           CacheItemPriority.NotRemovable,
        /* onRemoveCallback */   null);

I think a major issue may be the combination of NoSlidingExpiration and NotRemovable: the entries never expire, and NotRemovable tells the cache scavenger it may not evict them under memory pressure, so every site's configuration stays pinned in memory for the life of the app pool.

Now, setting a 30-second sliding expiration instead may resolve the issue:

    if (System.Web.HttpRuntime.Cache["WebsiteConfiguration"] == null)
    {
        // ... (rebuild the website configuration, then re-cache it)

        HttpRuntime.Cache.Insert(
            /* key */                "WebsiteConfiguration",
            /* value */              website,
            /* dependencies */       null,
            /* absoluteExpiration */ Cache.NoAbsoluteExpiration,
            /* slidingExpiration */  new TimeSpan(0, 0, 30),  // a zero or too-short timespan leads to a null reference exception when the expired entry is read
            /* priority */           CacheItemPriority.Normal,
            /* onRemoveCallback */   null);
    }

... I have yet to confirm this.
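A caveat on the null check above: the entry can also expire between the check and the subsequent read, which is presumably where the null reference exception mentioned in the comment comes from. Reading the entry once into a local sidesteps that. A sketch, with GetWebsiteConfiguration and LoadWebsiteConfiguration as hypothetical stand-ins for the sites' actual code:

    // Read the cached entry once into a local; checking Cache["key"] for
    // null and then reading it again can race with expiration.
    private static WebsiteConfiguration GetWebsiteConfiguration()
    {
        var config = (WebsiteConfiguration)HttpRuntime.Cache["WebsiteConfiguration"];
        if (config == null)
        {
            config = LoadWebsiteConfiguration();   // hypothetical DB load
            HttpRuntime.Cache.Insert(
                "WebsiteConfiguration",
                config,
                null,
                Cache.NoAbsoluteExpiration,
                new TimeSpan(0, 0, 30),            // 30-second sliding window
                CacheItemPriority.Normal,
                null);
        }
        return config;
    }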

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow