Question

When using HttpRuntime.Cache in an ASP.NET application, any item retrieved from the cache and then modified also modifies the cached object, because the cache hands back a reference rather than a copy. Subsequent reads from the cache then see the modified value, which may not be desirable.
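A minimal sketch of what I mean (ProductDto is a made-up DTO for illustration):

using System.Web;

public class ProductDto
{
    public string Name { get; set; }
}

void DemonstrateProblem()
{
    var product = new ProductDto { Name = "Widget" };
    HttpRuntime.Cache.Insert("product:1", product);

    // Later, somewhere else in the application:
    var fromCache = (ProductDto)HttpRuntime.Cache["product:1"];
    fromCache.Name = "Changed";   // mutates the cached instance itself

    // Every subsequent read now sees "Changed", even though nothing was
    // explicitly written back to the cache.
    var again = (ProductDto)HttpRuntime.Cache["product:1"];
    // again.Name == "Changed"
}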

There are multiple posts on this subject, for example:

Read HttpRuntime.Cache item as read-only

And the suggested solution is to create a deep-copy clone using binary serialization.
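That clone usually looks something like this (the DTO and everything it references must be marked [Serializable]):

using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

static T DeepClone<T>(T source)
{
    var formatter = new BinaryFormatter();
    using (var stream = new MemoryStream())
    {
        // Round-trip the object graph through a memory stream to get a
        // fully detached copy.
        formatter.Serialize(stream, source);
        stream.Position = 0;
        return (T)formatter.Deserialize(stream);
    }
}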

The problem with binary serialization is that it's slow (incredibly slow), and I can't afford any potential performance bottlenecks. I have looked at deep-copying using reflection, and whilst this appears to perform better, it's not trivial to implement with our complex DTOs. Anyone interested in this may want to have a look at the following brief article:

Fast Deep Cloning
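To give a sense of why the reflection route is non-trivial: a naive version only copies top-level writable properties, so every nested object and collection still needs its own handling (this sketch is illustrative only, not what the article describes):

static T NaiveReflectionCopy<T>(T source) where T : new()
{
    var copy = new T();
    foreach (var prop in typeof(T).GetProperties())
    {
        if (prop.CanRead && prop.CanWrite)
        {
            // Reference-typed property values are still shared with the
            // source; a real deep clone has to recurse into them.
            prop.SetValue(copy, prop.GetValue(source, null), null);
        }
    }
    return copy;
}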

Does anyone have any experience with caching solutions such as AppFabric / NCache etc. and know whether they would solve this problem directly?

Thanks in advance

Griff


Solution

Products like NCache and AppFabric also perform serialization before storing the object in an out-of-process caching service. So you'd still take that serialization hit, plus you'd get slowed down even further by going out-of-process (or maybe even over the network) to access the serialized object in the caching service.

Implementing ICloneable on your classes to perform hand-tuned deep copies will avoid reflection and will outperform binary serialization, but this may not be practical if your DTOs are very complex.
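For illustration (the DTO shapes are invented), a hand-tuned clone that copies nested objects explicitly:

using System;
using System.Collections.Generic;
using System.Linq;

public class OrderLineDto : ICloneable
{
    public string Sku { get; set; }
    public int Quantity { get; set; }

    // Leaf object with only value-type/string members, so a memberwise
    // copy is already a deep copy here.
    public object Clone()
    {
        return MemberwiseClone();
    }
}

public class OrderDto : ICloneable
{
    public string Reference { get; set; }
    public List<OrderLineDto> Lines { get; set; }

    public object Clone()
    {
        return new OrderDto
        {
            Reference = Reference,
            // Clone the nested objects too; otherwise the copy still
            // shares references with the cached instance.
            Lines = Lines == null
                ? null
                : Lines.Select(l => (OrderLineDto)l.Clone()).ToList()
        };
    }
}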

Updated to provide specifics:

AppFabric uses the NetDataContractSerializer for serialization (as described here). The NetDataContractSerializer can be a little faster than the BinaryFormatter, but its performance is usually in the same ballpark: http://blogs.msdn.com/b/youssefm/archive/2009/07/10/comparing-the-performance-of-net-serializers.aspx

NCache rolled their own serializer, called "Compact Serialization". You either implement their ICompactSerializable interface on your DTO classes and read/write all members by hand, or let their client libraries examine your classes and emit their own serialization code at runtime to do that work for you (a one-time hit when your app starts up, where they reflect over your classes and emit their own MSIL). I don't have data on their performance, but it's safe to assume it's faster than serializers that rely on reflection (BinaryFormatter/DataContractSerializer) and probably in the same performance realm as protobuf, MessagePack, and other serializers that avoid excessive reflection. More detail is here.

(I work for a company (ScaleOut Software) that's in the same space as NCache, so I should probably know more about how they do things. ScaleOut lets you plug in whatever serializer you want--we usually recommend Protobuf-net or MessagePack, since they're generally considered to be the reigning champions for .NET serialization performance--definitely take a close look at those two if you decide to use a serializer to make your deep copies.)
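As a concrete example of that approach, protobuf-net exposes a Serializer.DeepClone helper that does the serialize/deserialize round trip in memory for you; a minimal sketch (the DTO here is made up):

using ProtoBuf;

[ProtoContract]
public class CustomerDto
{
    [ProtoMember(1)] public int Id { get; set; }
    [ProtoMember(2)] public string Name { get; set; }
}

// DeepClone round-trips the object through protobuf-net in memory and
// hands back a fully detached copy.
var original = new CustomerDto { Id = 1, Name = "Widget Corp" };
var copy = Serializer.DeepClone(original);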

OTHER TIPS

Most cache frameworks rely on serialization.

You should consider invalidating the cache each time you change an object.

For example:

object Get(string key)
{
    // System.Web.Caching.Cache has no Contains method; a null check
    // covers both missing and expired entries.
    var cached = HttpRuntime.Cache[key];
    if (cached == null)
    {
        cached = GetDataFromDatabase(key);
        HttpRuntime.Cache[key] = cached;
    }
    return cached;
}

void Invalidate(string key)
{
    HttpRuntime.Cache.Remove(key);
}

So you can do:

var myDto = (MyDto)Get(id);   // MyDto is whatever type Get actually returns
myDto.SomeProperty = "changed";
myDto.Save();
Invalidate(id);   // remove the stale entry so the next Get reloads it
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow