Question

We have some logic to calculate an expensive value per user for our ASP.NET Web Forms application. Currently it sits in the Page_Load of a header user control that is on every page, like this:

//note that we are not awaiting this
Task.Factory.StartNew(() => CacheManager.GetExpensiveValue(UserId));

And then in the static CacheManager.GetExpensiveValue(int userID):

private static readonly object locker = new object();

public static object GetExpensiveValue(int userID)
{
    lock (locker)
    {
        if (!AlreadyDone(userID))
        {
            var expensiveValue = ReallyExpensiveMethod(userID);
            //our static cache wrapper class that uses an ObjectCache object
            OurCache.Add(userID, expensiveValue);
            return expensiveValue;
        }
        else
        {
            return OurCache.Get(userID);
        }
    }
}

This works, but when ReallyExpensiveMethod() takes a REALLY long time (I'm also working on improving the performance of the logic behind that), users will block on that lock when navigating between pages.

My question is, how could I restructure this to not cause blocking? I've thought about utilizing a ConcurrentDictionary with the values in the dictionary being Task wrappers for ReallyExpensiveMethod() and the keys being the UserID to prevent duplicate work, but I'm unsure if that would actually get me anywhere.
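Something along these lines is what I had in mind. This is only a rough sketch, not code we have: the inFlight field, the GetExpensiveValueAsync name, and the object result type are all placeholders, and it would sit alongside the existing code in CacheManager.

// needs: using System.Collections.Concurrent; using System.Threading.Tasks;

// Sketch: one task per user, keyed by UserID, to avoid duplicate work.
private static readonly ConcurrentDictionary<int, Task<object>> inFlight =
    new ConcurrentDictionary<int, Task<object>>();

public static Task<object> GetExpensiveValueAsync(int userID)
{
    // GetOrAdd stores at most one task per user, so repeated page loads
    // reuse the in-flight calculation instead of starting it again.
    // (Under a race the value factory can run twice, though only one task
    // is kept; wrapping the value in Lazy<Task<T>> would close that gap.)
    return inFlight.GetOrAdd(userID,
        id => Task.Run(() => (object)ReallyExpensiveMethod(id)));
}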

We are not currently using any asynchronous logic in this app, and I'm sure the powers that be would rather not introduce a change that requires adding Async="true" to every single page in the app, since this header logic is on every page.


Solution

My question is, how could I restructure this to not cause blocking? ... rather not introduce [asynchrony]

You're between a rock and a hard place, there. Any request has to block or asynchronously wait for the process to complete; there's no other option, unless you can use something like SignalR to send the process results to the client side (but that would probably require significant architectural changes).

That said, you can certainly minimize the effect of the lock; as written, it blocks cache reads for every other user while one user's calculation is running.

I'm assuming that this calculation is pure (causes no side effects), and that the cache is an in-process, in-memory cache.

In that case, I would cache the task instead of the result. While I'm not wild about parallel processing on ASP.NET, I suppose this would be OK.

I do recommend you stick with the cache. A ConcurrentDictionary gives you similar get-or-add logic, but no easy way to flush old entries.
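For example, if OurCache sits on top of a MemoryCache (an ObjectCache), each entry can be given an expiration when it's added, so stale per-user values age out on their own. A minimal sketch, assuming the wrapper exposes something like the method below; the one-hour policy is only illustrative:

// needs: using System; using System.Runtime.Caching;
public static void AddWithExpiration(int userId, object value)
{
    MemoryCache.Default.Add(
        userId.ToString(),                 // ObjectCache keys are strings
        value,
        new CacheItemPolicy
        {
            // The entry is dropped an hour after it was added.
            AbsoluteExpiration = DateTimeOffset.Now.AddHours(1)
        });
}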

So, something like this:

// In Page_Load
CacheManager.GetOrAdd(UserID);

// In CacheManager:
public static Task<Results> GetOrAdd(int userId)
{
  lock (locker)
  {
    if (!OurCache.Contains(userId))
    {
      // Start the work on a thread-pool thread and cache the task itself,
      // not the finished value.
      var task = Task.Run(() => ReallyExpensiveMethod(userId));
      OurCache.Add(userId, task);
      return task;
    }
    else
    {
      return OurCache.Get(userId);
    }
  }
}

// Usage:
Results results = CacheManager.GetOrAdd(UserID).Result;

I'm not wild about the blocking (calling Task<T>.Result on the last line), but since you don't want to do asynchronous requests you're stuck with that kind of hack.
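For reference, if asynchrony ever does become acceptable, the cached task composes naturally with Web Forms' async support. A sketch only, assuming Async="true" is added to the page directive and that Results and GetOrAdd are the placeholders from above:

// <%@ Page Async="true" ... %> would be required on the page.
protected void Page_Load(object sender, EventArgs e)
{
    RegisterAsyncTask(new PageAsyncTask(async () =>
    {
        // Await the cached task instead of blocking on .Result,
        // which frees the request thread while ReallyExpensiveMethod runs.
        Results results = await CacheManager.GetOrAdd(UserID);
        // ... use results ...
    }));
}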

This code minimizes the time the lock is held. Instead of holding it for the duration of the processing, it is held only long enough to start the work on another thread and update the cache.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow