My question is, how could I restructure this to not cause blocking? ... rather not introduce [asynchrony]
You're between a rock and a hard place, there. Any request has to block or asynchronously wait for the process to complete; there's no other option, unless you can use something like SignalR to send the process results to the client side (but that would probably require significant architectural changes).
That said, you can certainly minimize the effect of the lock. As written, it blocks reads for every user while any one user's calculation is running.
I'm assuming that this calculation is pure (causes no side effects), and that the cache is an in-process, in-memory cache.
In that case, I would cache the task instead of the result. While I'm not wild about parallel processing on ASP.NET, I suppose this would be OK.
I do recommend you use the ASP.NET cache; ConcurrentDictionary has similar get-or-add logic, but no easy way to flush old entries.
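For comparison, here is a sketch of that ConcurrentDictionary approach (type names and the Func parameter are illustrative, and I'm using int as a stand-in result type). Wrapping the task in a Lazy&lt;T&gt; ensures the expensive delegate starts at most once per key, even when two requests race on the same user:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

static class TaskCache
{
    // Illustrative only: note there is no expiration, so entries live forever.
    private static readonly ConcurrentDictionary<int, Lazy<Task<int>>> cache =
        new ConcurrentDictionary<int, Lazy<Task<int>>>();

    public static Task<int> GetOrAdd(int userId, Func<int, int> expensive)
    {
        // GetOrAdd may invoke the value factory more than once under a race,
        // but Lazy guarantees the inner delegate runs at most once.
        return cache.GetOrAdd(
            userId,
            id => new Lazy<Task<int>>(() => Task.Run(() => expensive(id)))).Value;
    }
}
```

Every caller for the same key gets back the same in-flight task, which is exactly the property we want.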
So, something like this:
// In Page_Load
CacheManager.GetOrAdd(UserID);
public static class CacheManager
{
    private static readonly object locker = new object();

    public static Task<Results> GetOrAdd(int userId)
    {
        lock (locker)
        {
            if (!OurCache.Contains(userId))
            {
                // Start the expensive work on a thread-pool thread and
                // cache the in-flight task, not its eventual result.
                var task = Task.Run(() => ReallyExpensiveMethod(userId));
                OurCache.Add(userId, task);
                return task;
            }

            return OurCache.Get(userId);
        }
    }
}
// Usage:
Results results = CacheManager.GetOrAdd(UserID).Result;
I'm not wild about the blocking (calling Task&lt;T&gt;.Result on the last line), but since you don't want asynchronous requests you're stuck with that kind of hack. (At least blocking on a Task.Run task like this doesn't risk the classic ASP.NET deadlock, since the delegate never awaits anything that captures the request context.)
This code minimizes the time the lock is held. Instead of locking it for the duration of the processing, it is only locked long enough to start the processing on another thread and update the cache.
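To show why I'd lean on the ASP.NET cache for expiration, here's a lockless sketch of the same pattern against System.Runtime.Caching.MemoryCache (assuming that assembly is available; the key prefix and five-minute expiration are illustrative choices, not requirements). AddOrGetExisting is atomic, so no explicit lock is needed:

```csharp
using System;
using System.Runtime.Caching;
using System.Threading.Tasks;

static class ExpiringTaskCache
{
    // Sketch: cache a Lazy<Task> so the expensive work starts at most
    // once per key, and let MemoryCache handle expiration for us.
    public static Task<int> GetOrAdd(int userId, Func<int, int> expensive)
    {
        string key = "results:" + userId;
        var fresh = new Lazy<Task<int>>(
            () => Task.Run(() => expensive(userId)));

        // AddOrGetExisting returns null when our entry was inserted,
        // or the existing entry when another request beat us to it.
        var existing = (Lazy<Task<int>>)MemoryCache.Default.AddOrGetExisting(
            key, fresh, DateTimeOffset.UtcNow.AddMinutes(5));

        return (existing ?? fresh).Value;
    }
}
```

After the entry expires, the next request simply starts a fresh calculation, which is the "flush old entries" behavior ConcurrentDictionary doesn't give you for free.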