Question

I was recently looking into a few JavaScript design patterns and came across memoization. While it looks like a good solution for avoiding recalculation of values, I can see a problem with it. Take, for example, this simple code:

function square(num) {
  // Check with `in` rather than a truthiness test, so cached falsy
  // results (e.g. square(0) === 0) are not recomputed on every call
  if (!(num in square.cache)) {
    console.log("calculating a fresh value...");
    square.cache[num] = num * num;
  }
  return square.cache[num];
}
square.cache = {};

Calling console.log(square(20)) prints "calculating a fresh value..." followed by the result of the calculation, 400; a second call returns the cached 400 without recalculating.

My real question is: what happens when the cache grows so large after subsequent calculations that it takes more time to retrieve a result from the cache than to calculate a fresh value? Is there any solution to this?


Solution

My real question is what happens when the cache grows so large

This is where you would implement a form of garbage collection: items are evicted from the cache according to a cache replacement algorithm (an eviction policy).

For example, following Least Recently Used (LRU), you would record when each entry was last accessed and, once the cache exceeds a fixed size, evict the entries that have not been used recently. (Worth noting: plain object-property lookup is effectively constant time, so in practice the bigger cost of an unbounded cache is usually memory consumption rather than slow retrieval.)
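As a minimal sketch of that idea, here is an LRU-bounded memoized square(). The makeLruSquare name, the Map-based store, and the size limit of 3 are illustrative choices, not part of the original code; a Map is convenient because it remembers insertion order, which gives us "least recently used" for free:

```javascript
function makeLruSquare(maxSize) {
  const cache = new Map(); // Map preserves insertion order
  return function square(num) {
    if (cache.has(num)) {
      // Cache hit: re-insert the entry so it becomes the
      // most recently used key (moves to the end of the Map)
      const value = cache.get(num);
      cache.delete(num);
      cache.set(num, value);
      return value;
    }
    const result = num * num;
    cache.set(num, result);
    if (cache.size > maxSize) {
      // The first key in insertion order is the least
      // recently used entry, so evict it
      cache.delete(cache.keys().next().value);
    }
    return result;
  };
}

const square = makeLruSquare(3);
square(1); square(2); square(3); // cache holds 1, 2, 3
square(1);                       // 1 becomes most recently used
square(4);                       // evicts 2, the least recently used
```

The same wrapper pattern works for any expensive pure function, not just squaring; only the eviction bookkeeping changes between policies such as LRU and LFU.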

Edit

SoundCloud uses an object store, and a very interesting read is this article on how they built their web app.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow