Problem

Possible Duplicate:
LRU cache design

I got this question in a programming interview. Feel free to think about how you might answer it.

How would you implement an LRU (least-recently-used) cache in C++? Basically, the cache can hold up to N items. If a new item is inserted and the number of items in the cache is less than N, it's just inserted. But if a new item is inserted and the number of items in the cache is already N, the item that's least recently used should be removed from the cache.

Think about what running time each of your operations would take.


Solution

I would give each cache element a member that stores its last access time. When an element is accessed (a member function is called), that access-time member is updated. When the cache is full, the element with the smallest access time is erased.
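A minimal sketch of this timestamp approach, assuming a fixed-capacity array of slots and a monotonic counter standing in for "time"; the class and member names here are made up for illustration. Lookup and eviction are both O(N), since eviction scans for the smallest access time.

```cpp
#include <array>
#include <cstddef>
#include <cstdint>

template <typename Key, typename Value, std::size_t N>
class TimestampLruCache {
public:
    // Returns a pointer to the cached value, or nullptr on a miss.
    Value* get(const Key& key) {
        for (auto& slot : slots_) {
            if (slot.used && slot.key == key) {
                slot.last_access = ++clock_;   // refresh access time on every hit
                return &slot.value;
            }
        }
        return nullptr;
    }

    void put(const Key& key, Value value) {
        if (Value* existing = get(key)) {      // overwrite and refresh an existing entry
            *existing = std::move(value);
            return;
        }
        // Pick a victim: prefer an unused slot, otherwise the oldest one.
        Slot* victim = &slots_[0];
        for (auto& slot : slots_) {
            if (!slot.used) { victim = &slot; break; }
            if (slot.last_access < victim->last_access) victim = &slot;
        }
        victim->used = true;
        victim->key = key;
        victim->value = std::move(value);
        victim->last_access = ++clock_;
    }

private:
    struct Slot {
        bool used = false;
        std::uint64_t last_access = 0;
        Key key{};
        Value value{};
    };
    std::array<Slot, N> slots_{};
    std::uint64_t clock_ = 0;   // monotonic counter used as a logical "time"
};
```

Because everything lives in one `std::array`, the cache stays in a single contiguous block; the price is the linear scan on every operation.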

Another option is to keep the cache elements in an intrusive list. When an element is accessed and is not at the front of the list, it is moved to the front. When the cache is full, the element at the back of the list is erased. This is more work on each access, but the eviction victim is quicker to find.

The basic idea is to avoid typical node-based maps and lists for such a task, because they fragment your memory. Both approaches above keep the cache in one contiguous block.
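A minimal sketch of the intrusive-list approach under the same assumptions: every slot lives in one fixed array, and the recency order is a doubly linked list threaded through the slots by index (head = most recently used, tail = least recently used). The names are again hypothetical. Lookup is still a linear scan, but the eviction victim is the tail, found in O(1).

```cpp
#include <array>
#include <cstddef>

template <typename Key, typename Value, std::size_t N>
class IntrusiveLruCache {
    static constexpr std::size_t npos = N;   // sentinel meaning "no slot"

    struct Slot {
        Key key{};
        Value value{};
        std::size_t prev = npos;
        std::size_t next = npos;
    };

public:
    // Returns a pointer to the cached value (moving it to the front), or nullptr.
    Value* get(const Key& key) {
        for (std::size_t i = 0; i < size_; ++i) {
            if (slots_[i].key == key) {
                if (i != head_) { unlink(i); push_front(i); }  // move to front on access
                return &slots_[i].value;
            }
        }
        return nullptr;
    }

    void put(const Key& key, Value value) {
        if (Value* existing = get(key)) {   // refresh an existing entry
            *existing = std::move(value);
            return;
        }
        std::size_t i;
        if (size_ < N) {
            i = size_++;                    // take a fresh slot while there is room
        } else {
            i = tail_;                      // otherwise recycle the least-recently-used slot
            unlink(i);
        }
        slots_[i].key = key;
        slots_[i].value = std::move(value);
        push_front(i);
    }

private:
    void unlink(std::size_t i) {
        std::size_t p = slots_[i].prev, n = slots_[i].next;
        if (p != npos) slots_[p].next = n; else head_ = n;
        if (n != npos) slots_[n].prev = p; else tail_ = p;
        slots_[i].prev = slots_[i].next = npos;
    }

    void push_front(std::size_t i) {
        slots_[i].prev = npos;
        slots_[i].next = head_;
        if (head_ != npos) slots_[head_].prev = i;
        head_ = i;
        if (tail_ == npos) tail_ = i;
    }

    std::array<Slot, N> slots_{};
    std::size_t head_ = npos;
    std::size_t tail_ = npos;
    std::size_t size_ = 0;
};
```

If O(1) lookup were required, one could add a hash map from key to slot index on top of this, at the cost of reintroducing the kind of node-based container the answer is trying to avoid.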
