Thursday, February 23, 2012

MRU cache.


Imagine you have data being pulled very frequently from a large database. How would you design a MRU (most recently used) cache?
Implement getValue(int id) for the MRU cache.


Approach:
Most Recently Used (MRU): in contrast to LRU, this policy discards the most recently used items first.
For random access patterns and repeated scans over large datasets (sometimes known as cyclic access patterns), MRU cache algorithms get more hits than LRU because of their tendency to retain older data. MRU algorithms are most useful in situations where the older an item is, the more likely it is to be accessed.


MRU Cache:
1. Every data value carries a recency marker: conceptually a last-used timestamp, though with the linked list below the position in the list serves the same purpose.
2. The cache itself is a hashtable keyed by id, and the values are additionally threaded through a doubly linked list. On access of a key Ki, its value Vi is moved to the front of the list, i.e. it becomes the head (the most recently used node).
3. When the predecided capacity is reached and a new value must be cached, we evict the head node of the linked list, since that is the MRU node. A Java sketch follows below.
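Here is a minimal Java sketch of this design. It assumes integer ids, String values, and a hypothetical loadFromDatabase(id) call standing in for the expensive database read from the question; a real implementation would plug in the actual data source and would need synchronization for concurrent access.

import java.util.HashMap;
import java.util.Map;

// MRU cache sketch: a HashMap gives O(1) lookup, and a doubly linked
// list keeps the most recently used entry at the head so it can be
// evicted in O(1).
public class MruCache {

    private static class Node {
        final int id;
        String value;
        Node prev, next;
        Node(int id, String value) { this.id = id; this.value = value; }
    }

    private final int capacity;
    private final Map<Integer, Node> index = new HashMap<>();
    private Node head; // most recently used
    private Node tail; // least recently used

    public MruCache(int capacity) { this.capacity = capacity; }

    // getValue(int id): return the cached value (moving its node to the
    // head), or load it from the backing store on a miss.
    public String getValue(int id) {
        Node node = index.get(id);
        if (node != null) {
            moveToHead(node);                 // this entry is now the MRU item
            return node.value;
        }
        String value = loadFromDatabase(id);  // hypothetical backing-store call
        put(id, value);
        return value;
    }

    private void put(int id, String value) {
        if (index.size() >= capacity) {
            evictHead();                      // MRU policy: discard the most recently used entry
        }
        Node node = new Node(id, value);
        index.put(id, node);
        addToHead(node);
    }

    private void evictHead() {
        if (head == null) return;
        index.remove(head.id);
        removeNode(head);
    }

    private void moveToHead(Node node) {
        removeNode(node);
        addToHead(node);
    }

    private void addToHead(Node node) {
        node.prev = null;
        node.next = head;
        if (head != null) head.prev = node;
        head = node;
        if (tail == null) tail = node;
    }

    private void removeNode(Node node) {
        if (node.prev != null) node.prev.next = node.next; else head = node.next;
        if (node.next != null) node.next.prev = node.prev; else tail = node.prev;
        node.prev = node.next = null;
    }

    // Stand-in for the expensive database read mentioned in the question.
    private String loadFromDatabase(int id) {
        return "value-" + id;
    }
}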


Once the data structure is ready, getValue(int id) is essentially a map lookup that also updates the recency order. Java's LinkedHashMap with access ordering combines a hash map and a linked list in the same way, but its built-in removeEldestEntry hook evicts the least recently used entry, so MRU eviction still needs custom handling like the sketch above.
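As a usage sketch, continuing with the hypothetical MruCache class above:

MruCache cache = new MruCache(3);
cache.getValue(1);  // miss: 1 is loaded and becomes the MRU entry (head)
cache.getValue(2);  // miss: 2 becomes the head
cache.getValue(1);  // hit: 1 moves back to the head
cache.getValue(3);  // miss: the cache is now at capacity
cache.getValue(4);  // miss on a full cache: 3 (the current MRU node) is evicted, 4 is cached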
