I have been studying a couple of LRU/LFU cache implementations. Several of them combine a deque, a doubly linked list, and a hashmap. The idea is roughly:
in LRU cache:
use a hashmap for O(1) lookup.
use a deque to maintain access order.
in LFU cache:
use a hashmap for O(1) lookup.
use a doubly linked list to maintain frequency order.
See: http://javarticles.com/2012/06/lfu-cache.html
http://www.lintcode.com/en/problem/lru-cache/#
What fascinates me is why these particular data structures were chosen for this problem.
Here is a thought: for an LRU cache, the current and next “order” are easy to maintain with a deque. In the LRU case, you essentially need to:
1) remove head (when cache expires)
2) delete an element and append it to the end (when an item is accessed).
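The two LRU operations above can be sketched in a few lines of Python. This is a minimal illustration, not one of the linked implementations: `collections.OrderedDict` plays the role of the hashmap plus the access-ordered list in one structure, and the class name is my own.

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU sketch: an OrderedDict acts as hashmap + access-ordered list."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return -1
        # Step 2: delete the element and append it to the end (most recent).
        self.data.move_to_end(key)
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            # Step 1: remove the head (least recently used entry).
            self.data.popitem(last=False)
```

With a real deque, deleting an element from the middle is O(n), which is why implementations usually pair the hashmap with a linked structure that supports O(1) removal by node reference.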
For LFU cache, you need:
1) remove head (when cache expires)
2) move the item to the next frequency slot (when an item is accessed).
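The "move to the next frequency slot" step can be sketched with frequency buckets. This is an assumption-laden sketch of my own (class and method names are made up, not from the linked article): each bucket holds the keys at one access count, an `OrderedDict` per bucket keeps insertion order so ties evict the oldest, and `min_freq` tracks the head bucket to evict from.

```python
from collections import defaultdict, OrderedDict

class LFUCache:
    """Sketch of an LFU cache: hashmap for values + frequency buckets."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.vals = {}                            # key -> value
        self.freq = {}                            # key -> access count
        self.buckets = defaultdict(OrderedDict)   # count -> keys, oldest first
        self.min_freq = 0

    def _touch(self, key):
        # Step 2: move the item to the next frequency slot.
        f = self.freq[key]
        del self.buckets[f][key]
        if not self.buckets[f] and self.min_freq == f:
            self.min_freq = f + 1
        self.freq[key] = f + 1
        self.buckets[f + 1][key] = None

    def get(self, key):
        if key not in self.vals:
            return -1
        self._touch(key)
        return self.vals[key]

    def put(self, key, value):
        if self.capacity <= 0:
            return
        if key in self.vals:
            self.vals[key] = value
            self._touch(key)
            return
        if len(self.vals) >= self.capacity:
            # Step 1: remove the head of the lowest-frequency bucket.
            evict, _ = self.buckets[self.min_freq].popitem(last=False)
            del self.vals[evict]
            del self.freq[evict]
        self.vals[key] = value
        self.freq[key] = 1
        self.buckets[1][key] = None
        self.min_freq = 1
```

The doubly linked list of frequency nodes in the article serves the same purpose as `buckets` + `min_freq` here: finding the adjacent frequency slot in O(1).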
The data structures are chosen because they fit a specific problem.
Now, I think there is a use case that combines LFU and LRU. The question is:
How do you get the top 10 most traded stocks from a continuous feed of stock trades over the last hour?
So we can:
Use a deque to maintain the access order
Use a doubly linked list to maintain the frequency order
Use a hashmap for fast lookup.
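Here is one hedged sketch of the stock question, under my own assumptions (class name, timestamps as seconds, and a one-hour window are all made up for illustration): a deque expires trades that fall out of the window, a `Counter` is the hashmap of per-symbol counts, and a heap extracts the top k at query time instead of the frequency-ordered linked list above.

```python
import heapq
from collections import Counter, deque

class TopTraded:
    """Sliding-window trade counter: deque for expiry order, hashmap for counts."""

    def __init__(self, window_seconds=3600):
        self.window = window_seconds
        self.events = deque()       # (timestamp, symbol) in arrival order
        self.counts = Counter()     # symbol -> trades inside the window

    def record(self, symbol, ts):
        self.events.append((ts, symbol))
        self.counts[symbol] += 1
        self._expire(ts)

    def _expire(self, now):
        # Remove the head while it has aged out of the window.
        while self.events and self.events[0][0] <= now - self.window:
            _, old = self.events.popleft()
            self.counts[old] -= 1
            if self.counts[old] == 0:
                del self.counts[old]

    def top(self, k=10):
        # O(n log k) extraction; a frequency-ordered list would keep this O(k).
        return heapq.nlargest(k, self.counts.items(), key=lambda kv: kv[1])
```

Keeping the frequency-ordered doubly linked list instead of a heap trades extra bookkeeping on every trade for cheaper top-k queries, which matters if the top-10 view is read far more often than trades arrive.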