An LRU Cache uses the Least Recently Used policy to maintain the order of its entries, and it is an important concept to learn for interview preparation. Here it is implemented in C++ using a class that holds the cache's data members and methods. First the instruction size is inputted, then the cache capacity, which determines the cache's size, followed by the query size.
Two queries are supported, namely [set key value] and [get key]. Check the screenshot below for a better understanding.
Whenever the get or the set method is called with a key, that key's entry moves to the front of the list, as per the LRU policy. If the list's size would exceed the cache's capacity, the entry at the back of the list is deleted.
list<pair<int, int>> l ---- stores the cache's key-value pairs; the front of the list is the most recently used entry.
unordered_map<int, list<pair<int, int>>::iterator> mp ---- maps each key to its node's iterator in the list, giving O(1) access.
int Capacity ---- stores the capacity of the cache.
void set(int key, int value) ---- inserts the key-value pair at the front of the list; if the key is already present in the list, it is moved to the front with the new value.
int get(int key) ---- returns the value for the given key if it is present in the list, and the visited key is moved to the front of the list.
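Putting the members above together, here is a minimal sketch of the cache in C++. The class name LRUCache, the constructor taking the capacity, and returning -1 from get when a key is missing are illustrative assumptions, not details given in the description above.

```cpp
#include <list>
#include <unordered_map>
#include <utility>
using namespace std;

class LRUCache {
    list<pair<int, int>> l;                                // front = most recently used
    unordered_map<int, list<pair<int, int>>::iterator> mp; // key -> node in l
    int Capacity;

public:
    // Assumption: capacity is passed to the constructor.
    LRUCache(int capacity) : Capacity(capacity) {}

    void set(int key, int value) {
        auto found = mp.find(key);
        if (found != mp.end()) {
            // Key already present: update its value and move it to the front.
            found->second->second = value;
            l.splice(l.begin(), l, found->second);
            return;
        }
        if ((int)l.size() == Capacity) {
            // Cache full: delete the least recently used entry at the back.
            mp.erase(l.back().first);
            l.pop_back();
        }
        l.emplace_front(key, value);
        mp[key] = l.begin();
    }

    int get(int key) {
        auto found = mp.find(key);
        if (found == mp.end()) return -1; // assumption: -1 signals "key not found"
        // Move the visited entry to the front of the list.
        l.splice(l.begin(), l, found->second);
        return l.front().second;
    }
};
```

Note that list::splice relinks an existing node in O(1) without copying it or invalidating its iterator, which is why the map's stored iterators stay valid as entries move to the front.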