LRU Cache

Updated on 06 June, 2025

Problem Statement

The task is to design a Least Recently Used (LRU) cache that holds a limited number of items, evicting entries based on how recently they were used. The LRUCache class should be able to:

  • Initialize the cache with a given positive capacity, which limits the number of entries it can hold.
  • Retrieve a value (get operation) which returns the value for a given key if it exists within the cache; if it is not present, it should return -1.
  • Add or update a value (put operation) for a specific key. If adding this item would exceed the cache's capacity, the least recently used item is discarded to make space for the new item.

To ensure efficiency, both get and put operations must run with an average time complexity of O(1).

Examples

Example 1

Input:

["LRUCache", "put", "put", "get", "put", "get", "put", "get", "get", "get"]
[[2], [1, 1], [2, 2], [1], [3, 3], [2], [4, 4], [1], [3], [4]]

Output:

[null, null, null, 1, null, -1, null, -1, 3, 4]

Explanation:

LRUCache lRUCache = new LRUCache(2);
lRUCache.put(1, 1); // cache is {1=1}
lRUCache.put(2, 2); // cache is {1=1, 2=2}
lRUCache.get(1);    // returns 1
lRUCache.put(3, 3); // evicts key 2, cache is {1=1, 3=3}
lRUCache.get(2);    // returns -1 (not found)
lRUCache.put(4, 4); // evicts key 1, cache is {4=4, 3=3}
lRUCache.get(1);    // returns -1 (not found)
lRUCache.get(3);    // returns 3
lRUCache.get(4);    // returns 4

Constraints

  • 1 <= capacity <= 3000
  • 0 <= key <= 10^4
  • 0 <= value <= 10^5
  • At most 2 * 10^5 calls will be made to get and put.

Approach and Intuition

The implementation of the LRU cache will use:

  1. Hash Map: Maps keys to list nodes, giving constant-time lookup.
  2. Doubly Linked List: Maintains the usage order (most recently used at the tail, least recently used at the head; the solutions below each pick their own convention for which end holds the most recent entry).

Core Logic:

  • When get(key) is called:

    • If key exists, move the node to the end (most recently used) and return its value.
    • If not, return -1.
  • When put(key, value) is called:

    • If key exists, update value and move node to the end.

    • If not:

      • If capacity is full, evict the head (least recently used).
      • Insert the new key-value pair at the tail.

This approach keeps each operation at O(1) average time: the hash map handles lookups, and the linked list (or an ordered map playing the same role) handles recency updates.
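
For reference, here is a minimal Python sketch of the node-based design described above, using a plain dict plus an explicit doubly linked list with sentinel head and tail nodes. The Node class and the _remove/_append helpers are illustrative choices, not part of the solutions below, which lean on library containers instead.

python
class Node:
    """Doubly linked list node holding one cache entry."""
    def __init__(self, key=0, value=0):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None


class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.map = {}       # key -> Node, for O(1) lookup
        self.head = Node()  # sentinel on the least recently used side
        self.tail = Node()  # sentinel on the most recently used side
        self.head.next = self.tail
        self.tail.prev = self.head

    def _remove(self, node):
        # Unlink a node from the list in O(1).
        node.prev.next = node.next
        node.next.prev = node.prev

    def _append(self, node):
        # Insert a node just before the tail sentinel (most recently used spot).
        node.prev = self.tail.prev
        node.next = self.tail
        self.tail.prev.next = node
        self.tail.prev = node

    def get(self, key: int) -> int:
        if key not in self.map:
            return -1
        node = self.map[key]
        self._remove(node)
        self._append(node)  # refresh recency
        return node.value

    def put(self, key: int, value: int) -> None:
        if key in self.map:
            self._remove(self.map[key])
        node = Node(key, value)
        self.map[key] = node
        self._append(node)
        if len(self.map) > self.capacity:
            lru = self.head.next  # oldest entry sits right after the head sentinel
            self._remove(lru)
            del self.map[lru.key]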

Solutions

  • C++
  • Java
  • C
  • JavaScript
  • Python
cpp
#include <list>
#include <unordered_map>
using namespace std;

class LRUCache {
public:
    int max_capacity;
    unordered_map<int, list<pair<int, int>>::iterator> cache_map;
    list<pair<int, int>> cache_list;

    LRUCache(int capacity) { this->max_capacity = capacity; }

    int get(int key) {
        auto map_iter = cache_map.find(key);

        if (map_iter == cache_map.end()) {
            return -1;
        }

        int retrieved_value = map_iter->second->second;
        cache_list.erase(map_iter->second);
        cache_list.push_front({key, retrieved_value});

        cache_map.erase(map_iter);
        cache_map[key] = cache_list.begin();
        return retrieved_value;
    }

    void put(int key, int value) {
        auto map_iter = cache_map.find(key);

        if (map_iter != cache_map.end()) {
            cache_list.erase(map_iter->second);
            cache_map.erase(map_iter);
        }

        cache_list.push_front({key, value});
        cache_map[key] = cache_list.begin();

        if (cache_map.size() > max_capacity) {
            auto last = cache_list.rbegin();
            cache_map.erase(last->first);
            cache_list.pop_back();
        }
    }
};

/**
 * Your LRUCache object will be instantiated and called as such:
 * LRUCache* obj = new LRUCache(capacity);
 * int param_1 = obj->get(key);
 * obj->put(key,value);
 */

The provided C++ program implements a Least Recently Used (LRU) cache: once the cache reaches its capacity limit, the entry that has gone unused the longest is evicted to make room for new data.

Here’s a breakdown of the implementation:

  • LRUCache Class: This class manages the LRU cache with two main data structures:

    • An unordered_map called cache_map that maps each key to an iterator into cache_list, where the actual key-value pairs are stored.
    • A list of pairs called cache_list which stores the key-value pairs. The most recently used items are kept at the front of the list.
  • Constructor (int capacity): Initializes the maximum capacity of the cache.

  • Function int get(int key):

    • Checks if the key exists in the cache. If not found, returns -1.
    • If found, it retrieves the value from the list, moves the key-value pair to the front of the list (denoting it as the most recently used), and returns the value.
  • Function void put(int key, int value):

    • If the key already exists in the cache, removes the old key-value pair.
    • Inserts the new key-value pair at the front of the list.
    • If the size of the cache exceeds the maximum capacity after the insertion, removes the least recently used item from the back of the list.

This LRU Cache implementation is optimized for both get and put operations, making use of C++'s unordered_map for fast lookups and list for maintaining order, which allows these operations to run in O(1) average time. This is ideal for scenarios that require fast access to a bounded set of frequently updated data, such as web server caches or applications where recent data is more significant.

java
import java.util.LinkedHashMap;
import java.util.Map;

class LRUCache {
    private int maxCapacity;
    private LinkedHashMap<Integer, Integer> cacheMap;

    public LRUCache(int capacity) {
        maxCapacity = capacity;
        // Access-order mode (third argument true) moves entries to the end on each access.
        cacheMap = new LinkedHashMap<Integer, Integer>(capacity, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(
                Map.Entry<Integer, Integer> eldest
            ) {
                return size() > maxCapacity;
            }
        };
    }

    public int get(int key) {
        return cacheMap.getOrDefault(key, -1);
    }

    public void put(int key, int value) {
        cacheMap.put(key, value);
    }
}

The provided Java code implements a Least Recently Used (LRU) cache using the LinkedHashMap class. This data structure is particularly convenient here because it can automatically remove the least recently accessed entry once the cache exceeds its capacity limit.

Here's how the implementation works:

  • Initialization: The LRUCache class is initialized by specifying the maximum capacity of the cache. It creates a LinkedHashMap in access-order mode (the constructor's third argument, true) with an overridden removeEldestEntry method, which removes the least recently accessed entry whenever the cache's size exceeds the specified maximum capacity.

  • Get Operation: The get(int key) method fetches the value associated with a specific key if it exists in the cache. If the key is not present, -1 is returned. Because the map is in access-order mode, this lookup moves the entry to the most recently accessed position.

  • Put Operation: The put(int key, int value) method inserts or updates the value associated with the specified key in the cache. If the cache exceeds its capacity limit during this operation, the least recently used item is automatically removed.

Overall, this LRU Cache implementation efficiently manages data retrieval and storage, ensuring that the most recently used items are readily available, while the least used items are discarded when the cache space is needed. This makes it particularly useful for applications that require quick access to frequently accessed data while maintaining a fixed memory footprint.

c
#include <stdlib.h>
#include "uthash.h" /* hash table macros: HASH_FIND_INT, HASH_ADD_INT, HASH_DEL, HASH_ITER */

typedef struct Element {
    int key;
    int value;
    struct Element* next;
    struct Element* prev;
} Element;

typedef struct {
    int max_size;
    int current_size;
    Element* first;
    Element* last;
    struct HashItem* entries;
} Cache;

typedef struct HashItem {
    int key_id;
    Element* link;
    UT_hash_handle hh;
} HashItem;

Cache* createCache(int capacity) {
    Cache* cache = (Cache*)malloc(sizeof(Cache));
    cache->max_size = capacity;
    cache->current_size = 0;
    cache->first = NULL;
    cache->last = NULL;
    cache->entries = NULL;
    return cache;
}

int getFromCache(Cache* cache, int key) {
    HashItem* temp;
    HASH_FIND_INT(cache->entries, &key, temp);

    if (temp == NULL) {
        return -1;
    }

    Element* element = temp->link;

    if (element != cache->first) {
        if (element == cache->last) {
            cache->last = element->prev;
        }

        element->prev->next = element->next;
        if (element->next) {
            element->next->prev = element->prev;
        }

        element->next = cache->first;
        element->prev = NULL;
        cache->first->prev = element;
        cache->first = element;
    }

    return element->value;
}

void putInCache(Cache* cache, int key, int value) {
    HashItem* found;
    HASH_FIND_INT(cache->entries, &key, found);

    if (found != NULL) {
        Element* element = found->link;
        element->value = value;

        if (element != cache->first) {
            if (element == cache->last) {
                cache->last = element->prev;
            }

            element->prev->next = element->next;
            if (element->next) {
                element->next->prev = element->prev;
            }

            element->next = cache->first;
            element->prev = NULL;
            cache->first->prev = element;
            cache->first = element;
        }
    } else {
        Element* new_element = (Element*)malloc(sizeof(Element));
        new_element->key = key;
        new_element->value = value;
        new_element->prev = NULL;
        new_element->next = cache->first;
        if (cache->first) {
            cache->first->prev = new_element;
        }
        cache->first = new_element;

        HashItem* entry = (HashItem*)malloc(sizeof(HashItem));
        entry->key_id = key;
        entry->link = new_element;
        HASH_ADD_INT(cache->entries, key_id, entry);

        if (cache->current_size == cache->max_size) {
            if (cache->last) {
                HashItem* temp;
                HASH_FIND_INT(cache->entries, &(cache->last->key), temp);
                HASH_DEL(cache->entries, temp);
                if (temp) free(temp);

                Element* last_element = cache->last;
                cache->last = last_element->prev;
                if (cache->last) {
                    cache->last->next = NULL;
                }

                if (last_element) free(last_element);
            }
        } else if (cache->current_size < cache->max_size) {
            if (cache->current_size == 0) {
                cache->last = new_element;
            }
            cache->current_size++;
        }
    }
}

void freeCache(Cache* cache) {
    Element* ele = cache->first;
    while (ele) {
        Element* next = ele->next;
        free(ele);
        ele = next;
    }

    HashItem *entry, *tmp_entry;
    HASH_ITER(hh, cache->entries, entry, tmp_entry) {
        HASH_DEL(cache->entries, entry);
        free(entry);
    }

    free(cache);
}

The provided C code implements an LRU (Least Recently Used) Cache using a combination of a doubly linked list and a hash table to ensure fast access and update times. Here’s a breakdown of how the LRU Cache operates and how it's managed in this implementation:

  • Data Structures Used:

    • Element: Represents an item in the doubly linked list, holding the key, value, and pointers to the next and previous elements.
    • Cache: Manages the overall structure, tracking the maximum size of the cache, current number of elements, pointers to the first and last elements of the doubly linked list, and an entries hash table for quick lookup.
    • HashItem: Facilitates hash table entries, linking keys to their respective elements in the doubly linked list.
  • Key Functions:

    1. createCache(int capacity): Initializes the Cache with a specified capacity. It sets up the empty linked list and hash table.
    2. getFromCache(Cache* cache, int key): Retrieves the value of the element identified by the key from the cache. If the element exists and is not at the front, it’s moved to the front to signify recent use. If the element is not found, returns -1.
    3. putInCache(Cache* cache, int key, int value): Inserts a new key-value pair into the cache or updates the existing key with a new value. If the cache is full, the least recently used element (at the end of the list) is removed. New or accessed elements are moved to the front to reflect recent use.
    4. freeCache(Cache* cache): Releases all allocated memory, cleaning up both the linked list and hash table elements.
  • Memory Management:

    • The code uses dynamic memory allocation (malloc) for creating new elements and hash items, and ensures proper deallocation (free) when elements are removed or the entire cache is destroyed to prevent memory leaks.
  • Efficiency Considerations:

    • Access (getFromCache) and update (putInCache) operations run in O(1) average time: the hash table gives immediate access to any element, and the doubly linked list allows quick rearrangement of elements.

This LRU Cache implementation is effective for scenarios requiring fast access to a set of frequently used data items where the least used items are discarded first when the maximum capacity is reached. The blend of a hash table and a doubly linked list ensures both quick access and efficient management of the items based on their usage.

js
class CacheLRU {
    constructor(limit) {
        this.limit = limit;
        this.cacheMap = new Map();
    }
    fetch(key) {
        if (!this.cacheMap.has(key)) return -1;
        let val = this.cacheMap.get(key);
        this.cacheMap.delete(key);
        this.cacheMap.set(key, val);
        return val;
    }
    store(key, value) {
        if (this.cacheMap.has(key)) {
            this.cacheMap.delete(key);
        }
        this.cacheMap.set(key, value);
        if (this.cacheMap.size > this.limit) {
            this.cacheMap.delete(this.cacheMap.keys().next().value);
        }
    }
}
/**
 * Your CacheLRU object will be instantiated and called as such:
 * var obj = new CacheLRU(limit);
 * var param_1 = obj.fetch(key);
 * obj.store(key, value);
 */

Implementing an LRU Cache involves managing a fixed number of elements that are evicted according to a least recently used strategy. The solution provided demonstrates how to create and operate an LRU Cache in JavaScript.

  • Start by defining a CacheLRU class.
  • Initialize the LRU Cache with a specified limit, which dictates the maximum number of entries the cache can hold.
  • Utilize a JavaScript Map to easily manage entries. Maps maintain the order of inserted keys, which is leveraged to track usage.
  • Implement the fetch method. If the cache contains the key, the method moves the key-value pair to the end of the Map (marking it most recently used) and returns its value; otherwise, it returns -1 to indicate the key is not present.
  • Implement the store method. Store the key-value pair into the map:
    • If the key is already present, first delete the existing entry (to update its recent usage by reinserting it later).
    • Insert the new key-value pair.
    • If the cache exceeds the defined limit after insertion, remove the oldest entry (the first one in the map's iterator, which corresponds to the least recently used item).

This approach keeps the cache within its size limit while giving quick access to the most recently used items, which makes it a good fit for projects that repeatedly access a bounded set of elements and can tolerate eviction of the least recently used ones.

python
import collections

class CacheLRU:
    def __init__(self, limit: int):
        self.limit = limit
        self.cache = collections.OrderedDict()

    def fetch(self, key: int) -> int:
        if key not in self.cache:
            return -1

        self.cache.move_to_end(key)
        return self.cache[key]

    def store(self, key: int, value: int) -> None:
        if key in self.cache:
            self.cache.move_to_end(key)
        
        self.cache[key] = value
        if len(self.cache) > self.limit:
            self.cache.popitem(last=False)  # evict the oldest (least recently used) item

This Python program defines a class CacheLRU to implement an LRU (Least Recently Used) cache mechanism using collections.OrderedDict. Follow these steps to understand the operations of the cache:

  1. Initialize the cache with a specific size limit. The __init__ method accepts limit as an argument and sets it to self.limit. It also creates an OrderedDict to store cache items.

  2. Add the fetch method to retrieve the value of a key. If the key is not found in the cache, it returns -1. If the key exists, it moves the key to the end of the cache (making it the most recently used) and then returns its value.

  3. Implement the store method to insert or update a key-value pair in the cache:

    • If the key is already in the cache, it moves the key to the end to mark it as most recently used.
    • It then adds or updates the key-value pair in the cache.
    • If the cache exceeds the set limit, it removes the least recently used item, which is the first item in the OrderedDict.

This implementation ensures efficient access and updates within the cache, keeping it within the defined limit by automatically removing the least recently used items.
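
As a quick sanity check, the sequence from Example 1 can be replayed against this class. This is a hypothetical driver written for illustration, not part of the original solution:

python
cache = CacheLRU(2)
cache.store(1, 1)        # cache is {1=1}
cache.store(2, 2)        # cache is {1=1, 2=2}
print(cache.fetch(1))    # 1
cache.store(3, 3)        # evicts key 2
print(cache.fetch(2))    # -1
cache.store(4, 4)        # evicts key 1
print(cache.fetch(1))    # -1
print(cache.fetch(3))    # 3
print(cache.fetch(4))    # 4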
