LRU Cache

What Is an LRU Cache?

A cache is a hardware or software component that stores data so that future requests for that data can be served quickly. Each cache operates according to a specific strategy that determines which data it stores. One of these strategies is the Least Recently Used cache, or LRU cache. An LRU cache retains the most recently accessed items and evicts the least recently used ones.

Implementations

Implement an LRU cache that satisfies the following criteria:

  • Retains a specified number of key-value pairs.
  • Given a key, returns the corresponding value in \(O(1)\) time.
  • Given a key-value pair, update the value if the key already exists; otherwise, add the pair to the cache. These operations should also be executed in \(O(1)\) time. Note that if the number of keys exceeds the capacity, the least recently used key must be evicted.

1. Doubly Linked List

Retrieval and update operations can be executed in \(O(1)\) time using a hash table. Additionally, to keep track of the order in which keys have been used, we can use a queue: its head represents the least recently used item, and its tail represents the most recently used one.

So, whenever an item is accessed, we need to remove it from its current position in the queue and re-append it at the tail. To execute both steps in \(O(1)\) time, each item must hold a pointer to the previous item as well as a pointer to the next one. Consequently, we can use a doubly linked list.

Java
import java.util.HashMap;
import java.util.Map;

class LRUCache {

    // Doubly linked list node; it stores the key as well, so the map entry
    // can be removed when the node is evicted.
    static class ListNode {
        int key;
        int val;
        ListNode prev;
        ListNode next;

        public ListNode(int key, int val) {
            this.key = key;
            this.val = val;
        }
    }

    int capacity;
    Map<Integer, ListNode> map;
    ListNode head; // sentinel: head.next is the least recently used node
    ListNode tail; // sentinel: tail.prev is the most recently used node

    public LRUCache(int capacity) {
        this.capacity = capacity;
        map = new HashMap<>();
        head = new ListNode(-1, -1);
        tail = new ListNode(-1, -1);
        head.next = tail;
        tail.prev = head;
    }

    public int get(int key) {
        if (map.containsKey(key)) {
            ListNode node = map.get(key);

            // Unlink the node from its current position.
            node.prev.next = node.next;
            node.next.prev = node.prev;

            // Re-append it just before the tail sentinel (most recently used).
            ListNode prevTail = tail.prev;
            prevTail.next = node;
            node.prev = prevTail;
            node.next = tail;
            tail.prev = node;

            return node.val;
        }
        return -1;
    }

    public void put(int key, int value) {
        if (map.containsKey(key)) {
            // The key is being updated: unlink its old node first.
            ListNode existing = map.get(key);
            existing.prev.next = existing.next;
            existing.next.prev = existing.prev;
        }

        // Append a new node just before the tail sentinel.
        ListNode node = new ListNode(key, value);
        ListNode prevTail = tail.prev;
        prevTail.next = node;
        node.prev = prevTail;
        node.next = tail;
        tail.prev = node;

        map.put(key, node);

        // Evict the least recently used node (head.next) if over capacity.
        if (map.size() > capacity) {
            ListNode nextHead = head.next;
            head.next = nextHead.next;
            nextHead.next.prev = head;
            map.remove(nextHead.key);
        }
    }
}
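
As a quick sanity check, here is a short usage sketch of the class above (the capacity and values are arbitrary):

Java
LRUCache cache = new LRUCache(2);
cache.put(1, 10);   // cache holds {1=10}
cache.put(2, 20);   // cache holds {1=10, 2=20}
cache.get(1);       // returns 10; key 1 becomes the most recently used
cache.put(3, 30);   // capacity exceeded, so key 2 (least recently used) is evicted
cache.get(2);       // returns -1 because key 2 was evicted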

2. Built-in LinkedHashMap

LinkedHashMap, a built-in Java class, can maintain the order in which keys have been accessed when it is constructed with its accessOrder flag set to true. We can use it as an LRU cache by overriding its protected removeEldestEntry method, which lets us control when the eldest entry should be evicted:

Java
@Override
protected boolean removeEldestEntry(Map.Entry<K, V> eldest){
    return size() > capacity;
}
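
Putting it together, a minimal self-contained sketch might look like the following (the class name LinkedHashMapLRUCache is just an illustrative choice). The accessOrder argument of the three-argument LinkedHashMap constructor must be true so that iteration order follows access order rather than insertion order, making the eldest entry the least recently used one:

Java
import java.util.LinkedHashMap;
import java.util.Map;

class LinkedHashMapLRUCache<K, V> extends LinkedHashMap<K, V> {

    private final int capacity;

    public LinkedHashMapLRUCache(int capacity) {
        // accessOrder = true: iteration order follows access order,
        // so the eldest entry is the least recently used one.
        super(capacity, 0.75f, true);
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Evict the eldest (least recently used) entry once we exceed capacity.
        return size() > capacity;
    }
}

In access-order mode, both get and put count as accesses, so this behaves essentially like the hand-rolled cache above.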
