LRU Cache Implementation in Swift



LRU (Least Recently Used) is a caching strategy for efficiently managing a fixed-size cache. It ensures that the least recently used item is evicted when the cache exceeds its capacity, and it is straightforward to implement in Swift.

An LRU Cache in Swift typically uses:

  1. Dictionary: For O(1) access to cache items by their keys.
  2. Doubly Linked List: To maintain the order of usage, allowing O(1) insertion, deletion, and updating of the most and least recently used items.

How It Works:

  1. When a key is accessed or added:
    • If the key exists, move it to the head of the list (mark as most recently used).
    • If the key doesn’t exist, add it to the head of the list.
  2. If the cache exceeds its capacity:
    • Remove the node at the tail (least recently used item).
  3. The dictionary maps keys to linked list nodes for fast lookups.

Example Usage:

let cache = LRUCache<String, Int>(2)
cache.put("a", 1)                    // Cache: ["a": 1]
cache.put("b", 2)                    // Cache: ["b": 2, "a": 1]
print(cache.get("a") ?? "Not found") // Access "a". Output: 1. Cache: ["a": 1, "b": 2]
cache.put("c", 3)                    // Evicts "b". Cache: ["c": 3, "a": 1]
print(cache.get("b") ?? "Not found") // Output: Not found

Key Features:

  • Efficient Operations: Both get and put are O(1).
  • Fixed Capacity: Automatically evicts the least recently used item when full.

This approach is ideal for scenarios like caching API responses or managing memory in apps.


Implementing

Implementing an LRU (Least Recently Used) Cache in Swift involves creating a data structure that allows for quick insertion, retrieval, and eviction of the least recently used items when the cache reaches its maximum capacity. The typical approach combines a doubly linked list and a dictionary.

Here's a detailed explanation along with a complete implementation.


Concept of LRU Cache

  • Key Idea: The least recently used item is evicted first when the cache exceeds its capacity.
  • Data Structures Used:
    1. Dictionary: Maps keys to nodes for O(1) access.
    2. Doubly Linked List: Maintains the order of use, where the most recently used item is at the head, and the least recently used item is at the tail.

Steps to Implement LRU Cache

  1. Node Class: Define a node class for the doubly linked list.
  2. Doubly Linked List:
    • Add nodes to the head for recently used items.
    • Remove nodes from the tail when the cache exceeds capacity.
    • Move a node to the head on access.
  3. Cache Class:
    • Use a dictionary to store references to the linked list nodes for O(1) access and updates.
    • Implement methods for get, put, and evicting the least recently used item.

Code Implementation in Swift

import Foundation

// Doubly linked list node
class DoublyLinkedListNode<Key: Hashable, Value> {
    var key: Key
    var value: Value
    var prev: DoublyLinkedListNode?
    var next: DoublyLinkedListNode?

    init(key: Key, value: Value) {
        self.key = key
        self.value = value
    }
}

// LRU Cache implementation
class LRUCache<Key: Hashable, Value> {
    private let capacity: Int
    private var dict: [Key: DoublyLinkedListNode<Key, Value>] = [:]
    private var head: DoublyLinkedListNode<Key, Value>?
    private var tail: DoublyLinkedListNode<Key, Value>?

    init(_ capacity: Int) {
        self.capacity = max(1, capacity) // Ensure capacity is at least 1
    }

    // Get a value from the cache
    func get(_ key: Key) -> Value? {
        guard let node = dict[key] else {
            return nil // Key not found
        }
        // Move the accessed node to the head (most recently used)
        moveToHead(node)
        return node.value
    }

    // Put a value into the cache
    func put(_ key: Key, _ value: Value) {
        if let node = dict[key] {
            // Key already exists, update value and move to head
            node.value = value
            moveToHead(node)
        } else {
            // Key does not exist, create a new node
            let newNode = DoublyLinkedListNode(key: key, value: value)
            dict[key] = newNode
            addToHead(newNode)
            // If the cache exceeds capacity, remove the least recently used item
            if dict.count > capacity {
                removeLeastRecentlyUsed()
            }
        }
    }

    // Remove the least recently used item
    private func removeLeastRecentlyUsed() {
        guard let lruNode = tail else { return }
        dict[lruNode.key] = nil
        removeNode(lruNode)
    }

    // Move a node to the head (most recently used position)
    private func moveToHead(_ node: DoublyLinkedListNode<Key, Value>) {
        removeNode(node)
        addToHead(node)
    }

    // Add a node to the head
    private func addToHead(_ node: DoublyLinkedListNode<Key, Value>) {
        node.next = head
        node.prev = nil
        head?.prev = node
        head = node
        if tail == nil {
            tail = head
        }
    }

    // Remove a node from the linked list
    private func removeNode(_ node: DoublyLinkedListNode<Key, Value>) {
        let prevNode = node.prev
        let nextNode = node.next
        if let prev = prevNode {
            prev.next = nextNode
        } else {
            head = nextNode
        }
        if let next = nextNode {
            next.prev = prevNode
        } else {
            tail = prevNode
        }
        node.prev = nil
        node.next = nil
    }
}

// Example usage
let cache = LRUCache<String, Int>(3)
cache.put("a", 1) // Cache: ["a": 1]
cache.put("b", 2) // Cache: ["b": 2, "a": 1]
cache.put("c", 3) // Cache: ["c": 3, "b": 2, "a": 1]
print(cache.get("a") ?? "Not found") // Access "a" (most recently used now). Output: 1
cache.put("d", 4) // "b" is evicted. Cache: ["d": 4, "a": 1, "c": 3]
print(cache.get("b") ?? "Not found") // "b" is not in the cache. Output: Not found
print(cache.get("c") ?? "Not found") // Access "c". Output: 3

Explanation of Key Parts

  1. Node Structure:

    • Each node stores a key and value.
    • prev and next pointers enable efficient removal and reordering.
  2. Dictionary:

    • Maps Key to DoublyLinkedListNode.
    • Provides O(1) access to nodes.
  3. Doubly Linked List:

    • addToHead: Adds a node at the head for most recently used items.
    • removeNode: Removes a specific node efficiently.
    • moveToHead: Repositions a node at the head.
  4. Cache Operations:

    • put:
      • Updates or inserts the key-value pair.
      • Maintains capacity by evicting the least recently used item if necessary.
    • get:
      • Retrieves the value for a key and updates its position to the head.

Performance Analysis

  1. Time Complexity:
    • get: O(1) (Dictionary lookup + node repositioning).
    • put: O(1) (Dictionary update + node addition/removal).
  2. Space Complexity:
    • O(n), where n is the cache capacity.

Best Practices

  1. Testing:
    • Test with edge cases such as:
      • Accessing keys not in the cache.
      • Inserting into a cache of size 1.
      • Frequent evictions.
  2. Thread Safety:
    • If using in a multithreaded environment, ensure thread safety with synchronization mechanisms like locks.
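The edge cases listed above can be exercised with a few assertions. To keep this snippet self-contained, it includes a condensed copy of the article's LRUCache (same logic, compacted):

```swift
// Condensed copy of the article's LRU cache, for self-contained testing.
final class Node<K: Hashable, V> {
    let key: K; var value: V
    var prev: Node?; var next: Node?
    init(_ k: K, _ v: V) { key = k; value = v }
}

final class LRUCache<K: Hashable, V> {
    private let capacity: Int
    private var dict: [K: Node<K, V>] = [:]
    private var head: Node<K, V>?, tail: Node<K, V>?
    init(_ capacity: Int) { self.capacity = max(1, capacity) }

    func get(_ key: K) -> V? {
        guard let node = dict[key] else { return nil }
        moveToHead(node); return node.value
    }
    func put(_ key: K, _ value: V) {
        if let node = dict[key] { node.value = value; moveToHead(node); return }
        let node = Node(key, value)
        dict[key] = node; addToHead(node)
        if dict.count > capacity, let lru = tail { dict[lru.key] = nil; remove(lru) }
    }
    private func moveToHead(_ n: Node<K, V>) { remove(n); addToHead(n) }
    private func addToHead(_ n: Node<K, V>) {
        n.next = head; n.prev = nil; head?.prev = n; head = n
        if tail == nil { tail = head }
    }
    private func remove(_ n: Node<K, V>) {
        if let p = n.prev { p.next = n.next } else { head = n.next }
        if let nx = n.next { nx.prev = n.prev } else { tail = n.prev }
        n.prev = nil; n.next = nil
    }
}

// 1. Accessing a key that was never inserted returns nil.
let empty = LRUCache<String, Int>(2)
assert(empty.get("missing") == nil)

// 2. A cache of size 1 evicts on every new key.
let tiny = LRUCache<String, Int>(1)
tiny.put("a", 1)
tiny.put("b", 2)            // evicts "a"
assert(tiny.get("a") == nil)
assert(tiny.get("b") == 2)

// 3. Frequent evictions keep only the most recent entries.
let small = LRUCache<Int, Int>(2)
for i in 0..<10 { small.put(i, i) }
assert(small.get(8) == 8 && small.get(9) == 9)
assert(small.get(0) == nil)
print("All edge-case checks passed")
```

Running these checks whenever the eviction logic changes catches the most common off-by-one mistakes in capacity handling.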

This implementation ensures efficient management of recently used items while adhering to the LRU cache policy.
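One common way to act on the thread-safety note above is to route every cache call through a serial DispatchQueue. The sketch below uses a minimal dictionary-backed `InnerCache` as a stand-in so it compiles on its own; in practice you would wrap this article's LRUCache in exactly the same way (an NSLock would also work):

```swift
import Foundation

// Stand-in for the article's LRUCache, so this snippet is self-contained.
final class InnerCache<Key: Hashable, Value> {
    private var storage: [Key: Value] = [:]
    func get(_ key: Key) -> Value? { storage[key] }
    func put(_ key: Key, _ value: Value) { storage[key] = value }
}

// Thread-safe wrapper: all reads and writes run one at a time on a serial queue.
final class ThreadSafeCache<Key: Hashable, Value> {
    private let cache = InnerCache<Key, Value>()
    private let queue = DispatchQueue(label: "lru.cache.serial") // serial by default

    func get(_ key: Key) -> Value? {
        queue.sync { cache.get(key) }          // reads wait their turn
    }

    func put(_ key: Key, _ value: Value) {
        queue.sync { cache.put(key, value) }   // writes are serialized too
    }
}

// Usage: safe to call from multiple threads concurrently.
let shared = ThreadSafeCache<String, Int>()
DispatchQueue.concurrentPerform(iterations: 100) { i in
    shared.put("key\(i)", i)
}
assert(shared.get("key42") == 42)
```

The serial queue approach keeps the cache class itself unchanged; if profiling showed contention on reads, a concurrent queue with barrier writes would be the usual next step.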
