Practice/xAI/Weighted LRU Cache
Coding · Must
Design and implement a Weighted LRU (Least Recently Used) Cache that extends the traditional LRU cache by assigning a size (or weight) to each item. Unlike a standard LRU cache where each item counts as 1 toward the capacity, in a weighted LRU cache, the capacity is calculated as the sum of all item sizes.
This variant is commonly used in systems where cached items have varying memory footprints, such as image caches, HTTP response caches, and in-memory object stores.
Implement a WeightedLRUCache class with get and put operations.
```python
cache = WeightedLRUCache(capacity=10)

cache.put("a", 1, 3)  # Cache: {"a": (1, size=3)} -> total size = 3
cache.put("b", 2, 4)  # Cache: {"a": (1, 3), "b": (2, 4)} -> total size = 7
cache.put("c", 3, 5)  # Exceeds capacity (7 + 5 = 12 > 10)
                      # Evict "a" (LRU, size=3) -> total size = 4
                      # Now add "c" -> total size = 9
                      # Cache: {"b": (2, 4), "c": (3, 5)}

cache.get("a")        # Returns -1 (evicted)
cache.get("b")        # Returns 2 (marks "b" as recently used)

cache.put("d", 4, 3)  # Would exceed (9 + 3 = 12 > 10)
                      # Evict "c" (LRU, size=5) -> total size = 4
                      # Add "d" -> total size = 7
                      # Cache: {"b": (2, 4), "d": (4, 3)}
```
- get(key): Retrieve the value and mark the item as recently used; return -1 if the key is not found.
- put(key, value, size): Insert or update an item, evicting LRU items as needed to stay within capacity.

Should get() update the item's access time (making it "most recently used")?

Extend your implementation to handle various edge cases that occur in production systems.
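As a starting point, the two operations can be sketched with a plain dict for storage plus a list that tracks recency (LRU first). This is deliberately simple and O(n) per operation; the eviction policy for oversized items (rejecting anything larger than the total capacity) is an assumption, since the problem leaves it open:

```python
class WeightedLRUCache:
    """Naive sketch: dict for storage, list for recency order (LRU first)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.total = 0            # running sum of item sizes
        self.items = {}           # key -> (value, size)
        self.order = []           # keys, least recently used first

    def get(self, key):
        if key not in self.items:
            return -1
        self.order.remove(key)    # O(n): move key to the MRU end
        self.order.append(key)
        return self.items[key][0]

    def put(self, key, value, size):
        if size > self.capacity:  # assumed policy: reject oversized items
            return
        if key in self.items:     # update: reclaim the old size first
            self.total -= self.items.pop(key)[1]
            self.order.remove(key)
        while self.total + size > self.capacity:
            lru = self.order.pop(0)                 # evict from the LRU end
            self.total -= self.items.pop(lru)[1]
        self.items[key] = (value, size)
        self.order.append(key)
        self.total += size
```

Running the worked example above against this sketch reproduces the stated evictions; the `list.remove`/`pop(0)` calls are what the O(1) optimization discussed later eliminates.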
```python
cache = WeightedLRUCache(10)
cache.put("huge", 1, 15)  # Size 15 > capacity 10
```
```python
cache = WeightedLRUCache(10)
cache.put("a", 1, 3)    # Total size = 3
cache.put("a", 100, 8)  # Same key, different size
                        # Total size changes from 3 to 8
cache.get("a")          # Returns 100
```
```python
cache = WeightedLRUCache(10)
cache.put("a", 1, 3)
cache.put("b", 2, 3)
cache.put("c", 3, 3)  # Total = 9
cache.put("d", 4, 8)  # Needs to evict multiple items
                      # Must evict "a", "b", and "c" to fit "d"
```
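The multiple-eviction case reduces to a loop that keeps removing the least-recently-used entry until the incoming item fits. A minimal sketch of just that loop (`evict_until_fits` is a hypothetical helper; `cache` is a plain dict kept in LRU-to-MRU order, mapping key -> (value, size)):

```python
def evict_until_fits(cache, total, capacity, incoming_size):
    # Evict from the LRU end until the new item fits (or the cache empties).
    while cache and total + incoming_size > capacity:
        lru_key = next(iter(cache))      # oldest entry = least recently used
        total -= cache.pop(lru_key)[1]   # reclaim its size
    return total
```

For the scenario above, fitting an item of size 8 into a cache holding 9 requires evicting all three size-3 entries, since evicting only one or two still leaves the total over capacity.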
For production use, optimize to achieve O(1) time complexity for both get and put operations.
Use OrderedDict from Python's collections module:
- move_to_end(key) moves an item to the end in O(1) time
- popitem(last=False) removes an item from the front (the LRU end) in O(1) time

```python
from collections import OrderedDict

cache = OrderedDict()  # key -> (value, size)
```
- get() should be O(1)
- put() should be O(1) amortized (the worst case involves evicting k items, i.e., O(k))
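Putting the OrderedDict hints together, one possible O(1) implementation looks like this. It is a sketch under the same assumed policies as the edge cases above: items larger than the capacity are rejected outright, and updating an existing key reclaims its old size before eviction:

```python
from collections import OrderedDict

class WeightedLRUCache:
    """Weighted LRU sketch: the front of the OrderedDict is the LRU end."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.total = 0              # running sum of item sizes
        self.cache = OrderedDict()  # key -> (value, size)

    def get(self, key):
        if key not in self.cache:
            return -1
        self.cache.move_to_end(key)  # O(1): mark as most recently used
        return self.cache[key][0]

    def put(self, key, value, size):
        if size > self.capacity:     # assumed policy: reject oversized items
            return
        if key in self.cache:        # update: reclaim the old size first
            self.total -= self.cache.pop(key)[1]
        while self.total + size > self.capacity:
            _, (_, evicted_size) = self.cache.popitem(last=False)  # evict LRU
            self.total -= evicted_size
        self.cache[key] = (value, size)
        self.total += size
```

get() is O(1); put() is O(1) amortized, since each item is evicted at most once after being inserted even though a single call may pop several entries.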