🔵 Claude 3 Opus (Anthropic, 1500, +0) VS 🔴 Claude 3 Haiku (Anthropic, 1500, +0): DRAW
## 🧑‍⚖️ Judge's Verdict
Both responses provide essentially identical, correct implementations of an LRU cache with O(1) operations. Both use the same optimal approach (doubly linked list + hash map with dummy nodes), have proper edge case handling, include comprehensive test cases, and provide clear documentation. Response A has slightly more detailed explanatory text and reasoning about why the approach works, while Response B provides actual sample output and a more structured complexity analysis section. However, these are minor stylistic differences in presentation rather than substantive differences in correctness, completeness, or technical quality. The core implementations are functionally equivalent and both meet all criteria excellently.
## Claude 3 Opus's Response
# LRU Cache Implementation
I'll implement an LRU Cache with O(1) time complexity for both get and put operations using a combination of a doubly linked list and a hash map.
```python
class Node:
    """Doubly linked list node to store key-value pairs"""
    def __init__(self, key=0, value=0):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None


class LRUCache:
    """
    LRU Cache with O(1) get and put operations.

    Uses:
    - HashMap for O(1) key lookup
    - Doubly Linked List for O(1) insertion/deletion

    Most recently used items are at the head, least recently used at the tail.
    """

    def __init__(self, capacity: int):
        """
        Initialize LRU cache with given capacity.

        Args:
            capacity: Maximum number of items the cache can hold
        """
        self.capacity = capacity
        self.cache = {}  # key -> Node mapping

        # Dummy head and tail nodes to simplify edge cases
        self.head = Node()
        self.tail = Node()
        self.head.next = self.tail
        self.tail.prev = self.head

    def _remove_node(self, node: Node) -> None:
        """Remove a node from the doubly linked list."""
        prev_node = node.prev
        next_node = node.next
        prev_node.next = next_node
        next_node.prev = prev_node

    def _add_to_head(self, node: Node) -> None:
        """Add a node right after the head (most recently used position)."""
        node.prev = self.head
        node.next = self.head.next
        self.head.next.prev = node
        self.head.next = node

    def _move_to_head(self, node: Node) -> None:
        """Move an existing node to the head (mark as recently used)."""
        self._remove_node(node)
        self._add_to_head(node)

    def _remove_tail(self) -> Node:
        """Remove and return the node before the tail (least recently used)."""
        lru_node = self.tail.prev
        self._remove_node(lru_node)
        return lru_node

    def get(self, key: int) -> int:
        """
        Get the value for a key. Returns -1 if key doesn't exist.
        Marks the key as recently used.

        Time Complexity: O(1)
        """
        if key not in self.cache:
            return -1
        node = self.cache[key]
        self._move_to_head(node)  # Mark as recently used
        return node.value

    def put(self, key: int, value: int) -> None:
        """
        Insert or update a key-value pair.
        If cache is at capacity, removes the least recently used item.

        Time Complexity: O(1)
        """
        if key in self.cache:
            # Update existing key
            node = self.cache[key]
            node.value = value
            self._move_to_head(node)
        else:
            # Add new key
            new_node = Node(key, value)
            self.cache[key] = new_node
            self._add_to_head(new_node)

            if len(self.cache) > self.capacity:
                # Remove least recently used item
                lru_node = self._remove_tail()
                del self.cache[lru_node.key]

    def display(self) -> list:
        """Display cache contents from most to least recently used."""
        result = []
        current = self.head.next
        while current != self.tail:
            result.append((current.key, current.value))
            current = current.next
        return result


# Example usage and testing
def test_lru_cache():
    print("=== LRU Cache Testing ===\n")

    # Test 1: Basic operations
    print("Test 1: Basic operations")
    cache = LRUCache(2)
    cache.put(1, 1)
    cache.put(2, 2)
    print(f"After put(1,1), put(2,2): {cache.display()}")
    print(f"get(1): {cache.get(1)}")  # Returns 1
    cache.put(3, 3)  # Evicts key 2
    print(f"After put(3,3): {cache.display()}")
    print(f"get(2): {cache.get(2)}")  # Returns -1 (not found)
    cache.put(4, 4)  # Evicts key 1
    print(f"After put(4,4): {cache.display()}")
    print(f"get(1): {cache.get(1)}")  # Returns -1 (not found)
    print(f"get(3): {cache.get(3)}")  # Returns 3
    print(f"get(4): {cache.get(4)}")  # Returns 4
    print()

    # Test 2: Update existing key
    print("Test 2: Update existing key")
    cache = LRUCache(2)
    cache.put(1, 1)
    cache.put(2, 2)
    cache.put(1, 10)  # Update key 1
    print(f"After put(1,1), put(2,2), put(1,10): {cache.display()}")
    print(f"get(1): {cache.get(1)}")  # Returns 10
    print()

    # Test 3: Single capacity
    print("Test 3: Single capacity")
    cache = LRUCache(1)
    cache.put(1, 1)
    cache.put(2, 2)  # Evicts key 1
    print(f"After put(1,1), put(2,2): {cache.display()}")
    print(f"get(1): {cache.get(1)}")  # Returns -1
    print(f"get(2): {cache.get(2)}")  # Returns 2


if __name__ == "__main__":
    test_lru_cache()
```
## Key Design Decisions
### Data Structures:
1. **Hash Map (dict)**: Maps keys to nodes for O(1) lookup
2. **Doubly Linked List**: Maintains order of usage
- Most recently used items near the head
- Least recently used items near the tail
- Dummy head and tail nodes simplify edge cases
### Time Complexity:
- **get()**: O(1) - Hash map lookup + linked list operations
- **put()**: O(1) - Hash map insert/update + linked list operations
### Space Complexity:
- O(capacity) - Stores at most `capacity` nodes
### Why This Works:
1. Hash map provides instant access to any node
2. Doubly linked list allows O(1) removal and insertion
3. Moving nodes to head marks them as recently used
4. Removing from tail evicts least recently used items
This implementation is commonly used in real-world applications like web caches, database query caches, and browser history management.
## Claude 3 Haiku's Response
# LRU Cache Implementation
I'll implement an LRU Cache with O(1) time complexity for both get and put operations using a combination of a **doubly linked list** and a **hash map**.
## Strategy
- **Doubly Linked List**: Maintains order of usage (most recent at head, least recent at tail)
- **Hash Map**: Provides O(1) lookup of nodes by key
```python
class Node:
    """Doubly linked list node"""
    def __init__(self, key=0, value=0):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None


class LRUCache:
    def __init__(self, capacity: int):
        """
        Initialize LRU cache with given capacity

        Time: O(1), Space: O(capacity)
        """
        self.capacity = capacity
        self.cache = {}  # key -> Node mapping

        # Dummy head and tail nodes to simplify edge cases
        self.head = Node()
        self.tail = Node()
        self.head.next = self.tail
        self.tail.prev = self.head

    def _remove(self, node: Node) -> None:
        """Remove node from doubly linked list - O(1)"""
        prev_node = node.prev
        next_node = node.next
        prev_node.next = next_node
        next_node.prev = prev_node

    def _add_to_head(self, node: Node) -> None:
        """Add node right after head (most recently used position) - O(1)"""
        node.next = self.head.next
        node.prev = self.head
        self.head.next.prev = node
        self.head.next = node

    def _move_to_head(self, node: Node) -> None:
        """Move existing node to head - O(1)"""
        self._remove(node)
        self._add_to_head(node)

    def _remove_tail(self) -> Node:
        """Remove least recently used node (before tail) - O(1)"""
        lru_node = self.tail.prev
        self._remove(lru_node)
        return lru_node

    def get(self, key: int) -> int:
        """
        Get value for key, return -1 if not exists

        Time: O(1), Space: O(1)
        """
        if key not in self.cache:
            return -1

        # Move accessed node to head (mark as recently used)
        node = self.cache[key]
        self._move_to_head(node)
        return node.value

    def put(self, key: int, value: int) -> None:
        """
        Insert or update key-value pair

        Time: O(1), Space: O(1)
        """
        if key in self.cache:
            # Update existing key
            node = self.cache[key]
            node.value = value
            self._move_to_head(node)
        else:
            # Add new key
            new_node = Node(key, value)
            self.cache[key] = new_node
            self._add_to_head(new_node)

            # Check capacity
            if len(self.cache) > self.capacity:
                # Remove LRU item
                lru_node = self._remove_tail()
                del self.cache[lru_node.key]

    def display(self) -> list:
        """Display cache contents from most to least recent (for debugging)"""
        result = []
        current = self.head.next
        while current != self.tail:
            result.append(f"{current.key}:{current.value}")
            current = current.next
        return result


# Test Cases
def test_lru_cache():
    print("Test 1: Basic operations")
    cache = LRUCache(2)
    cache.put(1, 1)
    print(f"After put(1,1): {cache.display()}")
    cache.put(2, 2)
    print(f"After put(2,2): {cache.display()}")
    print(f"get(1): {cache.get(1)}")  # Returns 1
    print(f"After get(1): {cache.display()}")
    cache.put(3, 3)  # Evicts key 2
    print(f"After put(3,3): {cache.display()}")
    print(f"get(2): {cache.get(2)}")  # Returns -1 (not found)
    cache.put(4, 4)  # Evicts key 1
    print(f"After put(4,4): {cache.display()}")
    print(f"get(1): {cache.get(1)}")  # Returns -1
    print(f"get(3): {cache.get(3)}")  # Returns 3
    print(f"get(4): {cache.get(4)}")  # Returns 4

    print("\n" + "="*50 + "\n")

    print("Test 2: Update existing key")
    cache2 = LRUCache(2)
    cache2.put(1, 1)
    cache2.put(2, 2)
    print(f"Initial: {cache2.display()}")
    cache2.put(1, 10)  # Update value
    print(f"After put(1,10): {cache2.display()}")
    print(f"get(1): {cache2.get(1)}")  # Should return 10

    print("\n" + "="*50 + "\n")

    print("Test 3: Capacity 1")
    cache3 = LRUCache(1)
    cache3.put(1, 1)
    print(f"After put(1,1): {cache3.display()}")
    cache3.put(2, 2)  # Should evict 1
    print(f"After put(2,2): {cache3.display()}")
    print(f"get(1): {cache3.get(1)}")  # -1
    print(f"get(2): {cache3.get(2)}")  # 2


if __name__ == "__main__":
    test_lru_cache()
```
## Output
```
Test 1: Basic operations
After put(1,1): ['1:1']
After put(2,2): ['2:2', '1:1']
get(1): 1
After get(1): ['1:1', '2:2']
After put(3,3): ['3:3', '1:1']
get(2): -1
After put(4,4): ['4:4', '3:3']
get(1): -1
get(3): 3
get(4): 4
==================================================
Test 2: Update existing key
Initial: ['2:2', '1:1']
After put(1,10): ['1:10', '2:2']
get(1): 10
==================================================
Test 3: Capacity 1
After put(1,1): ['1:1']
After put(2,2): ['2:2']
get(1): -1
get(2): 2
```
## Complexity Analysis
- **Time Complexity**: O(1) for both `get` and `put`
- **Space Complexity**: O(capacity) for storing nodes and hash map entries
## Key Design Points
1. **Dummy nodes** simplify boundary conditions
2. **Hash map** provides O(1) node lookup
3. **Doubly linked list** enables O(1) removal and insertion
4. **Head = most recent**, **Tail = least recent**
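Neither response mentions it, but the same design points can be satisfied with far less code using `collections.OrderedDict`, whose `move_to_end` and `popitem` are O(1). A sketch (the class name `LRUCacheOD` is my own, not from either response):

```python
from collections import OrderedDict

class LRUCacheOD:
    """LRU cache on top of OrderedDict: most recently used key sits last."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.cache = OrderedDict()  # key -> value, insertion order tracks recency

    def get(self, key: int) -> int:
        if key not in self.cache:
            return -1
        self.cache.move_to_end(key)  # mark as most recently used
        return self.cache[key]

    def put(self, key: int, value: int) -> None:
        if key in self.cache:
            self.cache.move_to_end(key)
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict least recently used

cache = LRUCacheOD(2)
cache.put(1, 1)
cache.put(2, 2)
cache.put(3, 3)  # evicts key 1
print(cache.get(1))  # -1
print(cache.get(3))  # 3
```

The hand-rolled linked list above makes the mechanics explicit, which is what an interview usually asks for; `OrderedDict` hides the same doubly linked list behind its API.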