LRU Cache Implementation

Algorithms
Hard
Uber

Design and implement a Least Recently Used (LRU) cache. It should support `get` and `put` operations in $O(1)$ time complexity using a HashMap and a Doubly Linked List.

Why Interviewers Ask This

Uber asks this to evaluate a candidate's ability to balance memory constraints with performance requirements in high-throughput systems. They specifically test mastery of pointer manipulation, data structure composition, and the rigorous application of O(1) complexity constraints. This problem reveals whether you can architect solutions that handle real-time ride-matching latency without degrading system stability under heavy load.

How to Answer This Question

  1. Clarify the cache capacity and the expected behavior when `put` is called with an existing key.
  2. Propose the dual-structure architecture: a HashMap for O(1) key lookup and a Doubly Linked List to maintain access order efficiently.
  3. Outline the logic for each operation separately. For `get`, explain moving the accessed node to the head and returning the value. For `put`, describe adding new nodes at the head or updating existing ones, then evicting the tail node if capacity is exceeded.
  4. Walk through a specific scenario, such as inserting three items into a size-two cache, to demonstrate the eviction logic visually.
  5. Analyze the time and space complexity, explicitly confirming O(1) for both operations, and discuss edge cases like zero capacity or null inputs.
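The size-two walkthrough can be sketched quickly in Python, using `collections.OrderedDict` as a stand-in for the HashMap-plus-linked-list pair (an assumption for brevity; in an interview you would typically build the linked list by hand):

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU sketch: move_to_end/popitem give O(1) recency updates."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return -1
        self.data.move_to_end(key)   # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)   # evict the least recently used entry

cache = LRUCache(2)
cache.put(1, "a")
cache.put(2, "b")
cache.put(3, "c")    # third insert exceeds capacity: key 1 is evicted
cache.get(1)         # returns -1 (evicted)
cache.get(3)         # returns "c"
```

This mirrors the scenario above: after the third `put`, the oldest entry (key 1) is gone while the two most recent survive.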

Key Points to Cover

  • Explicitly state the O(1) time complexity requirement before writing code
  • Demonstrate clear understanding of how the HashMap and Doubly Linked List interact
  • Correctly handle the edge case of updating an existing key versus inserting a new one
  • Show precise pointer manipulation when moving nodes to the head or removing the tail
  • Discuss space complexity and how the solution scales with cache capacity
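As a minimal sketch of the pointer manipulation these points describe (Python; the `remove` and `add_to_front` helper names are illustrative, and sentinel head/tail nodes are used to avoid null checks at the boundaries):

```python
class Node:
    def __init__(self, key=None, value=None):
        self.key, self.value = key, value
        self.prev = self.next = None

# Sentinel head and tail: the real entries always sit between them.
head, tail = Node(), Node()
head.next, tail.prev = tail, head

def remove(node):
    """Unlink a node in O(1) by rewiring both neighbours past it."""
    node.prev.next = node.next
    node.next.prev = node.prev

def add_to_front(node):
    """Insert right after the head sentinel (the most-recently-used slot)."""
    node.prev = head
    node.next = head.next
    head.next.prev = node
    head.next = node

a = Node(1, "a")
add_to_front(a)          # head <-> a <-> tail
b = Node(2, "b")
add_to_front(b)          # head <-> b <-> a <-> tail
remove(a)                # head <-> b <-> tail
```

Both helpers touch a fixed number of pointers regardless of list length, which is what makes the O(1) guarantee possible.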

Sample Answer

To implement an LRU Cache with O(1) operations, I will combine a HashMap and a Doubly Linked List. The HashMap maps keys directly to their corresponding list nodes, enabling instant access. The Doubly Linked List maintai…
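One way the full approach might look, as a hedged Python sketch (a plain `dict` plays the HashMap role; `Node`, `_remove`, and `_add_front` are illustrative names, not taken from the original answer):

```python
class Node:
    def __init__(self, key=0, value=0):
        self.key, self.value = key, value
        self.prev = self.next = None

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.map = {}                              # key -> Node (HashMap role)
        self.head, self.tail = Node(), Node()      # sentinel nodes
        self.head.next, self.tail.prev = self.tail, self.head

    def _remove(self, node):
        node.prev.next = node.next
        node.next.prev = node.prev

    def _add_front(self, node):
        node.prev, node.next = self.head, self.head.next
        self.head.next.prev = node
        self.head.next = node

    def get(self, key):
        if key not in self.map:
            return -1
        node = self.map[key]
        self._remove(node)        # promote to most recently used
        self._add_front(node)
        return node.value

    def put(self, key, value):
        if self.capacity <= 0:
            return                # degenerate cache stores nothing
        if key in self.map:
            self._remove(self.map[key])
        node = Node(key, value)
        self.map[key] = node
        self._add_front(node)
        if len(self.map) > self.capacity:
            lru = self.tail.prev              # least recently used node
            self._remove(lru)
            del self.map[lru.key]
```

Storing the key inside each node is what lets eviction delete the right HashMap entry in O(1); space usage is O(capacity), one node plus one map entry per cached key.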

Common Mistakes to Avoid

  • Using a standard List or Array instead of a Doubly Linked List, making the reordering of elements an O(n) operation
  • Failing to update the pointers correctly when moving a node to the head, causing list corruption
  • Not checking whether the cache has reached capacity before inserting a new element, leading to unbounded memory growth
  • Ignoring the scenario where the input capacity is zero or negative, causing runtime errors
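To make the pointer-corruption mistake concrete, here is a small Python sketch contrasting a buggy assignment order with a correct one when adding a node at the front (all names are illustrative):

```python
class Node:
    def __init__(self, key=None):
        self.key = key
        self.prev = self.next = None

def fresh_list():
    head, tail = Node("H"), Node("T")
    head.next, tail.prev = tail, head
    return head, tail

# Buggy order: overwriting head.next before reading it loses the rest of the list.
head, tail = fresh_list()
n = Node(1)
head.next = n
n.next = head.next   # head.next is already n, so the node points at itself
n.prev = head
buggy_self_loop = n.next is n          # True: tail is now unreachable

# Correct order: wire the new node's pointers first, then update the neighbours.
head, tail = fresh_list()
n = Node(1)
n.prev, n.next = head, head.next
head.next.prev = n
head.next = n
linked_correctly = n.next is tail and tail.prev is n   # True: head <-> n <-> tail
```

The fix is purely about sequencing: read every pointer you are about to overwrite before you clobber it.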

