The Sunday Solver: LRU Cache

Tackle a common coding interview question: the LRU Cache.


Interview questions can be hard to think through on the spot, so the best way to tackle them is to understand how to approach them. Here, we will tackle a common interview question: creating an LRU Cache. An LRU Cache is a common type of replacement scheme used when retrieving data from a database. Often, a database holds far more data than can fit in the memory of any given user’s computer. So what happens when application memory runs out? The application must replace some of the information it already holds, and LRU, or Least Recently Used, is a specific type of replacement policy. This may seem a little complicated at first, so let’s break it down. Say the database is filled with letters and contains A, W, A, R, E.

Now, say a user’s computer can only hold 3 letters at once. If the user tries to read all the letters, the computer will first fetch as many letters as it can. It first fetches A, then W. Since A is asked for again, it becomes the most recently used item, and then R is fetched. So, in increasing order of use we have W, A, R, but now the computer is out of memory! An LRU policy would see that the letter W was accessed least recently and would replace that letter with E. Now the computer has A, R, E in increasing order of use. This can go on for even longer lists.
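Before we write any classes, the walkthrough above can be sketched as a small simulation. This is just a sketch using Java’s built-in ArrayDeque (the class and method names here are for illustration only, not part of the final cache):

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class LruTrace {
    // Replays an access sequence against a capacity-limited deque:
    // the front holds the least recently used item, the back the most recent.
    static String replay(String[] accesses, int capacity) {
        Deque<String> cache = new ArrayDeque<>();
        for (String item : accesses) {
            if (cache.remove(item)) {
                // already cached: it just became the most recently used
                cache.addLast(item);
            } else {
                if (cache.size() == capacity) {
                    cache.removeFirst(); // evict the least recently used
                }
                cache.addLast(item);
            }
        }
        return String.join(" ", cache);
    }

    public static void main(String[] args) {
        // The A, W, A, R, E walkthrough from the text, with capacity 3
        System.out.println(replay(new String[]{"A", "W", "A", "R", "E"}, 3));
        // prints: A R E
    }
}
```

Running it reproduces the hand trace: W is evicted when E arrives, leaving A, R, E.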

Now that we have a better understanding of the problem, we can start writing the code for the LRU Cache. I decided to tackle this problem in Java because the utilities we need are readily available in the standard library. So first, let’s create the LRUCache class. We want a generic class that can store values of any type. An LRU Cache also has a limit on the number of items it can store at once, so we want to instantiate it with a given limit. So far:

public class LRUCache<K> {
    int limit;

    LRUCache(int limit) {
        this.limit = limit;
    }
}

Alright, well that wasn’t so bad. We also know that keeping track of all the values we add is important. Keeping track of values is easiest in a list, and it seems that we will often be adding to the end of this list and removing from the front or the middle. A LinkedList is cheap to add to and remove from at either end (finding an item in the middle still takes a scan), so we will import LinkedList from java.util.

import java.util.LinkedList;

public class LRUCache<K> {
    LinkedList<K> values;
    int limit;

    LRUCache(int limit) {
        this.values = new LinkedList<K>();
        this.limit = limit;
    }
}

When we try to add an item to the LRU cache, we must first check whether it already exists in the list. If it does, we remove it and add it back at the end of the list. Otherwise, we check whether our values list has already reached the limit we set earlier; if it has, we remove the first item from the list before adding the new value to the end, otherwise we simply add the value to the end. The first case makes the item the most recently used, signifying that it should be removed later rather than sooner. The second case evicts the least recently used item to make room for the new value. This seems pretty straightforward, so let’s write two methods, add and exists, that each return a value of type K.

K add(K value) {
    K checkVar = this.exists(value);
    if (checkVar == null) { // value is not in the list
        if (this.values.size() >= this.limit) {
            // limit reached: evict the first (least recently used) item
            this.values.removeFirst();
        }
        this.values.addLast(value);
        return value;
    } else {
        // item is in the list, so remove it and re-add it at the end
        this.values.remove(checkVar);
        this.values.addLast(checkVar);
        return checkVar;
    }
}

K exists(K value) {
    // go through each item and check whether it equals the value
    for (K access : this.values) {
        if (access.equals(value)) {
            return access;
        }
    }
    // item is not in the list
    return null;
}

Alright, well that was pretty straightforward. If we try running this program, we can simulate an LRU cache by creating a main function and a print function just to test the output of the program. Add these two functions inside the class:

void printList() {
    for (K node : this.values) {
        System.out.print(node + " ");
    }
    System.out.println();
}

public static void main(String[] args) {
    String[] testList = new String[]{"A", "W", "A", "R", "E"};
    int testLimit = 3;
    LRUCache<String> testCache = new LRUCache<String>(testLimit);
    for (String i : testList) {
        testCache.add(i);
    }
    testCache.printList();
}

Testing this out, we see that we get the same results that we expected from earlier: A, R, E! You can try out the cache with other types and make it even better by implementing a private Node class. See my final solution here.
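As a starting point for that refactor, a private Node wrapper might look like this. This is only a sketch (the class shape and the `current` field name are assumptions, not the final solution linked above); the wrapper buys you a place to hang extra bookkeeping on each cached value later:

```java
import java.util.LinkedList;

public class NodeLRUCache<K> {
    // A private Node wrapper gives each cached value room for extra
    // bookkeeping (timestamps, hit counts) without changing the cache API.
    private static class Node<V> {
        final V current;
        Node(V current) { this.current = current; }
    }

    private final LinkedList<Node<K>> values = new LinkedList<>();
    private final int limit;

    NodeLRUCache(int limit) { this.limit = limit; }

    K add(K value) {
        for (Node<K> node : values) {
            if (node.current.equals(value)) {
                values.remove(node);   // hit: move the node to the end
                values.addLast(node);
                return node.current;
            }
        }
        if (values.size() >= limit) {
            values.removeFirst();      // evict the least recently used
        }
        values.addLast(new Node<K>(value));
        return value;
    }

    // Returns the cached values from least to most recently used.
    String order() {
        StringBuilder sb = new StringBuilder();
        for (Node<K> node : values) {
            if (sb.length() > 0) sb.append(' ');
            sb.append(node.current);
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        NodeLRUCache<String> cache = new NodeLRUCache<>(3);
        for (String s : new String[]{"A", "W", "A", "R", "E"}) {
            cache.add(s);
        }
        System.out.println(cache.order()); // prints: A R E
    }
}
```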

Extra for Experts

I thought I was really cool and came up with a great solution when I first tackled this problem, but it turns out there is an even faster one. If you think about it, the exists method takes O(N) time because it has to traverse the whole list to find the value we are looking for. On your own: what data structure allows you to “get” an item in constant time? Try combining that data structure with our current algorithm. This small change makes every cache operation take constant time, which is about as good as it gets.
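If you would rather see one answer than work it out yourself (spoilers ahead), Java’s standard library happens to ship a structure built for exactly this. The sketch below uses LinkedHashMap with accessOrder set to true, which keeps entries ordered from least to most recently used and lets removeEldestEntry handle eviction; this is one well-known approach, not necessarily the solution the exercise intends:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class FastLRUCache<K> {
    // With accessOrder=true, every get/put moves the entry to the back,
    // and removeEldestEntry evicts the front entry once we exceed limit.
    private final LinkedHashMap<K, K> values;

    FastLRUCache(final int limit) {
        this.values = new LinkedHashMap<K, K>(16, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<K, K> eldest) {
                return size() > limit;
            }
        };
    }

    K add(K value) {
        values.put(value, value); // average O(1), and counts as a "use"
        return value;
    }

    // Returns the cached values from least to most recently used.
    String order() {
        StringBuilder sb = new StringBuilder();
        for (K key : values.keySet()) {
            if (sb.length() > 0) sb.append(' ');
            sb.append(key);
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        FastLRUCache<String> cache = new FastLRUCache<>(3);
        for (String s : new String[]{"A", "W", "A", "R", "E"}) {
            cache.add(s);
        }
        System.out.println(cache.order()); // prints: A R E
    }
}
```

Both lookup and eviction are now average constant time, because the hash table finds entries directly and the internal doubly linked list moves them without scanning.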

Hello, I’m Hershal Bhatia, currently a Software Engineer at Amazon – AWS, where I am building out a new service.
