Wednesday, June 23, 2010

Cache - Least Recently Used Algorithm

The LRU (Least Recently Used) algorithm is used in most cache implementations. In practice, though, we often depend on heavyweight caching products and end up paying a lot for that software, even when a simple eviction policy would do.

java.util.LinkedHashMap provides the simplest way to do this: construct it in access order and override its removeEldestEntry() method.
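
For example, the whole idea fits in an anonymous subclass. Below is a minimal sketch (the class name SimpleLruSketch and the limit of 3 entries are just for illustration):

SimpleLruSketch.java
import java.util.LinkedHashMap;
import java.util.Map;

public class SimpleLruSketch {

 private static final int MAX_ENTRIES = 3; // illustrative limit

 public static void main(String[] args) {
  // accessOrder = true orders entries by access instead of insertion
  Map<String, String> cache = new LinkedHashMap<String, String>(16, 0.75f, true) {
   @Override
   protected boolean removeEldestEntry(Map.Entry<String, String> eldest) {
    // evict the least recently used entry once the limit is exceeded
    return size() > MAX_ENTRIES;
   }
  };

  cache.put("a", "1");
  cache.put("b", "2");
  cache.put("c", "3");
  cache.get("a");            // "a" becomes most recently used
  cache.put("d", "4");       // "b", the least recently used entry, is evicted
  System.out.println(cache); // {c=3, a=1, d=4}
 }
}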

Here, I would like to present a utility class that can be reused for our needs. The new class LRUHashMap extends LinkedHashMap, and the cacheSize limit has to be passed as a constructor argument.

LRUHashMap Implementation

LRUHashMap.java 
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

/**
 * LRUHashMap implements the Least Recently Used algorithm to store and retrieve
 * values from a {@link Map}.
 */

public class LRUHashMap<K, V> extends LinkedHashMap<K, V> {

 private static final long serialVersionUID = -6805360112277349979L;
 private final static int DEFAULT_INITIALCAPACITY = 10;
 private final static float LOADFACTOR = 0.75f;

 /**
  * Maximum number of entries that can be kept in this {@link Map} at any given time.
  */
 private final int cacheSize;

 public LRUHashMap() {
  super(DEFAULT_INITIALCAPACITY, LOADFACTOR, true);
  // No explicit limit given: effectively unbounded, nothing is ever evicted
  cacheSize = Integer.MAX_VALUE;
 }

 public LRUHashMap(int cacheSize) {
  super(DEFAULT_INITIALCAPACITY, LOADFACTOR, true);
  this.cacheSize = cacheSize;
 }
 
 /**
  * @param initialCapacity initial capacity of the underlying map
  * @param cacheSize maximum number of entries to keep
  */
 public LRUHashMap(int initialCapacity, int cacheSize) {
  super(initialCapacity, LOADFACTOR, true);
  this.cacheSize = cacheSize;
 }

 /**
  * @param m map whose entries are copied into this cache
  * @param cacheSize maximum number of entries to keep
  */
 public LRUHashMap(Map<K, V> m, int cacheSize) {
  super(DEFAULT_INITIALCAPACITY, LOADFACTOR, true);
  this.cacheSize = cacheSize;
  // Copy the entries only after the limit is set, so eviction works as expected
  putAll(m);
 }

 /**
  * @param initialCapacity initial capacity of the underlying map
  * @param loadFactor load factor of the underlying map
  * @param cacheSize maximum number of entries to keep
  */
 public LRUHashMap(int initialCapacity, float loadFactor, int cacheSize) {
  super(initialCapacity, loadFactor, true);
  this.cacheSize = cacheSize;
 }

 /**
  * @return a thread-safe view of this map, backed by
  *         {@link Collections#synchronizedMap(Map)}
  */
 public Map<K, V> getsynchronizedMap() {
  return Collections.synchronizedMap(this);
 }

 @Override
 protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
  // Called by LinkedHashMap after every insertion: evict when over the limit
  return size() > cacheSize;
 }
}

Test

LRUHashMapTest.java

import java.util.Map;

public class LRUHashMapTest {

 public static void main(String[] args) {
  Map<Integer, String> map = new LRUHashMap<Integer, String>(10);
  
  // Uncomment to wrap the cache in a thread-safe view
  //map = ((LRUHashMap<Integer, String>) map).getsynchronizedMap();
  
  for (int i = 0; i < 100; i++) {
   map.put(i, "valueof :" + i);

   // Touch the older keys (i - 5 down to 0); those still cached become recently used
   for (int j = i - 5; j >= 0; j--)
    map.get(j);
  }

  System.out.println(map);
 }
}

Here the cache size is set to 10, so at any given time at most 10 entries are stored in the map. Additionally, whenever a read access happens, that entry is moved from its current position to the end of the map (the most recently used position), and the entry at the head of the map is the next candidate for eviction. That is why the recently read keys 0 to 4 survive alongside the last inserted keys 95 to 99 in the output below.

Output:

{95=valueof :95, 96=valueof :96, 97=valueof :97, 98=valueof :98, 99=valueof :99,
 4=valueof :4, 3=valueof :3, 2=valueof :2, 1=valueof :1, 0=valueof :0}
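
The reordering on read comes from the accessOrder flag of LinkedHashMap itself and can be seen in isolation, without any eviction. A small sketch (the class name is just illustrative):

AccessOrderDemo.java
import java.util.LinkedHashMap;
import java.util.Map;

public class AccessOrderDemo {

 public static void main(String[] args) {
  // accessOrder = true: iteration goes from least to most recently accessed
  Map<String, Integer> map = new LinkedHashMap<String, Integer>(16, 0.75f, true);

  map.put("a", 1);
  map.put("b", 2);
  map.put("c", 3);
  System.out.println(map); // {a=1, b=2, c=3}

  map.get("a");            // reading "a" moves it to the end
  System.out.println(map); // {b=2, c=3, a=1}
 }
}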

Thread Safe

The implementation above is not thread-safe on its own. If our application is multi-threaded, we may need to fall back to Collections.synchronizedMap, which we can obtain by calling getsynchronizedMap() on LRUHashMap:
Map<Integer, String> map = new LRUHashMap<Integer, String>(10).getsynchronizedMap();
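
Note that even with the synchronized wrapper, iterating over the map's views still has to be synchronized manually on the map object, as documented for Collections.synchronizedMap. A short sketch (the class name is just illustrative):

SynchronizedIterationDemo.java
import java.util.Map;

public class SynchronizedIterationDemo {

 public static void main(String[] args) {
  Map<Integer, String> map = new LRUHashMap<Integer, String>(10).getsynchronizedMap();
  map.put(1, "one");
  map.put(2, "two");

  // Individual put/get calls are synchronized by the wrapper, but iteration
  // must be guarded manually on the map itself
  synchronized (map) {
   for (Map.Entry<Integer, String> entry : map.entrySet()) {
    System.out.println(entry.getKey() + " -> " + entry.getValue());
   }
  }
 }
}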
