LRU implementation in production code

Recently I implemented an LRU cache using a linked list spread over a hash map.

    /// Typedef for URL/Entry pair
    typedef std::pair< std::string, Entry > EntryPair;

    /// Typedef for Cache list
    typedef std::list< EntryPair > CacheList;

    /// Typedef for URL-indexed map into the CacheList
    typedef boost::unordered_map< std::string, CacheList::iterator > CacheMap;

    /// Cache LRU list
    CacheList …

Make @lru_cache ignore some of the function arguments

With cachetools you can write:

    from cachetools import cached
    from cachetools.keys import hashkey
    from random import randint

    @cached(cache={}, key=lambda db_handle, query: hashkey(query))
    def find_object(db_handle, query):
        print("processing {0}".format(query))
        return query

    queries = list(range(5))
    queries.extend(range(5))
    for q in queries:
        print("result: {0}".format(find_object(randint(0, 1000), q)))

You will need to install cachetools (pip install cachetools). The syntax is:

    @cached(
        cache={},
        …

Python functools lru_cache with instance methods: release object

This is not the cleanest solution, but it's entirely transparent to the programmer:

    import functools
    import weakref

    def memoized_method(*lru_args, **lru_kwargs):
        def decorator(func):
            @functools.wraps(func)
            def wrapped_func(self, *args, **kwargs):
                # We're storing the wrapped method inside the instance. If we had
                # a strong reference to self the instance would never die.
                self_weak = weakref.ref(self)
                @functools.wraps(func)
                @functools.lru_cache(*lru_args, …
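The excerpt above is cut off mid-decorator. A complete sketch along the same lines (the body after the truncation point is an assumption, filled in from the pattern the comment describes: hold only a weak reference to self, build a per-instance lru_cache, and install it on the instance) might look like:

```python
import functools
import weakref

def memoized_method(*lru_args, **lru_kwargs):
    """Sketch: lru_cache for instance methods that lets instances be collected."""
    def decorator(func):
        @functools.wraps(func)
        def wrapped_func(self, *args, **kwargs):
            # Keep only a weak reference to self, so the cache does not
            # keep the instance alive forever.
            self_weak = weakref.ref(self)

            @functools.wraps(func)
            @functools.lru_cache(*lru_args, **lru_kwargs)
            def cached_method(*args, **kwargs):
                return func(self_weak(), *args, **kwargs)

            # Replace this wrapper with the per-instance cached version,
            # so subsequent calls go straight to the cache.
            setattr(self, func.__name__, cached_method)
            return cached_method(*args, **kwargs)
        return wrapped_func
    return decorator

# Hypothetical example class, for illustration only.
class Squarer:
    @memoized_method(maxsize=16)
    def square(self, n):
        return n * n
```

After the first call, `instance.square` is the per-instance cached function, so `cache_info()` and `cache_clear()` are available on it.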

LRU cache design

A linked list plus a hashtable of pointers to the linked-list nodes is the usual way to implement LRU caches. This gives O(1) operations (assuming a decent hash). An advantage of everything being O(1) is that you can build a multithreaded version by simply locking the whole structure; you don't have to worry about granular locking and the like. Briefly, …

Easy, simple to use LRU cache in java

You can use a LinkedHashMap (Java 1.4+):

    // Create cache
    final int MAX_ENTRIES = 100;
    Map cache = new LinkedHashMap(MAX_ENTRIES + 1, .75F, true) {
        // This method is called just after a new entry has been added
        public boolean removeEldestEntry(Map.Entry eldest) {
            return size() > MAX_ENTRIES;
        }
    };

    // Add to cache
    Object key = …

How would you implement an LRU cache in Java?

I like lots of these suggestions, but for now I think I'll stick with LinkedHashMap + Collections.synchronizedMap. If I do revisit this in the future, I'll probably work on extending ConcurrentHashMap in the same way LinkedHashMap extends HashMap.

UPDATE: By request, here's the gist of my current implementation.

    private class LruCache<A, B> extends LinkedHashMap<A, B> …
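The LinkedHashMap + Collections.synchronizedMap combination amounts to an access-ordered map behind one coarse lock. A rough Python analogue of that idea (a hedged sketch, not the author's code; names are my own) would be:

```python
import threading
from collections import OrderedDict

class SynchronizedLRUCache:
    """Coarse-grained locking, analogous to Collections.synchronizedMap:
    every operation takes one lock over the whole structure."""

    def __init__(self, capacity):
        self._capacity = capacity
        self._data = OrderedDict()
        self._lock = threading.Lock()

    def get(self, key, default=None):
        with self._lock:
            if key not in self._data:
                return default
            # Record the access, mirroring accessOrder=true in LinkedHashMap.
            self._data.move_to_end(key)
            return self._data[key]

    def put(self, key, value):
        with self._lock:
            self._data[key] = value
            self._data.move_to_end(key)
            if len(self._data) > self._capacity:
                # Same role as removeEldestEntry returning true.
                self._data.popitem(last=False)
```

Because every operation is O(1), holding one lock for the whole structure is usually acceptable; finer-grained locking is what the ConcurrentHashMap-based alternative mentioned above would buy.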