I've just finished my Java implementation of a multi-level pseudo-LRU cache.
The abstract code is just 10KB, and a simple direct implementation of the abstract classes is a kilobyte and a half.
Initially, I thought of implementing an LFU cache, but then I realised that for caches that are large enough, the pseudo-LRU algorithm would work nicely.
The four main actions on a cache are:
- adding an item to the cache
- getting an item from the cache
- invalidating an item in the cache
- trashing the whole cache
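The four actions above could be captured in Java roughly as follows; this is a hypothetical sketch with illustrative names (`Cache`, `SimpleCache`, `trash`), not the actual classes from the implementation, and the `HashMap`-backed version stands in for the "simple direct implementation" mentioned earlier:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical interface naming the four cache actions.
interface Cache<K, V> {
    void put(K key, V value);   // adding an item to the cache
    V get(K key);               // getting an item (null if absent)
    void invalidate(K key);     // invalidating an item in the cache
    void trash();               // trashing the whole cache
}

// Minimal unbounded implementation backed by a HashMap -- no eviction
// policy, just the four actions.
class SimpleCache<K, V> implements Cache<K, V> {
    private final Map<K, V> map = new HashMap<>();
    public void put(K key, V value) { map.put(key, value); }
    public V get(K key)             { return map.get(key); }
    public void invalidate(K key)   { map.remove(key); }
    public void trash()             { map.clear(); }
}
```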
What's special about my cache is that:
- It can chain multiple caches, so that items removed from a front cache are not immediately lost, but moved to a back cache.
On the other hand, one can have multiple smaller specific front caches, all backed up by a single larger back cache. This would help if there's a tendency to have items of the same type at a given time, effectively expanding their local caches.
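The chaining idea might look something like the following sketch. All names are illustrative, and `LinkedHashMap` in access-order mode stands in for the pseudo-LRU policy; the point is only the hand-off of evicted entries to an optional back cache, and the fall-through on a front-cache miss:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch of chaining: a bounded front cache hands evicted
// entries to an optional back cache instead of dropping them.
class ChainedCache<K, V> {
    private final ChainedCache<K, V> back;  // may be null (end of chain)
    private final Map<K, V> map;

    ChainedCache(int capacity, ChainedCache<K, V> back) {
        this.back = back;
        // Access-order LinkedHashMap as a stand-in for pseudo-LRU.
        this.map = new LinkedHashMap<K, V>(16, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                if (size() > capacity) {
                    // Move the evicted item to the back cache, if any.
                    if (ChainedCache.this.back != null) {
                        ChainedCache.this.back.put(eldest.getKey(), eldest.getValue());
                    }
                    return true;
                }
                return false;
            }
        };
    }

    void put(K key, V value) { map.put(key, value); }

    // On a front-cache miss, fall through to the back cache and
    // promote the item back into the front cache on a hit there.
    V get(K key) {
        V v = map.get(key);
        if (v == null && back != null) {
            v = back.get(key);
            if (v != null) put(key, v);
        }
        return v;
    }
}
```

Several independent front caches can share one `back` instance, which gives the many-to-one layout described above.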
- When there's not enough space in the cache, items need to be removed. Often only a few items need to be removed at a time, but removing a very small number of items means that the next addition may again find not enough space in the cache and trigger another cleaning. To remedy this inefficiency, one can set a minimum limit on the amount of space to be freed whenever trash cleaning is triggered.
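The batch-eviction point can be sketched as below. This is an assumption-laden toy (illustrative names, item count as the "space" measure, plain FIFO eviction order for brevity instead of pseudo-LRU); it only demonstrates freeing at least a minimum number of slots per cleaning pass:

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: when the cache is full, one cleaning pass frees
// at least minFree slots, so the next few additions do not each
// trigger another pass. FIFO eviction stands in for pseudo-LRU.
class BatchEvictingCache<K, V> {
    private final int capacity;
    private final int minFree;  // minimum slots freed per cleaning pass
    private final Map<K, V> map = new HashMap<>();
    private final Deque<K> order = new ArrayDeque<>();

    BatchEvictingCache(int capacity, int minFree) {
        this.capacity = capacity;
        this.minFree = minFree;
    }

    void put(K key, V value) {
        if (!map.containsKey(key) && map.size() >= capacity) {
            // Free at least minFree slots in a single pass.
            for (int i = 0; i < minFree && !order.isEmpty(); i++) {
                map.remove(order.removeFirst());
            }
        }
        if (!map.containsKey(key)) order.addLast(key);
        map.put(key, value);
    }

    V get(K key) { return map.get(key); }
    int size()   { return map.size(); }
}
```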