Cache replacement policies

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by Chiccodoro (talk | contribs) at 09:19, 15 June 2007 (Inserted content of Most Recently Used.).

Cache algorithms (also frequently called replacement algorithms or replacement policies) are optimizing instructions, or algorithms, that a computer program or a hardware-maintained structure can follow to manage a cache of information stored on the computer. Cache size is usually limited, so when the cache is full, the program or hardware must decide which items to keep and which to discard to make room for new items.

Examples of caching algorithms are:

Least Recently Used (LRU)
LRU discards the least recently used items first. This algorithm requires keeping track of what was used when, which is expensive if one wants to make sure the algorithm always discards the least recently used item.
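The bookkeeping LRU requires can be illustrated with a minimal Python sketch (not part of the original article): an OrderedDict keeps keys in recency order, so the least recently used key is always at the front. The class and method names here are illustrative, not a standard API.

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU cache sketch: an OrderedDict keeps keys in
    recency order, so eviction pops the least recently used key."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)         # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict least recently used
```

For example, with a capacity of two, inserting "a" and "b", touching "a", and then inserting "c" evicts "b", since "a" was used more recently.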
Most Recently Used (MRU)
MRU discards, in contrast to LRU, the most recently used items first. This caching mechanism is used when access is unpredictable, and when determining the least recently used section of the cache system is an operation of high time complexity. A common example of this is database memory caches.
Pseudo-LRU (PLRU)
For caches with large associativity (generally >4 ways), the implementation cost of LRU becomes prohibitive. If a probabilistic scheme that almost always discards one of the least recently used items is sufficient, the Pseudo-LRU algorithm can be used, which needs only about one bit per cache item to work.
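One common realization is tree-based Pseudo-LRU, sketched below in Python (an illustration, not drawn from the article). Each internal node of a binary tree over the ways holds one bit pointing toward the less recently used half; following the bits from the root picks a victim, and each access flips the bits on its path to point away from the line just used.

```python
class TreePLRU:
    """Tree-based Pseudo-LRU sketch for a power-of-two number of ways.
    Uses ways-1 bits in total, i.e. roughly one bit per cache item."""

    def __init__(self, ways):
        assert ways & (ways - 1) == 0, "ways must be a power of two"
        self.ways = ways
        self.bits = [0] * (ways - 1)   # one bit per internal tree node

    def touch(self, way):
        """Record an access: make every bit on the path point away."""
        node, lo, hi = 0, 0, self.ways
        while hi - lo > 1:
            mid = (lo + hi) // 2
            if way < mid:
                self.bits[node] = 1            # point toward the right half
                node, hi = 2 * node + 1, mid
            else:
                self.bits[node] = 0            # point toward the left half
                node, lo = 2 * node + 2, mid

    def victim(self):
        """Follow the bits from the root to a pseudo-LRU way."""
        node, lo, hi = 0, 0, self.ways
        while hi - lo > 1:
            mid = (lo + hi) // 2
            if self.bits[node] == 0:
                node, hi = 2 * node + 1, mid   # bit 0: go left
            else:
                node, lo = 2 * node + 2, mid   # bit 1: go right
        return lo
```

After accessing ways 0, 1, 2, 3 in order in a 4-way set, the tree selects way 0 as the victim, which here coincides with true LRU; in general the scheme only approximates it.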
Least Frequently Used (LFU)
LFU counts how often an item is needed. Those that are used least often are discarded first.
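The counting that LFU relies on can be sketched in Python as follows (an illustrative example with made-up names, not a standard API): a Counter records accesses, and eviction removes the key with the smallest count.

```python
from collections import Counter

class LFUCache:
    """Minimal LFU sketch: a Counter tracks how often each key is
    used; eviction removes the key with the lowest count."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = {}
        self.counts = Counter()

    def get(self, key):
        if key not in self.data:
            return None
        self.counts[key] += 1
        return self.data[key]

    def put(self, key, value):
        if key not in self.data and len(self.data) >= self.capacity:
            victim = min(self.data, key=self.counts.__getitem__)
            del self.data[victim]
            del self.counts[victim]
        self.data[key] = value
        self.counts[key] += 1
```

With a capacity of two, inserting "a" and "b" and then reading "a" twice makes "b" the least frequently used item, so inserting "c" evicts "b".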
Belady's Min
The most efficient caching algorithm would be to always discard the information that will not be needed for the longest time in the future. Since it is generally impossible to predict how far in the future information will be needed, this is not implementable in hardware; however, in simulations over pre-recorded access sequences it can be used as a gauge of the effectiveness of other algorithms.
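In such a simulation, the victim choice is straightforward because the future access sequence is known in advance. A small Python sketch (illustrative, assuming the future sequence is supplied as a list):

```python
def belady_evict(cache_keys, future_accesses):
    """Belady's MIN sketch: given the current cache contents and the
    (fully known) list of future accesses, pick the item whose next
    use is farthest in the future, or one never used again."""
    farthest_key, farthest_dist = None, -1
    for key in cache_keys:
        try:
            dist = future_accesses.index(key)
        except ValueError:
            return key                 # never needed again: ideal victim
        if dist > farthest_dist:
            farthest_key, farthest_dist = key, dist
    return farthest_key
```

For instance, if the cache holds "a", "b", and "c" and the future accesses are ["b", "a", "c"], the victim is "c", since its next use lies farthest ahead.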
Adaptive Replacement Cache (ARC)
ARC improves on LRU by constantly balancing between recency and frequency.

Other things to consider:

  • Cost: if items have different costs, keep those items that are expensive to obtain, e.g. those that take a long time to get.
  • Size: If items have different sizes, the cache may want to discard a large item to store several smaller ones.
  • Time: Some caches keep information that expires (e.g. a news cache, a DNS cache, or a web browser cache). The computer may discard items because they have expired. Depending on the size of the cache, no further caching algorithm to discard items may be necessary.
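The time-based consideration above can be sketched in Python (an illustrative example, not a standard API): each entry is stored with an expiry timestamp and silently dropped once that time has passed.

```python
import time

class TTLCache:
    """Sketch of an expiring cache: entries carry an expiry time
    and are discarded on access once that time has passed."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.data = {}                 # key -> (value, expires_at)

    def put(self, key, value):
        self.data[key] = (value, time.monotonic() + self.ttl)

    def get(self, key):
        entry = self.data.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self.data[key]         # expired: discard on access
            return None
        return value
```

Note that, as the article observes, a cache like this may need no separate replacement algorithm if expiry alone keeps it within its size bound.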

Various algorithms also exist to maintain cache coherency.
