
ConcurrentLfu


ConcurrentLfu approximates an LFU cache using the W-TinyLfu eviction policy. W-TinyLfu tracks items using a window LRU list and a main space LRU divided into protected and probation segments. Reads and writes to the cache are stored in buffers and later applied to the policy LRU lists in batches under a lock. Each read and write is recorded in a compact popularity sketch that probabilistically estimates item frequency.
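
For example, a bounded ConcurrentLfu can be created and used as below. This is a minimal sketch assuming the BitFaster.Caching.Lfu namespace and the ICache-style GetOrAdd/TryGet methods; the capacity and key/value types are arbitrary.

```csharp
using System;
using BitFaster.Caching.Lfu;

// Minimal usage sketch: a ConcurrentLfu bounded to 128 items.
var lfu = new ConcurrentLfu<int, string>(128);

// GetOrAdd returns the cached value, invoking the factory only on a miss.
// The access is buffered and later applied to the policy lists under a lock.
string value = lfu.GetOrAdd(42, k => k.ToString());

// TryGet looks up a value without adding one on a miss.
if (lfu.TryGet(42, out string cached))
{
    Console.WriteLine(cached);
}
```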

Items proceed through the LRU lists as follows:

  • New items are added to the window LRU. When accessed, window items move to the window MRU position.
  • When the window is full, candidate items are moved to the probation segment in LRU order.
  • When the main space is full, the access frequency of each window candidate is compared to that of probation victims in LRU order, and the item with the lower frequency is evicted. This repeats until the cache size is within bounds (a simplified sketch of this admission decision follows the list).
  • When a probation item is accessed, it is moved to the protected segment. If the protected segment is full, the LRU protected item is demoted to probation.
  • When a protected item is accessed, it is moved to the protected MRU position.
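
The admission decision in the eviction step above can be sketched roughly as follows. This is illustrative code only, not the library's implementation: the class, field, and method names are hypothetical, and a plain dictionary stands in for the compact popularity sketch.

```csharp
using System.Collections.Generic;

// Illustrative sketch of the TinyLfu admission step described above.
// The real ConcurrentLfu applies buffered updates under a lock and uses
// a compact count-min style sketch rather than a dictionary.
class TinyLfuAdmissionSketch<K> where K : notnull
{
    private readonly Dictionary<K, int> frequency = new(); // stand-in for the popularity sketch
    private readonly LinkedList<K> probation = new();      // probation LRU: First = LRU, Last = MRU
    private readonly int mainCapacity;

    public TinyLfuAdmissionSketch(int mainCapacity) => this.mainCapacity = mainCapacity;

    // Every read and write bumps the estimated frequency of the key.
    public void RecordAccess(K key) =>
        frequency[key] = frequency.TryGetValue(key, out var f) ? f + 1 : 1;

    // Called when a candidate overflows the window: it competes with the
    // probation LRU victim, and the less frequent of the two is evicted.
    public void Admit(K candidate)
    {
        if (probation.Count < mainCapacity)
        {
            probation.AddLast(candidate);
            return;
        }

        K victim = probation.First!.Value;
        if (Estimate(candidate) > Estimate(victim))
        {
            probation.RemoveFirst();      // evict the cold probation victim
            probation.AddLast(candidate); // admit the more popular candidate
        }
        // else: the candidate is discarded and the victim stays resident.
    }

    private int Estimate(K key) => frequency.TryGetValue(key, out var f) ? f : 0;
}
```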

The sizes of the admission window and main space are adapted over time to iteratively improve the hit rate using a hill climbing algorithm. A larger window favors workloads with high recency bias, whereas a larger main space favors workloads with frequency bias.
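
The hill climbing idea can be illustrated with a rough sketch. The class name, step size, and sampling interval below are hypothetical; the real adaptation logic differs in detail.

```csharp
// Illustrative hill-climbing sketch (hypothetical names and step size).
// After each sample period, keep moving the window size in the same
// direction while the hit rate improves; reverse direction when it drops.
class WindowSizeClimber
{
    private double previousHitRate;
    private int stepSize = 8;   // hypothetical step, in cache entries

    // Returns the adjustment to apply to the window size; the main space
    // shrinks or grows by the same amount in the opposite direction.
    public int Adapt(double currentHitRate)
    {
        if (currentHitRate < previousHitRate)
        {
            stepSize = -stepSize;   // hit rate got worse: reverse direction
        }

        previousHitRate = currentHitRate;
        return stepSize;
    }
}
```

Repeatedly stepping the window/main boundary in whichever direction last improved the sampled hit rate lets the cache oscillate around and settle near a size split suited to the workload's recency or frequency bias.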
