What are the use cases for key-value caching?

The essence of a key-value cache is reducing data access latency. It turns the O(log n) reads/writes and complex queries of a database, which is expensive and slow, into O(1) reads/writes on a medium that is fast but also costly, typically memory. There are many strategies for cache design; common ones are read-through, write-through (or write-back), and cache-aside.

The typical read/write ratio of internet services ranges from 100:1 to 1000:1, so we usually optimize for the read path.

In distributed systems, these patterns represent trade-offs between consistency, availability, and partition tolerance, and the specific choice should be based on your business needs.

General Strategies

  • Read
    • Read-through: A cache layer sits between clients and the database, so clients never access the database directly. On a cache miss, the cache loads the data from the database, stores it, and returns it; on a hit, it returns the cached data directly (see the sketch after this list).
  • Write
    • Write-through: Clients write data to the cache, and the cache synchronously updates the database. The write is acknowledged only after the database has been updated.
    • Write-behind/Write-back: Clients write data to the cache and receive an acknowledgment immediately; the cache then asynchronously flushes to the database. This is generally the fastest option, at the cost of possible data loss if the cache fails before flushing.
    • Write-around: Clients write directly to the database, bypassing the cache; the cache is populated only on subsequent reads.
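
Here is a minimal sketch of these strategies in Python, assuming dict-backed stand-ins for the cache and database and a hypothetical flush_queue drained by a background worker; real systems would use a store such as Redis or Memcached.

```python
import queue

db = {}                      # stand-in for the slow, durable store
cache = {}                   # stand-in for the fast in-memory cache
flush_queue = queue.Queue()  # pending write-behind flushes (hypothetical)

def read_through(key):
    """On a miss, load from the database and populate the cache."""
    if key not in cache:
        cache[key] = db[key]
    return cache[key]

def write_through(key, value):
    """Write to the cache, then synchronously to the database."""
    cache[key] = value
    db[key] = value  # acknowledged only after this succeeds

def write_behind(key, value):
    """Write to the cache and acknowledge; flush to the database later."""
    cache[key] = value
    flush_queue.put((key, value))  # a background worker drains this queue

def write_around(key, value):
    """Write straight to the database; the cache fills on a later read."""
    db[key] = value
    cache.pop(key, None)  # drop any stale cached copy
```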

Cache Aside Pattern

Use the Cache Aside pattern when the cache does not support read-through and write-through/write-behind.

Reading data? On a cache hit, read from the cache; on a miss, read from the database and populate the cache. Modifying data? Update the database first, then delete the cache entry, as sketched below.
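
A minimal cache-aside sketch in Python, again with dict-backed stand-ins rather than a specific cache library:

```python
db = {}     # stand-in for the database
cache = {}  # stand-in for the cache

def get(key):
    """Cache-aside read: serve hits from the cache, fill on a miss."""
    if key in cache:
        return cache[key]
    value = db[key]      # miss: fall back to the database
    cache[key] = value   # populate the cache for subsequent reads
    return value

def update(key, value):
    """Cache-aside write: update the database first, then invalidate."""
    db[key] = value
    cache.pop(key, None)  # delete (do not update) the cached entry
```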

Why not update the cache after writing to the database? The main concern is that two concurrent database writes can complete in one order while their cache updates arrive in the opposite order: write A and then write B reach the database, but B's cache update lands before A's, leaving the cache stuck with A's stale value.

Does using cache-aside eliminate concurrency issues? No, there is still a low probability of dirty data. A read can miss the cache and fetch an old value from the database; a concurrent write then updates the database and deletes the cache entry; finally, the read populates the cache with the stale value it fetched earlier.
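
That interleaving can be made concrete with a deterministic sketch (contrived: in reality the reader and writer run in separate threads or processes):

```python
db = {"user:1": "v1"}
cache = {}

# Reader: cache miss, fetches the old value from the database...
miss_value = db["user:1"]        # reads "v1"

# ...meanwhile a writer updates the database and invalidates the cache:
db["user:1"] = "v2"
cache.pop("user:1", None)

# Reader finally populates the cache with the value it read earlier:
cache["user:1"] = miss_value     # cache now holds stale "v1"

assert cache["user:1"] != db["user:1"]  # dirty data
```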

Where to Place the Cache?

  • Client side
  • Distinct layer
  • Server side

What to Do If the Cache Size Is Insufficient? Cache Eviction Strategies

  • LRU - Least Recently Used: Tracks recency of access, retaining recently used items and evicting the ones that have gone longest without use (see the sketch after this list).
  • LFU - Least Frequently Used: Tracks usage frequency, retaining the most frequently used items and evicting the least frequently used ones.
  • ARC - Adaptive Replacement Cache: Often performs better than LRU by balancing recently used and frequently used items, while also keeping a history of recently evicted entries.

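A minimal LRU sketch in Python using the standard library's OrderedDict (illustrative, not production code):

```python
from collections import OrderedDict

class LRUCache:
    """Evicts the least recently used entry once capacity is exceeded."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.items = OrderedDict()  # ordered from least to most recent

    def get(self, key):
        if key not in self.items:
            return None
        self.items.move_to_end(key)  # mark as most recently used
        return self.items[key]

    def put(self, key, value):
        if key in self.items:
            self.items.move_to_end(key)
        self.items[key] = value
        if len(self.items) > self.capacity:
            self.items.popitem(last=False)  # evict least recently used

# With capacity 2, inserting a third key evicts the oldest untouched one.
lru = LRUCache(2)
lru.put("a", 1)
lru.put("b", 2)
lru.get("a")      # touch "a", so "b" is now least recently used
lru.put("c", 3)   # evicts "b"
assert lru.get("b") is None and lru.get("a") == 1
```
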
Which Cache Solution Is the Best?

It depends on the workload. Facebook TAO, Facebook's read-optimized, cache-backed store for the social graph, is one well-known production example.