
⚡ BitFaster.Caching

High performance, thread-safe in-memory caching primitives for .NET.

Documentation

Please refer to the wiki for full API documentation and a complete analysis of hit rate, latency, and throughput.

Getting started

BitFaster.Caching is installed from NuGet:

dotnet add package BitFaster.Caching

ConcurrentLru

ConcurrentLru is a lightweight, drop-in replacement for ConcurrentDictionary, but with bounded size enforced by the TU-Q eviction policy (derived from 2Q). There are no background threads and no global locks; concurrent throughput is high, lookups are fast, and hit rate outperforms a pure LRU in all tested scenarios.

Choose a capacity and use just like ConcurrentDictionary, but with bounded size:

int capacity = 128;
var lru = new ConcurrentLru<string, SomeItem>(capacity);

var value = lru.GetOrAdd("key", (key) => new SomeItem(key));
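Beyond GetOrAdd, the cache exposes ConcurrentDictionary-style lookup and removal. The sketch below assumes the TryGet and TryRemove members described in the wiki, with a deliberately tiny capacity so eviction is easy to observe; SomeItem is a stand-in for your own type.

```csharp
using System;
using BitFaster.Caching.Lru;

// Capacity of 2 (illustrative only) so the bounded size is visible.
var lru = new ConcurrentLru<string, string>(2);

lru.GetOrAdd("a", key => key + "-value");
lru.GetOrAdd("b", key => key + "-value");

// TryGet returns false when a key is absent or has been evicted.
if (lru.TryGet("a", out var value))
{
    Console.WriteLine(value);
}

// TryRemove discards an entry eagerly, without waiting for eviction.
lru.TryRemove("a");
```

Note that TU-Q does not evict in strict least-recently-used order, so don't assume a particular victim when the cache is full; treat every lookup as potentially missing.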

ConcurrentLfu

ConcurrentLfu is a drop-in replacement for ConcurrentDictionary, but with bounded size enforced by the W-TinyLFU eviction policy. ConcurrentLfu has near-optimal hit rate and high scalability. Reads and writes are buffered, then replayed asynchronously to mitigate lock contention.

Choose a capacity and use just like ConcurrentDictionary, but with bounded size:

int capacity = 128;
var lfu = new ConcurrentLfu<string, SomeItem>(capacity);

var value = lfu.GetOrAdd("key", (key) => new SomeItem(key));
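Because reads and writes are buffered and replayed, the scheduler that drains those buffers is configurable. The sketch below assumes the longer ConcurrentLfu constructor and the scheduler types in the BitFaster.Caching.Scheduler namespace; check the wiki for the exact signatures in your version.

```csharp
using System;
using BitFaster.Caching.Lfu;
using BitFaster.Caching.Scheduler;

// ForegroundScheduler replays the buffered operations synchronously on the
// calling thread, trading throughput for deterministic behavior (useful in
// tests). The default is a dedicated background thread.
var lfu = new ConcurrentLfu<string, SomeItem>(
    concurrencyLevel: Environment.ProcessorCount,
    capacity: 128,
    scheduler: new ForegroundScheduler(),
    comparer: StringComparer.Ordinal);

var value = lfu.GetOrAdd("key", key => new SomeItem(key));
```

For most workloads the simple single-argument constructor shown above is the right choice; reach for an explicit scheduler only when you need to control when maintenance runs.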