2024 → Present · Open-source on PyPI · 3 min read

FlipCache

A lightweight Python cache that keeps hot keys in-process, syncs them to Redis, and ships FIFO/LRU utilities for both async and threaded workloads.

Ships thread-safe and asyncio-safe eviction layers alongside the hybrid Redis cache.

Project Metrics

  • Read acceleration – ~1300× vs raw Redis GET
  • Release – v1.3 (PyPI)
  • Stack – Python, redis-py, asyncio


Why I Built It

  • While integrating data-heavy Python services, I wanted a cache that felt as quick as an in-memory dict but still survived process restarts.
  • Redis alone was reliable but latency spiked for frequently accessed, short-lived data; pure in-memory caches were fast but volatile.
  • FlipCache gives me the hybrid behaviour I wanted: hot data stays in the process, everything else lives in Redis with namespaced keys and TTLs.

What It Offers

  • Two-tier storage – an OrderedDict front cache guarded by local_max, with Redis acting as the durable backing store.
  • Type-aware serialization – built-in support for str, int, and JSON payloads, plus plug-in encoders/decoders when value_type="custom".
  • TTL controls – configure expirations per cache instance and optionally refresh them on reads via refresh_expire_time_on_get.
  • Graceful defaults – missing keys can automatically materialise from value_default, which also hydrates Redis so future reads stay warm.
  • Standalone eviction helpers – reusable FIFO and LRU dictionaries (sync, thread-safe, and asyncio-safe variants) for use outside the main cache.
  • Shipping today – published on PyPI (pip install flipcache) with zero external runtime dependencies beyond redis.
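Since sets, custom classes, and other non-JSON values need value_type="custom", a codec has to turn them into bytes before the Redis SET and back after the GET. The hook names FlipCache uses for plugging in encoders/decoders aren't shown here; this sketch only illustrates the round trip itself, using pickle as the binary codec:

```python
import pickle

# Hypothetical encoder/decoder pair for value_type="custom". The exact
# FlipCache hook names are not shown in this write-up -- this only
# demonstrates the round trip: encode before SET, decode after GET.
def encode(value) -> bytes:
    return pickle.dumps(value)

def decode(raw: bytes):
    return pickle.loads(raw)

original = {"hits": 3, "users": {"a", "b"}}  # sets aren't JSON-serializable
assert decode(encode(original)) == original
```

Because pickle produces raw bytes, the Redis client must not be configured with decode_responses=True for this path, which is exactly the restriction the v1.1 release relaxed.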

Architecture

(Diagram: FlipCache architecture)

  1. Callers interact with the FlipCache mapping interface (__getitem__, __setitem__, __contains__, etc.).
  2. Reads hit the local OrderedDict. Misses trigger a Redis GET, decode the payload if needed, and repopulate the local tier without exceeding local_max.
  3. Writes accept either str/int values or any custom type that can be encoded before issuing a Redis SET with the configured TTL.
  4. Optional refresh_expire_time_on_get keeps frequently accessed keys alive by extending their Redis expiry during reads.
  5. Iterator and length helpers (__iter__, __len__) scan Redis keys with the cache prefix so the local tier can stay slim.
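Steps 2 and 5 above can be sketched in plain Python. This is a simplified model, not FlipCache's actual code: a plain dict stands in for the Redis client, and local_max mirrors the parameter described earlier.

```python
from collections import OrderedDict

# Stand-in for a Redis connection (the real code issues GET/SET/EXPIRE).
remote = {"user-42": '{"hits": 1}'}

class TwoTierReader:
    def __init__(self, local_max=2):
        self.local = OrderedDict()   # hot tier, capped at local_max entries
        self.local_max = local_max

    def get(self, key):
        if key in self.local:        # hot path: no network round-trip
            self.local.move_to_end(key)
            return self.local[key]
        value = remote.get(key)      # miss: fall back to the backing store
        if value is not None:
            self.local[key] = value  # repopulate the local tier...
            if len(self.local) > self.local_max:
                self.local.popitem(last=False)  # ...without exceeding the cap
        return value

reader = TwoTierReader()
print(reader.get("user-42"))  # first read comes from "Redis", later reads are local
```

A refresh-on-read variant would additionally issue a Redis EXPIRE alongside the local lookup, which is what refresh_expire_time_on_get does.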

Implementation Highlights

  • Core cache logic lives in flipcache/flipcache.py; it asserts key/value types up front, injects a Redis protocol if one isn’t supplied, and exposes a refresh helper to keep items hot without touching the value.
  • Expiration presets (THREE_DAYS, FIVE_MINUTES, etc.) are bundled in flipcache/et.py, making it easy to keep TTL declarations consistent across services.
  • flipcache/fifo_dict.py and flipcache/lru_dict.py implement capped dictionaries that evict predictably; both now ship with thread-safe (RLock) and asyncio-safe (asyncio.Lock) variants.
  • Example scripts under examples/ cover JSON storage, expiring caches, benchmark runs, and custom codecs—handy templates when wiring the package into real projects.
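The thread-safe eviction helpers can be approximated with an OrderedDict guarded by an RLock. A minimal sketch of the technique, not the package's implementation:

```python
import threading
from collections import OrderedDict

class ThreadSafeLRU:
    """Capped mapping that evicts the least-recently-used key under a lock."""

    def __init__(self, maxsize: int):
        self._data = OrderedDict()
        self._lock = threading.RLock()
        self._maxsize = maxsize

    def __setitem__(self, key, value):
        with self._lock:
            self._data[key] = value
            self._data.move_to_end(key)          # newest use goes to the back
            if len(self._data) > self._maxsize:
                self._data.popitem(last=False)   # evict the LRU entry

    def __getitem__(self, key):
        with self._lock:
            self._data.move_to_end(key)          # reading counts as use
            return self._data[key]

cache = ThreadSafeLRU(maxsize=2)
cache["a"] = 1
cache["b"] = 2
_ = cache["a"]   # touch "a", so "b" is now least recently used
cache["c"] = 3   # evicts "b"
print(list(cache._data))  # ['a', 'c']
```

The asyncio-safe variant follows the same shape with asyncio.Lock and async methods, since asyncio.Lock cannot be held inside synchronous dunder methods.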

Release Notes at a Glance

  • Dec 2024 – v1.1 (cb62678): relaxed the decode_responses=True requirement whenever value_type="custom", unlocking binary codecs such as pickle.
  • May 2025 – (09b9d72): introduced standalone FIFO/LRU containers and documented their usage.
  • May 2025 – (5abf49b): added thread-safe and asyncio-safe flavours of both eviction helpers so multi-threaded workers and async services can use them safely.

Example Usage

```python
from redis import Redis
from flipcache import FlipCache, et

cache = FlipCache(
    "analytics",
    local_max=256,
    expire_time=et.FIFTEEN_MINUTES,
    value_type="json",
    value_default={"hits": 0, "users": []},
    refresh_expire_time_on_get=True,
    redis_protocol=Redis(decode_responses=True),
)

def record(user_id: str) -> None:
    payload = cache[user_id]  # default materialises if key is missing
    payload["hits"] += 1
    cache[user_id] = payload

record("user-42")
print(cache["user-42"])  # {'hits': 1, 'users': []}
```

Need different eviction semantics? Swap in FIFODict, LRUDict, or their async/thread-safe counterparts for other caching scenarios without Redis.
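The practical difference between the two eviction policies is what an access does. This sketch uses a minimal stand-in, not the package's FIFODict, to show that FIFO evicts by insertion order no matter how often a key is read:

```python
from collections import OrderedDict

# Minimal FIFO stand-in to contrast eviction order with LRU.
def fifo_set(d: OrderedDict, key, value, maxsize=2):
    d[key] = value              # FIFO never reorders on access
    if len(d) > maxsize:
        d.popitem(last=False)   # evict the oldest insertion

fifo = OrderedDict()
fifo_set(fifo, "a", 1)
fifo_set(fifo, "b", 2)
_ = fifo["a"]                   # reading "a" does not protect it...
fifo_set(fifo, "c", 3)          # ...so "a" is evicted anyway
print(list(fifo))  # ['b', 'c']
```

An LRU cache in the same scenario would evict "b" instead, because "a" was just read. That distinction is why the package ships both flavours.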

Benchmarks

The bundled benchmark script (examples/benchmark.py) compares pure Redis access with FlipCache’s hybrid mode (1 000 keys, local tier capped at the same size):

| Scenario | Mean (s) | Std Dev |
| --- | --- | --- |
| redis_set | 0.252 | 0.013 |
| flipcache_set | 0.242 | 0.003 |
| redis_get | 22.986 | 0.518 |
| flipcache_get | 0.0172 | 0.000 |

In practice, the local layer makes read-heavy workloads roughly three orders of magnitude faster while keeping data durable in Redis.
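The headline figure follows directly from the benchmark means; a quick sanity check of the ~1300× claim:

```python
# Mean GET times (seconds) from the benchmark run above.
redis_get_mean = 22.986
flipcache_get_mean = 0.0172

speedup = redis_get_mean / flipcache_get_mean
print(f"{speedup:.0f}x")  # roughly 1336x, i.e. the ~1300x read acceleration
```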

Current Status & Next Steps

  • The public PyPI release is tagged as 1.3; updating CHANGELOG.md beyond 1.1 will help users track the newer thread/async-safe additions.
  • Automated tests are still a blank slate (tests/__init__.py), so the next milestone is adding coverage around TTL refresh, custom codecs, and the eviction helpers.
  • I’m exploring async-native cache variants and broader Redis client support (e.g., redis.asyncio, aioredis) so FlipCache can slot into modern event loops without adapters.
