# FlipCache

## Why I Built It
- While integrating data-heavy Python services, I wanted a cache that felt as quick as an in-memory dict but still survived process restarts.
- Redis alone was reliable but latency spiked for frequently accessed, short-lived data; pure in-memory caches were fast but volatile.
- FlipCache gives me the hybrid behaviour I wanted: hot data stays in the process, everything else lives in Redis with namespaced keys and TTLs.
## What It Offers

- Two-tier storage – an `OrderedDict` front cache guarded by `local_max`, with Redis acting as the durable backing store.
- Type-aware serialization – built-in support for `str`, `int`, and JSON payloads, plus plug-in encoders/decoders when `value_type="custom"`.
- TTL controls – configure expirations per cache instance and optionally refresh them on reads via `refresh_expire_time_on_get`.
- Graceful defaults – missing keys can automatically materialise from `value_default`, which also hydrates Redis so future reads stay warm.
- Standalone eviction helpers – reusable FIFO and LRU dictionaries (sync, thread-safe, and asyncio-safe variants) for use outside the main cache.
- Shipping today – published on PyPI (`pip install flipcache`) with zero external runtime dependencies beyond `redis`.
## Architecture

- Callers interact with the `FlipCache` mapping interface (`__getitem__`, `__setitem__`, `__contains__`, etc.).
- Reads hit the local `OrderedDict` first. Misses trigger a Redis `GET`, decode the payload if needed, and repopulate the local tier without exceeding `local_max`.
- Writes accept either `str`/`int` values or any custom type that can be encoded before issuing a Redis `SET` with the configured TTL.
- Optional `refresh_expire_time_on_get` keeps frequently accessed keys alive by extending their Redis expiry during reads.
- Iterator and length helpers (`__iter__`, `__len__`) scan Redis keys with the cache prefix so the local tier can stay slim.
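The read path above can be sketched as follows. This is a minimal illustration, not FlipCache's actual internals: the class name, attribute names (`_local`, `_redis`, `_prefix`), and the key format are assumptions for demonstration only.

```python
from collections import OrderedDict


class TwoTierRead:
    """Illustrative two-tier read path: local OrderedDict in front of Redis."""

    def __init__(self, redis_client, prefix, local_max=256):
        self._redis = redis_client
        self._prefix = prefix
        self._local_max = local_max
        self._local = OrderedDict()  # hot tier, capped at local_max entries

    def __getitem__(self, key):
        if key in self._local:
            self._local.move_to_end(key)  # keep recently read keys at the back
            return self._local[key]
        # Miss: fall back to Redis using the namespaced key.
        raw = self._redis.get(f"{self._prefix}:{key}")
        if raw is None:
            raise KeyError(key)
        # Repopulate the local tier without exceeding local_max.
        while len(self._local) >= self._local_max:
            self._local.popitem(last=False)  # evict the oldest entry
        self._local[key] = raw
        return raw
```

The sketch shows why repeated reads of the same key never touch the network: after the first miss, the value is served from the process-local dict.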
## Implementation Highlights

- Core cache logic lives in `flipcache/flipcache.py`; it asserts key/value types up front, injects a Redis protocol if one isn't supplied, and exposes a `refresh` helper to keep items hot without touching the value.
- Expiration presets (`THREE_DAYS`, `FIVE_MINUTES`, etc.) are bundled in `flipcache/et.py`, making it easy to keep TTL declarations consistent across services.
- `flipcache/fifo_dict.py` and `flipcache/lru_dict.py` implement capped dictionaries that evict predictably; both now ship with thread-safe (`RLock`) and asyncio-safe (`asyncio.Lock`) variants.
- Example scripts under `examples/` cover JSON storage, expiring caches, benchmark runs, and custom codecs—handy templates when wiring the package into real projects.
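A capped, thread-safe FIFO mapping in the spirit of `flipcache/fifo_dict.py` can be sketched like this. The class name and constructor below are assumptions for illustration, not the package's actual interface:

```python
import threading
from collections import OrderedDict


class ThreadSafeFIFODict:
    """Illustrative capped dict with FIFO eviction, guarded by an RLock."""

    def __init__(self, max_len):
        self._max_len = max_len
        self._data = OrderedDict()
        self._lock = threading.RLock()  # re-entrant, as the highlights mention

    def __setitem__(self, key, value):
        with self._lock:
            if key in self._data:
                del self._data[key]  # re-insert so updates rejoin the queue
            elif len(self._data) >= self._max_len:
                self._data.popitem(last=False)  # evict the oldest entry
            self._data[key] = value

    def __getitem__(self, key):
        with self._lock:
            return self._data[key]

    def __len__(self):
        with self._lock:
            return len(self._data)
```

An LRU variant differs only in the read path, where a hit moves the key to the back of the `OrderedDict` before returning; an asyncio-safe flavour would swap the `RLock` for an `asyncio.Lock` and expose `async` methods.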
## Release Notes at a Glance

- Dec 2024 – v1.1 (`cb62678`): relaxed the `decode_responses=True` requirement whenever `value_type="custom"`, unlocking binary codecs such as `pickle`.
- May 2025 – (`09b9d72`): introduced standalone FIFO/LRU containers and documented their usage.
- May 2025 – (`5abf49b`): added thread-safe and asyncio-safe flavours of both eviction helpers so multi-threaded workers and async services can use them safely.
## Example Usage

```python
from redis import Redis

from flipcache import FlipCache, et

cache = FlipCache(
    "analytics",
    local_max=256,
    expire_time=et.FIFTEEN_MINUTES,
    value_type="json",
    value_default={"hits": 0, "users": []},
    refresh_expire_time_on_get=True,
    redis_protocol=Redis(decode_responses=True),
)


def record(user_id: str) -> None:
    payload = cache[user_id]  # default materialises if key is missing
    payload["hits"] += 1
    cache[user_id] = payload


record("user-42")
print(cache["user-42"])  # {'hits': 1, 'users': []}
```
Need different eviction semantics? Swap in `FIFODict`, `LRUDict`, or their async/thread-safe counterparts for other caching scenarios without Redis.
## Benchmarks

The bundled benchmark script (`examples/benchmark.py`) compares pure Redis access with FlipCache's hybrid mode (1,000 keys, local tier capped at the same size):

| Scenario | Mean (s) | Std Dev |
|---|---|---|
| `redis_set` | 0.252 | 0.013 |
| `flipcache_set` | 0.242 | 0.003 |
| `redis_get` | 22.986 | 0.518 |
| `flipcache_get` | 0.0172 | 0.000 |
In practice, the local layer makes read-heavy workloads roughly three orders of magnitude faster while keeping data durable in Redis.
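The bundled script's exact harness isn't reproduced here, but mean/std-dev figures like those in the table can be collected with just the standard library. The `bench` helper and its `repeats` parameter below are illustrative names, not the script's actual API:

```python
import statistics
import timeit


def bench(fn, repeats=5):
    """Run fn several times and report (mean, std dev) of wall-clock seconds."""
    samples = timeit.repeat(fn, number=1, repeat=repeats)
    return statistics.mean(samples), statistics.stdev(samples)
```

Each scenario in the table would then be a callable (e.g. a loop of 1,000 `GET`s against Redis or against the hybrid cache) passed to `bench`.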
## Current Status & Next Steps

- The public PyPI release is tagged as `1.3`; updating `CHANGELOG.md` beyond `1.1` will help users track the newer thread/async-safe additions.
- Automated tests are still a blank slate (`tests/__init__.py`), so the next milestone is adding coverage around TTL refresh, custom codecs, and the eviction helpers.
- I'm exploring async-native cache variants and broader Redis client support (e.g., `redis.asyncio`, `aioredis`) so FlipCache can slot into modern event loops without adapters.

