CacheBolt is a new open source project gaining traction within the developer community for its unique approach: a reverse proxy written in Rust that provides intelligent caching both in RAM and in persistent cloud or local storage, without the need for Redis or additional databases. This solution is especially attractive for APIs, microservices, and web workloads seeking to accelerate content delivery, gain resilience, and reduce operational complexity.

What does CacheBolt offer?

  • In-memory RAM caching: Cacheable HTTP responses are stored in memory with an LRU (Least Recently Used) eviction policy, which avoids overload by automatically freeing space as the memory limit is approached.
  • Persistent object storage caching: In addition to RAM, data can be stored in cold storage such as Amazon S3, Google Cloud Storage, Azure Blob Storage, or local disk. This ensures cache survives restarts and can be shared between instances, making it easy to scale in Kubernetes or other cloud environments.
  • Recovery after failures: If the proxy restarts, it can rebuild the RAM cache from persistent storage.
  • No need for Redis or complex middleware: CacheBolt works as a transparent layer in front of your backend, with no changes required to your application.
  • TTL policies and latency tolerance: Configurable to serve from cache if the backend is slow or unavailable, improving user experience and resilience during incidents.
  • Intelligent LRU eviction: Unlike Redis, CacheBolt proactively frees up RAM space before reaching Out Of Memory (OOM).
  • Prometheus metrics: The entire system is fully observable in real time.
  • YAML-based configuration: Easy to adapt and deploy in any environment.
  • Open source (Apache 2.0): Free to use, modify, and contribute to.

How does it work?

CacheBolt acts as a reverse proxy between the client and the backend. Upon receiving a request:

  1. Checks in-memory cache: If the content is there, it serves it instantly.
  2. If not in RAM, checks persistent storage: If found, it loads into RAM and serves the content.
  3. If not found in either cache, forwards to the backend: Stores the response in both caches if it is valid.
  4. If the backend is too slow (according to the configured rules), it can serve from cache even if the data is slightly stale (see the configuration sketch below).

This approach combines the speed of RAM with the durability and resilience of persistent storage.
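
Which requests may fall back to cached data in step 4 is controlled by the latency_failover block of the configuration file. A minimal sketch, reusing the keys from the full example in the next section:

# Serve from cache once the backend exceeds this latency budget
latency_failover:
  default_max_latency_ms: 300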

Use cases

  • REST and GraphQL APIs with high concurrency or highly variable traffic.
  • File or image delivery with storage in S3, GCS, or Azure.
  • Microservices in Kubernetes: Multiple replicas can share the persistent cache (see the Deployment sketch after this list).
  • Fallback during outages: Keeps serving responses even if the backend or infrastructure fails.
  • Accelerating apps without changing source code.
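
For the Kubernetes use case, a minimal Deployment sketch is shown below. The image name, ConfigMap name, and mount paths are assumptions based on the Docker example later in this article; replicas share the persistent cache simply because every pod's config.yaml points at the same bucket (for example, the s3_bucket shown in the configuration section).

apiVersion: apps/v1
kind: Deployment
metadata:
  name: cachebolt
spec:
  replicas: 3                  # all replicas read and write the same cache bucket
  selector:
    matchLabels:
      app: cachebolt
  template:
    metadata:
      labels:
        app: cachebolt
    spec:
      containers:
        - name: cachebolt
          image: ghcr.io/<your-org>/cachebolt:latest   # placeholder image, as in the Docker example
          args: ["--config", "/config/config.yaml"]
          ports:
            - containerPort: 3000                      # proxy port used throughout this article
          volumeMounts:
            - name: config
              mountPath: /config
      volumes:
        - name: config
          configMap:
            name: cachebolt-config                     # hypothetical ConfigMap holding config.yaml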

Advanced monitoring

CacheBolt exposes Prometheus metrics (a sample scrape configuration is sketched after this list) to monitor:

  • Cache hits and misses
  • Latencies
  • RAM and persistent storage usage
  • Backend failures, concurrency rejections, and much more
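
A minimal Prometheus scrape job for these metrics might look like the sketch below. That CacheBolt serves its metrics on the proxy port under the conventional /metrics path is an assumption; adjust the target and metrics_path to match your deployment.

# prometheus.yml (fragment)
scrape_configs:
  - job_name: cachebolt
    metrics_path: /metrics              # assumed default path
    static_configs:
      - targets: ["localhost:3000"]     # the proxy address used in this article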

Example YAML configuration

app_id: my-service
max_concurrent_requests: 200
downstream_base_url: http://localhost:4000
downstream_timeout_secs: 5

storage_backend: s3  # or: gcs, azure, local
s3_bucket: my-cache-bucket
memory_eviction:
  threshold_percent: 90

latency_failover:
  default_max_latency_ms: 300
  path_rules:
    - pattern: "^/api/v1/products/.*"
      max_latency_ms: 150

Running and deploying

  • Binary: ./cachebolt --config ./config.yaml
  • Docker (a Compose equivalent is sketched after this list):

docker run --rm -p 3000:3000 \
  -v $(pwd)/config:/config \
  -v $(pwd)/cache:/data \
  -e GOOGLE_APPLICATION_CREDENTIALS=/config/adc.json \
  ghcr.io/<your-org>/cachebolt:latest \
  --config /config/config.yaml

  • Easy integration with AWS, Azure, GCP, or local storage.
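
For local development or CI, the same invocation can be written as a Compose service. This is a sketch derived from the docker run command above; the GOOGLE_APPLICATION_CREDENTIALS variable is only needed when the persistent backend is GCS.

# docker-compose.yml (sketch)
services:
  cachebolt:
    image: ghcr.io/<your-org>/cachebolt:latest
    command: ["--config", "/config/config.yaml"]
    ports:
      - "3000:3000"
    volumes:
      - ./config:/config
      - ./cache:/data
    environment:
      GOOGLE_APPLICATION_CREDENTIALS: /config/adc.json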

Global cache invalidation

Need to invalidate the entire cache (RAM and persistent storage)? Just run:

curl -X DELETE "http://localhost:3000/cache?backend=true"

In summary:
CacheBolt is a modern, simple, and high-performance option for adding persistent, fault-tolerant caching to your APIs or web apps. Instead of relying exclusively on Redis or complex middleware, just deploy the proxy, configure it, and let it handle the rest.
The project is available on GitHub under the Apache 2.0 license:
👉 msalinas92/CacheBolt

Ideal for developers and DevOps teams seeking performance and simplicity in the caching layer.
