npm i @unkey/cache

Batteries-included cache SDK for serverless applications.

Written by Andreas Thomas

We are excited to introduce our latest library, @unkey/cache, designed to make caching in serverless applications easy and enjoyable.

The challenges of caching in Cloudflare Workers

Our journey with caching on Cloudflare Workers highlighted several challenges. The most significant was the lack of persistent memory, which meant each request could start with a cold cache. Additionally, Cloudflare KV, while pleasant to use, proved too slow for our needs: the p99 latency was 560ms (source).

To mitigate these issues, we implemented a tiered caching strategy. By using an in-memory store as the first tier and Cloudflare's CDN cache as the fallback, we got the best of both worlds: low latency and a decent hit rate.

Cache hit ratio

The ~27% memory hit rate might not be the most impressive, but it's free and adds no latency. Unfortunately, there's little we can do to increase it, as Cloudflare may evict a worker instance at any moment. However, as traffic grows, the hit rate will increase too.

If the memory cache misses, the Cloudflare cache is checked next, which adds some latency but is still faster than any other alternative we found. The lookup falls through the tiers roughly as in the sketch below.

Cache latency
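
To make the tier fallthrough concrete, here is a minimal sketch of that lookup order, written against a hypothetical `Store` interface rather than our actual worker code: each tier is checked in turn, and a hit in a slower tier backfills the faster tiers above it so the next request can be served from memory.

// Hypothetical minimal store interface, for illustration only.
interface Store<T> {
  get(key: string): Promise<T | undefined>;
  set(key: string, value: T): Promise<void>;
}

async function tieredGet<T>(stores: Store<T>[], key: string): Promise<T | undefined> {
  for (let i = 0; i < stores.length; i++) {
    const value = await stores[i].get(key);
    if (value !== undefined) {
      // Backfill the faster tiers we already missed on, so the next
      // request can be answered from memory without hitting the CDN cache.
      await Promise.all(stores.slice(0, i).map((tier) => tier.set(key, value)));
      return value;
    }
  }
  return undefined; // miss in every tier; the caller loads from origin
}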

This performed well, but the developer experience left something to be desired.

The problem with existing solutions

Caching is a common requirement in many applications, but traditional approaches often fall short. Here's a typical example of what developers have to deal with:

const cache = new Some3rdPartyCache(...);

type User = { email: string };

let user = await cache.get("chronark") as User | undefined | null;
if (!user) {
  user = await db.query.users.findFirst({
    where: (table, { eq }) => eq(table.id, "chronark"),
  });
  await cache.set("chronark", user, Date.now() + 60_000);
}

// use user

@unkey/cache abstracts all the boilerplate away and gives you a clean API that is fully type-safe:

const user = await cache.user.swr("chronark", async (id) => {
  return await db.query.users.findFirst({
    where: (table, { eq }) => eq(table.id, id),
  });
});

Key features

The "u" in "unkey" stands for "batteries included"! (English may not be my first language)

  • E2E Typesafe: Fully type-safe, clean and intuitive API with intellisense autocomplete.
  • Tiered Cache: Chain multiple caches together for fast and reliable caching.
  • Stale-While-Revalidate: Most third-party caches let you set a time-to-live, but until now you had to handle SWR yourself. Just configure fresh and stale times and let the cache handle the rest (see the sketch after this list).
  • Metrics Collection: Middleware for gathering metrics to monitor and debug your cache usage.
  • Encryption: Middleware for automatic encryption of cache values, protecting your data at rest.
  • Composable Design: Mix and match primitives to build exactly what you need.
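
To make the fresh and stale windows concrete, here is a rough sketch of the decision an SWR cache makes per entry; the `Entry` type and `decide` function are illustrative names of ours, not the library's internals. A fresh entry is served as-is, a stale-but-not-expired entry is served immediately while being revalidated in the background, and a missing or expired entry forces a synchronous reload.

// Hypothetical entry shape: the cached value plus the time it was written.
type Entry<T> = { value: T; writtenAt: number };

function decide<T>(
  entry: Entry<T> | undefined,
  fresh: number, // ms during which the value is served without revalidation
  stale: number, // ms after which the value may no longer be served at all
  now: number = Date.now(),
): { value: T | undefined; revalidate: boolean } {
  if (!entry || now > entry.writtenAt + stale) {
    // Missing or expired: the caller must load the value synchronously.
    return { value: undefined, revalidate: true };
  }
  if (now > entry.writtenAt + fresh) {
    // Stale but not expired: serve the cached value immediately and
    // refresh it in the background (deferred work, e.g. via `ctx`).
    return { value: entry.value, revalidate: true };
  }
  // Fresh: serve from cache, no origin call needed.
  return { value: entry.value, revalidate: false };
}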

Getting Started

Install @unkey/cache:

npm install @unkey/cache

Basic cache

import { createCache, DefaultStatefulContext, Namespace } from "@unkey/cache";
import { MemoryStore } from "@unkey/cache/stores";

/**
 * Let's say we have two types, `User` and `Project`:
 */
type User = { id: string; email: string };
type Project = { name: string; description: string };

/**
 * Next we'll create a store. A store is really just a small abstraction
 * over a key-value database.
 */
const memory = new MemoryStore({ persistentMap: new Map() });

/**
 * The `ctx` object allows the cache to do some background work without
 * blocking the request. In a serverless request handler, use the ctx
 * provided there; in long-running environments, the default stateful
 * context works:
 */
const ctx = new DefaultStatefulContext();

/**
 * We'll create a cache instance with our two types, `User` and `Project`, and
 * configure the cache to use the memory store. We'll also set the `fresh` and
 * `stale` times for each type.
 */
const cache = createCache({
  user: new Namespace<User>(ctx, {
    stores: [memory],
    fresh: 60_000,
    stale: 300_000,
  }),
  project: new Namespace<Project>(ctx, {
    stores: [memory],
    fresh: 300_000,
    stale: 900_000,
  }),
});

/**
 * That's it! Now we can use the cache like this:
 */
await cache.user.set("userId", { id: "userId", email: "user@email.com" });
const user = await cache.user.get("userId");
console.log(user);

/**
 * To make full use of the SWR capabilities, we can use the `swr` method, which
 * automatically handles cache misses and cache updates for us.
 * It checks all stores for the value, and if it's not found, it calls the
 * provided function to load the value and caches it automatically.
 */
const swrUser = await cache.user.swr("userId", async () => {
  return await database.get(...);
});

Tiered caching

Tiered caching is a powerful feature that allows you to chain multiple caches together. This is useful when you want to use a fast, in-memory cache as the first tier and a slower, more persistent cache as the second tier.

import { createCache, DefaultStatefulContext, Namespace } from "@unkey/cache";
import { CloudflareStore, MemoryStore } from "@unkey/cache/stores";

type User = { id: string; email: string };

const memory = new MemoryStore({ persistentMap: new Map() });
const cloudflare = new CloudflareStore({
  domain: "cache.unkey.dev",
  zoneId: process.env.CLOUDFLARE_ZONE_ID!,
  cloudflareApiKey: process.env.CLOUDFLARE_API_KEY!,
});

// In a serverless request handler, use the ctx provided there instead.
const ctx = new DefaultStatefulContext();

const cache = createCache({
  user: new Namespace<User>(ctx, {
    // memory is checked first, then cloudflare if memory misses
    stores: [memory, cloudflare],
    fresh: 60_000,
    stale: 300_000,
  }),
});

await cache.user.set("userId", { id: "userId", email: "user@email.com" });
const user = await cache.user.get("userId");
console.log(user);

Middleware

There are two middlewares available out of the box:

  • Metrics: Collects and forwards metrics on cache hits, misses, latency and evictions.

    import { withMetrics } from "@unkey/cache/middleware";

  • Encryption: Automatically encrypts and decrypts cache values.

    import { withEncryption } from "@unkey/cache/middleware";

Please refer to the documentation for more information on how to use them.
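
As a mental model, a cache middleware here is a store wrapper: it exposes the same get/set surface as the store it wraps and adds behavior around each call. The sketch below illustrates that idea for metrics collection with a hypothetical `Store` interface and `emit` callback; it is not the actual `withMetrics` API, so consult the documentation for the real signatures.

// Hypothetical store interface and metrics wrapper, for illustration only.
interface Store<T> {
  get(key: string): Promise<T | undefined>;
  set(key: string, value: T): Promise<void>;
}

// Wrap any store and report hits, misses and latency to a sink of your choice.
function wrapWithMetrics<T>(
  store: Store<T>,
  emit: (metric: Record<string, unknown>) => void,
): Store<T> {
  return {
    async get(key) {
      const start = performance.now();
      const value = await store.get(key);
      emit({
        metric: "cache.read",
        key,
        hit: value !== undefined,
        latencyMs: performance.now() - start,
      });
      return value;
    },
    async set(key, value) {
      const start = performance.now();
      await store.set(key, value);
      emit({ metric: "cache.write", key, latencyMs: performance.now() - start });
    },
  };
}

In this model, the wrapped store has the same shape as a plain store and can be used anywhere a store is expected; an encryption middleware follows the same pattern, encrypting on `set` and decrypting on `get`.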

Conclusion

At launch we ship with a memory store and a Cloudflare store, but everything is built to be easily extensible. We can add more stores and middleware as needed; let us know what you'd like to see! Whether you're dealing with the limitations of serverless functions or simply need a nice caching abstraction, @unkey/cache has you covered.

As usual, everything is open source; check out our GitHub repository and our documentation for more information. We can't wait to see what you build with it!
