Next.js Discord


How to Cache Redis GETs Within a Single Vercel Fluid Instance (Next.js 15 + Upstash)

Unanswered
Snowshoe posted this in #help-forum
SnowshoeOP
Hi! I’m developing a Next.js 15 app, using Upstash Redis for caching and deploying on Vercel with Fluid Compute.

My goal is to cache and deduplicate all Redis GET responses within a single Fluid lambda container, so that any serverless function running in that container can check a shared variable and avoid calling Redis again if the data has already been fetched.

I want:

A global in-memory cache inside the Fluid container

Deduplication across concurrent functions in the same container

To only call redis.get() if we haven't already fetched the key

Deduplication and every possible optimization are important to me.
If you’ve done anything similar or have suggestions on best practices, I’d really appreciate the help!
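For the in-flight deduplication part on its own, the core pattern is a module-scope Map of promises: the Map outlives individual requests in the same instance, so concurrent callers share one underlying fetch. A minimal, self-contained sketch, where `fetchValue` stands in for `redis.get()` and all names are illustrative, not from an existing API:

```typescript
// Module-scope: survives across requests served by the same Fluid instance.
const inFlight = new Map<string, Promise<unknown>>();

let fetchCount = 0; // instrumentation for this example only
async function fetchValue(key: string): Promise<string> {
  fetchCount++;
  return `value-for-${key}`; // a real implementation would call redis.get(key)
}

async function dedupedGet(key: string): Promise<string> {
  let p = inFlight.get(key) as Promise<string> | undefined;
  if (!p) {
    // Register the promise before awaiting so concurrent callers find it,
    // and clear it once settled so later calls fetch fresh data.
    p = fetchValue(key).finally(() => inFlight.delete(key));
    inFlight.set(key, p);
  }
  return p;
}

// Three concurrent callers share one underlying fetch:
const results = await Promise.all([
  dedupedGet("user:1"),
  dedupedGet("user:1"),
  dedupedGet("user:1"),
]); // all three resolve to "value-for-user:1"; fetchValue ran once
```

This only dedupes concurrent calls; it deliberately does not cache settled values, which is what the extra layers in the fuller snippet below are for.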

4 Replies

SnowshoeOP
import { type Redis } from "@upstash/redis";

const memoryDedupe = new Map<string, Promise<unknown>>(); // Dedupe in-flight fetcher calls
const redisDedupe = new Map<string, Promise<unknown>>(); // Dedupe concurrent Redis GETs
const redisCache = new Map<string, unknown>(); // Container-scoped cache: never expires, lives as long as the Fluid instance

export function createCachedFetcher(redis: Redis) {
  return async function cachedFetch<T>(
    key: string,
    fetcher: () => Promise<T>,
    ttl = 900, // TTL for cached data in seconds
    lockTtl = 5 // TTL for Redis lock in seconds
  ): Promise<T> {
    // Layer 1: container-level memory cache
    if (redisCache.has(key)) {
      return redisCache.get(key) as T;
    }

    // Layer 2: Deduplicated Redis GET; the entry clears itself once the GET settles
    if (!redisDedupe.has(key)) {
      redisDedupe.set(
        key,
        redis.get<T>(key).finally(() => redisDedupe.delete(key))
      );
    }

    const cached = (await redisDedupe.get(key)) as T | null | undefined;

    // Compare against null/undefined, not truthiness, so falsy values (0, "", false) still count as hits
    if (cached !== null && cached !== undefined) {
      redisCache.set(key, cached);
      return cached;
    }

    // Layer 3: In-flight request deduplication
    if (memoryDedupe.has(key)) {
      return memoryDedupe.get(key) as Promise<T>;
    }

    // Layer 4: Distributed lock with Redis (SET NX)
    const lockKey = `lock:${key}`;
    const hasLock = await redis.set(lockKey, "1", { nx: true, ex: lockTtl });

    if (hasLock === "OK") {
      const promise = fetcher()
        .then(async (data) => {
          await redis.set(key, data, { ex: ttl });
          redisCache.set(key, data); // Populate the container cache
          return data;
        })
        .finally(() => {
          memoryDedupe.delete(key);
          redis.del(lockKey);
        });

      memoryDedupe.set(key, promise);
      return promise;
    } else {
      // Someone else holds the lock; back off, then re-check Redis
      await new Promise((resolve) => setTimeout(resolve, 500));

      const retry = await redis.get<T>(key);
      if (retry !== null && retry !== undefined) {
        redisCache.set(key, retry);
        return retry;
      }

      // Fallback fetch if the lock holder hasn't written yet
      const promise = fetcher()
        .then(async (data) => {
          await redis.set(key, data, { ex: ttl });
          redisCache.set(key, data);
          return data;
        })
        .finally(() => {
          memoryDedupe.delete(key); // Avoid leaking a settled promise forever
        });

      memoryDedupe.set(key, promise);
      return promise;
    }
  };
}
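The SET NX locking in Layer 4 can be tried in isolation. The sketch below swaps in an in-memory `FakeRedis` stand-in (an illustrative stub, not the Upstash client, implementing just enough of `get`/`set`/`del` for the demo) and polls for the lock holder's result rather than relying on a single fixed sleep:

```typescript
// In-memory stand-in so the lock pattern runs without Redis credentials.
class FakeRedis {
  private store = new Map<string, string>();
  async set(key: string, value: string, opts?: { nx?: boolean }): Promise<"OK" | null> {
    if (opts?.nx && this.store.has(key)) return null; // NX: only set if absent
    this.store.set(key, value);
    return "OK";
  }
  async get(key: string): Promise<string | null> {
    return this.store.get(key) ?? null;
  }
  async del(key: string): Promise<void> {
    this.store.delete(key);
  }
}

const redis = new FakeRedis();
let fetches = 0; // counts expensive fetches, for the demo only

async function fetchWithLock(key: string): Promise<string> {
  const lockKey = `lock:${key}`;
  const hasLock = await redis.set(lockKey, "1", { nx: true });
  if (hasLock === "OK") {
    try {
      fetches++;
      const data = `fresh-${key}`; // the expensive fetch would go here
      await redis.set(key, data);
      return data;
    } finally {
      await redis.del(lockKey); // always release the lock
    }
  }
  // Lost the race: poll briefly for the winner's result instead of refetching.
  for (let i = 0; i < 10; i++) {
    const cached = await redis.get(key);
    if (cached !== null) return cached;
    await new Promise((r) => setTimeout(r, 50));
  }
  fetches++;
  return `fresh-${key}`; // fallback if the lock holder never wrote
}
```

With a real Redis the same shape applies, except the lock SET should carry an `ex` so a crashed holder can't wedge the key forever, as the fuller snippet above already does.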
why don't you cache directly using Next.js?
SnowshoeOP
I wanted to opt out of the Next.js cache since it felt like a black box
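One thing to watch if you stay with the hand-rolled approach: a plain module-scope Map never expires, so entries can serve stale data for as long as the Fluid instance lives. A minimal TTL'd map sketch, assuming lazy eviction is acceptable (`TtlCache` is an illustrative name, not an existing API):

```typescript
// Container-level cache whose entries expire after ttlMs, evicted lazily on read.
class TtlCache<T> {
  private entries = new Map<string, { value: T; expiresAt: number }>();
  constructor(private ttlMs: number) {}

  get(key: string): T | undefined {
    const entry = this.entries.get(key);
    if (!entry) return undefined;
    if (Date.now() >= entry.expiresAt) {
      this.entries.delete(key); // expired: drop it and report a miss
      return undefined;
    }
    return entry.value;
  }

  set(key: string, value: T): void {
    this.entries.set(key, { value, expiresAt: Date.now() + this.ttlMs });
  }
}

const cache = new TtlCache<string>(100); // 100 ms TTL for the demo
cache.set("user:1", "alice");
```

Swapping this in for the plain `redisCache` Map would bound staleness without giving up the container-level hit rate.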