
Type‑Safe Memoization in JavaScript/TypeScript: Reusable, Cache‑Aware Functions that Scale

Learn practical, type‑safe patterns for memoizing pure functions and async pipelines, with reusable helpers that keep your caches predictable.

Introduction

Memoization – storing the result of a function call so that identical future calls can be answered instantly – is a classic performance trick. In JavaScript it’s often introduced with a quick Map or a third‑party library, but when you add TypeScript to the mix the story becomes richer: you can guarantee that the cache keys and values line up with the function’s signature, catch mistakes at compile time, and compose memoized utilities across sync and async boundaries.

This article walks through:

  1. Why type safety matters for memoization
  2. A generic, reusable memoize helper for pure functions
  3. Cache‑aware wrappers for async functions (including cancellation and stale‑while‑revalidate)
  4. Choosing the right cache implementation (Map, WeakMap, LRU, etc.)
  5. Real‑world examples – data‑fetching layer, expensive UI calculations, and a simple GraphQL‑like resolver cache.

By the end you’ll have a small toolkit you can drop into any codebase and a mental model for extending it without breaking type safety.


1. The Type‑Safety Problem

Consider a naïve memoizer:

const memo = (fn) => {
  const cache = new Map();
  return (...args) => {
    const key = JSON.stringify(args);
    if (cache.has(key)) return cache.get(key);
    const result = fn(...args);
    cache.set(key, result);
    return result;
  };
};

The implementation works at runtime, but TypeScript sees fn and the returned wrapper as any (and, under noImplicitAny, rejects them outright). Mistakes such as passing the wrong number of arguments, using non‑serializable values as keys, or returning a different type than expected slip through.

A type‑safe memoizer must:

  • Preserve the original function’s parameter types (...args: Parameters<F>).
  • Preserve the return type (ReturnType<F>), including Promise<T> for async functions.
  • Constrain the cache key to something that can be reliably compared – a primitive (string, number, bigint, …) or a stable object reference.
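These constraints can be summed up in a single type. As a sketch (the alias name Memoized is illustrative, not part of any standard library):

```typescript
// The memoized wrapper mirrors the original function's parameter and
// return types exactly.
type Memoized<F extends (...args: any[]) => any> =
  (...args: Parameters<F>) => ReturnType<F>;

// Illustrative check: a pass-through wrapper typed against `add`
// keeps the (x: number, y: number) => number shape.
const add = (x: number, y: number): number => x + y;
const wrapped: Memoized<typeof add> = (x, y) => add(x, y);
```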

The following sections build a solution that satisfies these constraints.


2. A Generic Sync Memoizer

2.1. Defining a CacheKey Strategy

Instead of stringifying arguments (expensive and fragile), we let callers provide a key selector – a function that maps the argument list to a stable cache key. The selector can return a primitive, a tuple, or even a custom object that implements valueOf/toString.

type KeySelector<A extends unknown[]> = (...args: A) => unknown;
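A couple of illustrative selectors (the alias is repeated so the snippet stands alone; User and the key shapes are examples, not prescriptions):

```typescript
type KeySelector<A extends unknown[]> = (...args: A) => unknown;

// Composite primitive key for a two-argument function.
const distanceKey: KeySelector<[number, number]> = (x, y) => `${x}:${y}`;

// Stable identifier for an object argument – cheaper and more reliable
// than JSON.stringify on the whole object.
type User = { id: string; name: string };
const userKey: KeySelector<[User]> = (user) => user.id;
```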

2.2. The Helper

/**
 * Memoizes a pure function with a user‑supplied key selector.
 *
 * @param fn          The pure function to memoize.
 * @param keySelector Maps the call arguments to a cache key.
 * @param cache       Optional cache implementation (defaults to Map).
 */
function memoize<F extends (...args: any[]) => any>(
  fn: F,
  // Fallback key: stringify the argument list. A fresh args array per call
  // would never match as a Map key, so supply a selector for anything
  // beyond simple, serializable arguments.
  keySelector: KeySelector<Parameters<F>> = (...args) => JSON.stringify(args),
  cache: Map<unknown, ReturnType<F>> = new Map()
): F {
  return ((...args: Parameters<F>): ReturnType<F> => {
    const key = keySelector(...args);
    if (cache.has(key)) {
      return cache.get(key)!;
    }
    const result = fn(...args);
    cache.set(key, result);
    return result;
  }) as F;
}

Why this is type‑safe

  • F is generic, so the wrapper’s signature exactly matches the original.
  • Parameters<F> and ReturnType<F> keep the argument and return types in sync.
  • The cache is typed as Map<unknown, ReturnType<F>>; the key selector can be any type that Map can compare (primitive or object reference).

2.3. Usage Example – Pure Math

// Expensive Fibonacci (inefficient on purpose)
function fib(n: number): number {
  return n <= 1 ? n : fib(n - 1) + fib(n - 2);
}

// Memoized version – key is the number itself (primitive)
const memoFib = memoize(fib, (n) => n);

console.time('first');
console.log(memoFib(40)); // 102334155 (computes)
console.timeEnd('first');

console.time('second');
console.log(memoFib(40)); // 102334155 (cached)
console.timeEnd('second');

Both calls compile with the exact same signature ((n: number) => number), and any attempt to call memoFib('40') would be flagged by TypeScript.
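One subtlety the timings above hide: the recursive calls inside fib invoke the plain, unmemoized function, so only the top‑level result is cached. For the recursion itself to hit the cache, the function must call its memoized self – a sketch (the section‑2 helper is restated compactly so the snippet stands alone):

```typescript
// Compact restatement of the memoize helper from section 2.
function memoize<F extends (...args: any[]) => any>(
  fn: F,
  keySelector: (...args: Parameters<F>) => unknown
): F {
  const cache = new Map<unknown, ReturnType<F>>();
  return ((...args: Parameters<F>): ReturnType<F> => {
    const key = keySelector(...args);
    if (cache.has(key)) return cache.get(key)!;
    const result = fn(...args);
    cache.set(key, result);
    return result;
  }) as F;
}

// Recursive calls go through memoFibRec, so every sub-result is cached
// and the exponential recursion collapses to linear time.
const memoFibRec: (n: number) => number = memoize(
  (n: number): number => (n <= 1 ? n : memoFibRec(n - 1) + memoFibRec(n - 2)),
  (n) => n
);
```

With this shape, even the first call to memoFibRec(40) is fast, because every intermediate Fibonacci number is computed exactly once.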


3. Async Memoization with Stale‑While‑Revalidate

Memoizing async functions brings two extra concerns:

  1. Promise deduplication – multiple callers should share the same in‑flight promise.
  2. Cache invalidation – you often want to return stale data while a fresh request is performed (the “stale‑while‑revalidate” pattern).

3.1. Async Wrapper API

interface AsyncCacheEntry<T> {
  /** Resolved value (may be stale) */
  value?: T;
  /** In‑flight promise, if a request is ongoing */
  promise?: Promise<T>;
  /** Timestamp of the last successful fetch */
  ts: number;
}

/**
 * Memoizes an async function with optional TTL and background refresh.
 *
 * @param fn          Async function to memoize.
 * @param keySelector Maps arguments to a cache key.
 * @param options     TTL in ms and whether to refresh in background.
 */
function memoizeAsync<F extends (...args: any[]) => Promise<any>>(
  fn: F,
  // Same fallback as memoize: a fresh args array would never match as a key.
  keySelector: KeySelector<Parameters<F>> = (...args) => JSON.stringify(args),
  options: { ttl?: number; revalidate?: boolean } = {}
): F {
  const cache = new Map<unknown, AsyncCacheEntry<Awaited<ReturnType<F>>>>();

  return ((...args: Parameters<F>): ReturnType<F> => {
    const key = keySelector(...args);
    const now = Date.now();
    let entry = cache.get(key);

    // Return cached value if fresh
    if (entry && now - entry.ts < (options.ttl ?? Infinity) && entry.value !== undefined) {
      // Optionally kick off background refresh
      if (options.revalidate && !entry.promise) {
        entry.promise = fn(...args).then((v) => {
          entry!.value = v;
          entry!.ts = Date.now();
          entry!.promise = undefined;
          return v;
        });
        // A failed background refresh must not surface as an unhandled
        // rejection; clearing the slot lets a later call retry.
        entry.promise.catch(() => {
          entry!.promise = undefined;
        });
      }
      return Promise.resolve(entry.value) as ReturnType<F>;
    }

    // No fresh value – reuse in‑flight promise or start a new one
    if (!entry) {
      entry = { ts: 0 };
      cache.set(key, entry);
    }
    if (!entry.promise) {
      entry.promise = fn(...args).then((v) => {
        entry!.value = v;
        entry!.ts = Date.now();
        entry!.promise = undefined;
        return v;
      });
      // Clear the slot on rejection so a failed call is not cached forever
      // and the next caller can retry.
      entry.promise.catch(() => {
        entry!.promise = undefined;
      });
    }
    return entry.promise as ReturnType<F>;
  }) as F;
}

3.2. Real‑World Example – Remote User Profile

type User = { id: string; name: string; avatar: string };

async function fetchUser(id: string): Promise<User> {
  const resp = await fetch(`https://api.example.com/users/${id}`);
  if (!resp.ok) throw new Error('Network error');
  return resp.json();
}

// Cache for 5 minutes, refresh in background on stale reads
const getUser = memoizeAsync(fetchUser, (id) => id, {
  ttl: 5 * 60 * 1000,
  revalidate: true,
});

// Component logic (pseudo)
async function renderUser(id: string) {
  const user = await getUser(id); // Fast if cached, otherwise network
  console.log(user.name);
}

If the data is younger than 5 minutes, getUser returns the cached User instantly. When the data becomes stale, the call still resolves quickly with the stale value, while a background request updates the cache for the next callers.

The TypeScript compiler guarantees that getUser can only be called with a string and returns a Promise<User>.


4. Picking the Right Cache Implementation

4.1. Map vs. WeakMap

  • Map – works with any primitive or object key, but entries stay alive until manually deleted. Good for long‑lived caches (e.g., API responses).
  • WeakMap – keys must be objects; entries are garbage‑collected once the key object is no longer referenced elsewhere. Ideal for memoizing methods that receive class instances or DOM nodes:

function memoizeMethod<T extends object, A extends any[], R>(
  fn: (this: T, ...args: A) => R,
  cache = new WeakMap<T, Map<string, R>>()
) {
  return function (this: T, ...args: A): R {
    let inner = cache.get(this);
    if (!inner) {
      inner = new Map();
      cache.set(this, inner);
    }
    const key = JSON.stringify(args);
    if (inner.has(key)) return inner.get(key)!;
    const result = fn.apply(this, args);
    inner.set(key, result);
    return result;
  };
}
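A usage sketch (Matrix and its computation are illustrative; the helper is restated so the snippet stands alone). Because the outer WeakMap is keyed on the instance, each object gets its own inner cache, and both become collectable once the instance is dropped:

```typescript
// Restatement of memoizeMethod: per-instance cache keyed on `this`.
function memoizeMethod<T extends object, A extends any[], R>(
  fn: (this: T, ...args: A) => R,
  cache = new WeakMap<T, Map<string, R>>()
) {
  return function (this: T, ...args: A): R {
    let inner = cache.get(this);
    if (!inner) cache.set(this, (inner = new Map()));
    const key = JSON.stringify(args);
    if (inner.has(key)) return inner.get(key)!;
    const result = fn.apply(this, args);
    inner.set(key, result);
    return result;
  };
}

class Matrix {
  constructor(public values: number[]) {}
  // Illustrative "expensive" per-instance computation.
  scaledSum = memoizeMethod(function (this: Matrix, scale: number): number {
    return this.values.reduce((s, v) => s + v, 0) * scale;
  });
}
```

Calling scaledSum twice with the same argument computes once per instance; a second Matrix keeps its own, independent cache.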

4.2. LRU & Size‑Bounded Caches

For memory‑constrained environments (e.g., serverless functions), an LRU cache prevents unbounded growth. Libraries such as lru-cache already expose generic types, so you can wrap them with the same memoize signature:

// Default import matches older lru-cache majors; recent versions export a
// named LRUCache class instead.
import LRU from 'lru-cache';

function memoizeWithLRU<F extends (...args: any[]) => any>(
  fn: F,
  options: LRU.Options<unknown, ReturnType<F>>,
  keySelector: KeySelector<Parameters<F>> = (...args) => args
): F {
  const cache = new LRU<unknown, ReturnType<F>>(options);
  // lru-cache's has/get/set are Map‑compatible for memoize's purposes,
  // hence the double cast.
  return memoize(fn, keySelector, cache as unknown as Map<unknown, ReturnType<F>>);
}

Now you have a size‑aware, type‑safe memoizer ready for production.
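If pulling in a dependency is overkill, a minimal Map‑compatible LRU can be sketched by exploiting Map's insertion order (the class name LRUMap is illustrative; lru-cache is far more battle‑tested):

```typescript
// Map subclass with a size bound: get() refreshes recency, set() evicts
// the least recently used entry once the cap is exceeded.
class LRUMap<K, V> extends Map<K, V> {
  constructor(private readonly max: number) {
    super();
  }
  get(key: K): V | undefined {
    if (!super.has(key)) return undefined;
    const value = super.get(key)!;
    super.delete(key); // re-insert to mark as most recently used
    super.set(key, value);
    return value;
  }
  set(key: K, value: V): this {
    if (super.has(key)) super.delete(key);
    super.set(key, value);
    if (super.size > this.max) {
      super.delete(super.keys().next().value as K); // evict the oldest entry
    }
    return this;
  }
}
```

Because LRUMap is a genuine Map, it plugs straight into memoize's cache parameter with no casts.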


5. Composing Memoizers

Because each helper returns a function with the exact original signature, they can be composed:

// 1️⃣ Pure, sync heavy calculation
function heavyCalc(a: number, b: number): number {
  // … CPU‑intensive work …
  return a ** b;
}

// 2️⃣ Async fetch that depends on the calculation result
async function fetchData(key: number): Promise<string> {
  const resp = await fetch(`https://api.example.com/data/${key}`);
  return resp.text();
}

// Memoize both layers
const memoHeavyCalc = memoize(heavyCalc);
const memoFetchData = memoizeAsync(fetchData, (k) => k, { ttl: 60_000 });

// Combined usage
async function combined(a: number, b: number): Promise<string> {
  const calc = memoHeavyCalc(a, b);
  return memoFetchData(calc);
}

All three functions keep their original type signatures, making the composition seamless and type‑checked.


6. Testing Memoized Functions

  • Unit test cache hits – expose the internal cache via an optional parameter or use a spy on Map.prototype.has/set.
  • Verify async deduplication – call the memoized async function twice without awaiting, assert that the underlying fetch runs once.
  • Check TTL behavior – manipulate Date.now with jest.useFakeTimers() and confirm stale‑while‑revalidate works as expected.

A Jest sketch of the deduplication check:

test('memoizeAsync dedupes in‑flight calls', async () => {
  const fetcher = jest.fn(async (id: string) => ({ id }));
  const get = memoizeAsync(fetcher, (id) => id);

  const p1 = get('x');
  const p2 = get('x');

  expect(fetcher).toHaveBeenCalledTimes(1);
  await expect(Promise.all([p1, p2])).resolves.toEqual([{ id: 'x' }, { id: 'x' }]);
});

Testing stays straightforward because the public API never leaks implementation details.


7. When Not to Memoize

Memoization shines for pure, deterministic functions where the cost of recomputation outweighs the memory overhead. Avoid it for:

  • Functions with side effects (e.g., logging, random number generation).
  • Highly volatile data where TTL would be minuscule – the cache churn erodes benefits.
  • Large argument objects that lack a stable identifier; using the whole object as a key can cause memory leaks.

In those cases, consider explicit caching (e.g., a service‑layer cache) instead of a generic memoizer.


8. Summary Checklist

  • Signature safety – use Parameters<F> / ReturnType<F> generics.
  • Key strategy – provide a keySelector that returns a stable primitive or object reference.
  • Cache choice – Map for general use, WeakMap for object‑bound lifetimes, LRU for bounded memory.
  • Sync vs. async – memoize for pure functions; memoizeAsync adds promise deduplication and optional TTL/revalidation.
  • Invalidation – TTL, manual cache.delete(key), or external invalidation hooks.
  • Testing – spy on the underlying function, use fake timers for TTL, assert deduplication.

By adhering to these guidelines, you gain the performance benefits of memoization without sacrificing the robustness that TypeScript provides.


9. Final Thoughts

Memoization is more than a quick win; it’s a design pattern that, when typed correctly, becomes a reliable building block for larger systems—data layers, UI calculations, and even serverless edge functions. The generic helpers presented here are deliberately minimal, letting you swap in sophisticated cache back‑ends, add metrics, or integrate with observability tools while keeping the same type‑safe contract.

Give them a try in your next project, profile the impact, and iterate. The payoff is a codebase that feels fast, predictable, and, most importantly, type‑safe from the first call to the last cached response.