Structured Concurrency in JavaScript / Node.js: Harnessing AbortController & Async Generators
Using AbortController and async generators to keep parallel work predictable and leak‑free
Introduction
JavaScript’s promises are famously non‑cancellable – once started, a promise keeps running until it settles, even if the surrounding request has already finished. In a server that handles thousands of concurrent requests, that behaviour can lead to resource leaks, orphaned I/O, and hard‑to‑debug race conditions.
Structured concurrency is a discipline that forces every asynchronous task to be bound to a well‑defined scope. When the scope ends—whether because of success, failure, or explicit cancellation—all child tasks are automatically terminated. The idea originates from languages like Go and Kotlin, but the pattern can be expressed in plain JavaScript with two modern primitives:
| Primitive | Role in the pattern |
|---|---|
| AbortController | Propagates cancellation signals across APIs that accept an AbortSignal. |
| Async generators | Define a structured block whose finally clause runs when the consumer stops iterating, giving us a deterministic cleanup point. |
This article shows how to combine them into a reusable “structured concurrency” helper, then walks through two realistic Node.js use‑cases.
1. The core idea of structured concurrency
In a structured model:
- Parent → Child relationship – Every async operation is started by a parent task.
- Lifetime inheritance – Children cannot outlive their parent. When the parent finishes, all children are cancelled.
- Deterministic cleanup – Resources (sockets, file handles, timers) are released exactly once, in a predictable order.
Contrast this with the “fire‑and‑forget” style that is common in JavaScript:
```js
// Bad: fire‑and‑forget HTTP request
fetch(url).then(r => r.json()).then(data => console.log(data));
```
If the surrounding request handler returns early, the fetch continues in the background, consuming a socket and possibly writing to a logger that is no longer relevant.
Structured concurrency eliminates that gap.
2. AbortController – the cancellation signal
AbortController is defined in the WHATWG DOM spec (and used throughout the Fetch API), and it has been adopted by many Node.js APIs (fs.promises, http, timers/promises, third‑party libraries). It works like this:
```js
const controller = new AbortController();
const { signal } = controller;

// Pass the signal to any API that supports it
fetch(url, { signal })
  .then(r => r.text())
  .catch(err => {
    if (err.name === 'AbortError') console.log('Request cancelled');
    else throw err;
  });

// Somewhere else…
controller.abort(); // triggers AbortError in all listeners
```
The signal can be shared among many operations, making it a natural “cancellation token” for a whole group of tasks.
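As a minimal, runnable sketch of that idea — using Node’s promisified timers as stand‑ins for real I/O — one `abort()` call cancels a whole group of pending operations:

```js
import { setTimeout as delay } from 'node:timers/promises';

async function main() {
  const controller = new AbortController();
  const { signal } = controller;

  // Three tasks that all honour the same signal; each reports how it ended
  const tasks = [1000, 2000, 3000].map(ms =>
    delay(ms, `slept ${ms}ms`, { signal }).catch(err => `cancelled: ${err.name}`)
  );

  // One call cancels the whole group
  controller.abort();
  return Promise.all(tasks);
}
```

Each pending timer rejects with an `AbortError` the moment the shared controller aborts.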
3. Async generators as a structured block
An async generator can be used as a scope because its finally block runs when the consumer stops iterating—whether because the loop completed, an exception was thrown, or the consumer called return().
```js
async function* scopedTasks(controller) {
  try {
    // Suspend here and hand control back to the caller;
    // the scope stays open while the caller does its work
    yield;
  } finally {
    // This runs exactly once when the scope ends
    controller.abort();
  }
}
```
When the caller consumes the generator with for await … of, the finally clause guarantees that the AbortController is signaled exactly when the loop exits.
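A tiny sketch makes this concrete: breaking out of a `for await … of` loop implicitly calls the generator’s `return()`, which runs its `finally` block:

```js
let cleanedUp = false;

async function* numbers() {
  try {
    let i = 0;
    while (true) yield i++;
  } finally {
    cleanedUp = true; // deterministic cleanup point
  }
}

async function main() {
  for await (const n of numbers()) {
    if (n === 2) break; // implicitly calls gen.return()
  }
  return cleanedUp;
}
```

After the loop exits, the cleanup flag is guaranteed to be set — this is the hook the helper below relies on.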
4. Building a reusable helper
Below is a compact library that gives us a structured concurrency context:
```js
// structured.js
export async function withStructuredConcurrency(fn) {
  const controller = new AbortController();

  // The async generator that defines the scope
  async function* scope() {
    try {
      // Stay suspended here while the scope is active
      yield;
    } finally {
      // Ensure cancellation on scope exit
      controller.abort();
    }
  }

  const gen = scope();
  // Advance to the first yield – now the scope is active
  await gen.next();
  try {
    // Pass the signal so the user can wire it into I/O
    return await fn(controller.signal);
  } finally {
    // Explicitly close the generator to trigger its finally block
    await gen.return();
  }
}
```
Usage pattern:
```js
import { withStructuredConcurrency } from './structured.js';

await withStructuredConcurrency(async signal => {
  // All async work here receives the same `signal`
  const [a, b] = await Promise.all([
    fetch('https://api.example.com/a', { signal }).then(r => r.json()),
    fetch('https://api.example.com/b', { signal }).then(r => r.json()),
  ]);
  console.log(a, b);
});
```
If any promise rejects, the finally block aborts the controller, cancelling the other request automatically.
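To see that cancellation chain end to end, here is a runnable sketch that inlines the helper from above and uses a timer as a stand‑in for the second request; when one task rejects, the sibling observes an `AbortError`:

```js
import { setTimeout as delay } from 'node:timers/promises';

// Inlined copy of the helper from the section above
async function withStructuredConcurrency(fn) {
  const controller = new AbortController();
  async function* scope() {
    try { yield; } finally { controller.abort(); }
  }
  const gen = scope();
  await gen.next();
  try {
    return await fn(controller.signal);
  } finally {
    await gen.return();
  }
}

async function main() {
  let siblingOutcome = 'pending';
  let scopeError = null;
  try {
    await withStructuredConcurrency(async signal => {
      const slow = delay(5000, 'done', { signal });
      slow.catch(err => { siblingOutcome = err.name; });
      await Promise.all([slow, Promise.reject(new Error('boom'))]);
    });
  } catch (err) {
    scopeError = err.message;
  }
  await new Promise(resolve => setImmediate(resolve)); // let the abort settle
  return { scopeError, siblingOutcome };
}
```

The failing task’s error surfaces to the caller, while the slow sibling is torn down instead of running for five more seconds.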
5. Real‑world example 1 – Parallel API aggregation
Imagine a microservice that needs to call three downstream services to build a response. The latency budget is 200 ms; if any call exceeds that, the whole request should be aborted.
```js
import { withStructuredConcurrency } from './structured.js';
import { setTimeout as delay } from 'timers/promises';

async function aggregate(req, res) {
  try {
    const result = await withStructuredConcurrency(async signal => {
      // Enforce the 200 ms budget: abort when either the scope's signal
      // fires or the timeout elapses (AbortSignal.any needs Node >= 20.3)
      const budget = AbortSignal.any([signal, AbortSignal.timeout(200)]);

      // Helper that respects the combined signal
      const fetchJson = (url) => fetch(url, { signal: budget }).then(r => r.json());

      // Start three calls in parallel
      const [profile, orders, recommendations] = await Promise.all([
        fetchJson('https://users.internal/profile/' + req.userId),
        fetchJson('https://orders.internal/recent/' + req.userId),
        fetchJson('https://recs.internal/personal/' + req.userId),
      ]);

      // Simulate some follow-up async work that also respects cancellation
      await delay(50, undefined, { signal: budget });

      return { profile, orders, recommendations };
    });
    res.json(result);
  } catch (err) {
    if (err.name === 'AbortError' || err.name === 'TimeoutError') {
      res.status(504).json({ error: 'Upstream timeout' });
    } else {
      res.status(500).json({ error: err.message });
    }
  }
}
```
What we gain

- Automatic cancellation – If `profile` fails, `orders` and `recommendations` are aborted immediately.
- Single source of truth – The `signal` is passed to every I/O call, no hidden timers.
- Clear lifetime – The request handler cannot accidentally leave a dangling fetch after it has responded.
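A latency budget can be enforced without hand‑rolled timers by combining the scope’s signal with `AbortSignal.timeout()` via `AbortSignal.any()` (Node ≥ 20.3). The sketch below uses a 50 ms budget and a timer standing in for the slow upstream call:

```js
import { setTimeout as delay } from 'node:timers/promises';

async function main() {
  const scope = new AbortController();
  // Abort when either the scope ends or the 50 ms budget elapses
  const signal = AbortSignal.any([scope.signal, AbortSignal.timeout(50)]);
  try {
    await delay(5000, undefined, { signal }); // stands in for a slow upstream call
    return 'completed';
  } catch {
    return signal.reason.name; // the source that aborted first
  }
}
```

The combined signal’s `reason` tells you which source fired, so a timeout is distinguishable from an explicit scope abort.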
6. Real‑world example 2 – Streaming file processing
Node.js streams are already “lazy”, but when you combine a file read with a remote upload you still need to guarantee that both ends close together.
```js
import { createReadStream } from 'fs';
import { pipeline } from 'stream/promises';
import { request } from 'https';
import { withStructuredConcurrency } from './structured.js';

async function uploadFile(filePath, uploadUrl) {
  await withStructuredConcurrency(async signal => {
    const fileStream = createReadStream(filePath, { highWaterMark: 64 * 1024 });

    // Create the HTTPS request up front; the shared signal aborts it on scope exit
    const req = request(uploadUrl, { method: 'PUT', signal });

    // Settles once the server has answered
    const done = new Promise((resolve, reject) => {
      req.on('response', res => {
        res.resume(); // drain the response body
        if (res.statusCode >= 200 && res.statusCode < 300) resolve();
        else reject(new Error(`Upload failed: ${res.statusCode}`));
      });
      req.on('error', reject);
    });

    // Pipe the file into the request; pipeline tears down on abort
    await pipeline(
      fileStream,
      // Transform that checks the signal on each chunk (optional)
      async function* (source) {
        for await (const chunk of source) {
          if (signal.aborted) throw new DOMException('Aborted', 'AbortError');
          yield chunk;
        }
      },
      req,
      { signal }
    );

    // Wait for the server's verdict
    await done;
  });
}
```
If the client disconnects or a timeout fires, the outer withStructuredConcurrency aborts the signal, causing the pipeline to tear down the file stream and the HTTP request instantly—no half‑written files left on the server.
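`pipeline` from `stream/promises` accepts a `signal` option directly, which is the simplest way to observe this teardown behaviour. A self‑contained sketch, with an endless in‑memory source standing in for the file:

```js
import { pipeline } from 'node:stream/promises';
import { Readable, Writable } from 'node:stream';
import { setTimeout as delay } from 'node:timers/promises';

async function main() {
  const controller = new AbortController();

  // An endless, slow source standing in for a large file
  const source = Readable.from((async function* () {
    for (let i = 0; ; i++) {
      yield `chunk-${i}`;
      await delay(10);
    }
  })());

  const sink = new Writable({ write(chunk, enc, cb) { cb(); } });

  // Cancel mid-stream, as a client disconnect or timeout would
  setTimeout(() => controller.abort(), 30);

  try {
    await pipeline(source, sink, { signal: controller.signal });
    return 'completed';
  } catch (err) {
    return err.name; // pipeline tears both ends down on abort
  }
}
```

Both the source and the sink are destroyed by `pipeline` when the signal fires, so nothing keeps reading or buffering in the background.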
7. Error handling & cancellation propagation
Structured concurrency does not hide errors; it merely propagates them. Inside the scoped function you can:
- Catch and re‑throw – to add context.
- Inspect `signal.aborted` – to decide whether to perform extra cleanup.
- Use `AbortSignal.throwIfAborted()` (Node ≥ 17.3) for concise checks.
```js
await withStructuredConcurrency(async signal => {
  try {
    await someAsyncOp({ signal });
  } catch (e) {
    // Add domain‑specific info, then rethrow (keep the original as `cause`)
    throw new Error(`someAsyncOp failed: ${e.message}`, { cause: e });
  }
});
```
When a child throws, the helper aborts the controller first, then re‑throws the original error, ensuring that all siblings stop before the error bubbles up.
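`throwIfAborted()` is a compact way to place such checkpoints in a processing loop; in this sketch, the abort reason set via `abort(new Error(...))` is exactly what surfaces:

```js
async function main() {
  const controller = new AbortController();
  const { signal } = controller;
  const processed = [];
  try {
    for (const item of ['a', 'b', 'c', 'd']) {
      signal.throwIfAborted(); // throws signal.reason once aborted
      processed.push(item.toUpperCase());
      // Simulate a mid-loop cancellation after two items
      if (processed.length === 2) controller.abort(new Error('budget exceeded'));
    }
    return { processed, error: null };
  } catch (err) {
    return { processed, error: err.message };
  }
}
```

The loop stops at the first checkpoint after the abort, and the custom reason propagates as the thrown error.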
8. Testing and debugging
Unit testing
Because the helper returns a promise, you can test cancellation paths with Jest or Vitest:
```js
test('cancels sibling when one fails', async () => {
  let sharedSignal;
  const slow = jest.fn(({ signal }) =>
    new Promise((_, reject) =>
      signal.addEventListener('abort', () => reject(signal.reason))
    )
  );
  const fast = jest.fn(() => Promise.reject(new Error('boom')));

  await expect(
    withStructuredConcurrency(async signal => {
      sharedSignal = signal;
      await Promise.all([slow({ signal }), fast({ signal })]);
    })
  ).rejects.toThrow('boom');

  // The scope aborted the shared signal, cancelling `slow`
  expect(slow).toHaveBeenCalled();
  expect(sharedSignal.aborted).toBe(true);
});
```
Debugging
AbortSignal exposes a reason property (Node ≥ 17.2) that can carry a custom error object, making it easier to trace why a cancellation happened.

```js
const controller = new AbortController();
const { signal } = controller;

controller.abort(new Error('Request timed out'));
console.log(signal.reason.message); // "Request timed out"
```
9. Best practices checklist
| ✅ | Recommendation |
|---|---|
| Scope everything | Wrap each request/transaction in a withStructuredConcurrency block. |
| Pass the signal everywhere | Any API that accepts an AbortSignal should receive the same one. |
| Prefer async generators for custom resources | Use finally to close DB connections, release mutexes, or stop timers. |
| Avoid hidden fire‑and‑forget | If you start a promise without awaiting it, make sure it also receives the shared signal. |
| Document cancellation semantics | Consumers need to know whether a function respects the signal. |
| Set reasonable timeouts | Combine the scope's signal with AbortSignal.timeout() (or a setTimeout that calls abort()) to enforce upper bounds. |
10. Conclusion
Structured concurrency brings the same safety guarantees that languages like Go enjoy to the JavaScript ecosystem, without requiring a new runtime. By pairing AbortController with async generators, you get:
- Deterministic lifetimes – no stray network calls or file handles.
- Automatic cancellation – a single `abort()` call shuts down an entire tree of work.
- Cleaner error handling – failures are propagated in a predictable order.
Adopt the pattern in your next Node.js service, and you’ll find that reasoning about parallelism becomes as straightforward as reading a well‑indented block of code.
Happy structuring!