Generators and async generators are powerful tools that enable lazy, on-demand data production: values are computed only when requested. Generators underpin many advanced patterns: infinite sequences, custom iterables, pausable computations, and the async iteration protocol used by the Streams API, WebSockets, and server-sent events. In this lesson you will master generator functions, the iterator protocol, async generators with for await...of, and the real-world patterns that show why these features are invaluable for data streaming, pagination, and reactive event handling.
Generators and the Iterator Protocol
| Feature | Syntax / Behaviour |
|---|---|
| Generator function | function* name() { yield value; } |
| yield | Pauses the generator and returns a value to the caller |
| yield* | Delegates to another iterable, flattening its values |
| Generator object | Calling the function returns an iterator, not the values |
| next() | Resumes the generator and returns { value, done } |
| next(val) | Passes a value INTO the generator; it becomes the result of yield |
| return(val) | Terminates the generator early (done: true) |
| throw(err) | Throws an error inside the generator at the yield point |
| Iterable | Object with [Symbol.iterator]() returning an iterator |
| Iterator | Object with next() returning { value, done } |
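The next(val), return(val), and throw(err) rows above rarely appear in beginner code, so here is a minimal sketch of all three in action (the echo and guarded generators are illustrative names, not part of any API):

```javascript
// Two-way yield: next(val) becomes the result of the paused yield
function* echo() {
  try {
    const x = yield 'ready'; // x receives the argument of the NEXT next() call
    yield x;
  } finally {
    // runs when return() (or normal completion) terminates the generator
  }
}

const g = echo();
const s1 = g.next();        // { value: 'ready', done: false }
const s2 = g.next('hello'); // { value: 'hello', done: false }
const s3 = g.return('bye'); // { value: 'bye', done: true }

// throw(err) raises the error at the paused yield point
function* guarded() {
  try {
    yield 1;
  } catch (e) {
    yield 'caught: ' + e.message;
  }
}

const h = guarded();
h.next();                             // { value: 1, done: false }
const t = h.throw(new Error('boom')); // { value: 'caught: boom', done: false }
```

Note that return() still runs the finally block before completing, which is exactly the hook the cleanup patterns later in this lesson rely on.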
Async Generators and for await…of
| Feature | Syntax | Use For |
|---|---|---|
| Async generator | async function* gen() { yield await fetch(...) } | Produce async values lazily |
| for await…of | for await (const val of asyncGen()) { } | Consume async iterables |
| Async iterable | Object with [Symbol.asyncIterator]() | Streams, WebSocket, SSE, paginated APIs |
| ReadableStream | Built-in async iterable in modern browsers | Streaming fetch responses |
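To make the async-iterable row concrete, here is a minimal sketch of a custom class that implements [Symbol.asyncIterator] as an async generator method (the Ticker name and the setTimeout delay are illustrative, standing in for real async work):

```javascript
class Ticker {
  constructor(count) {
    this.count = count;
  }
  // Implementing [Symbol.asyncIterator] as an async generator method
  // makes every instance consumable with for await...of
  async *[Symbol.asyncIterator]() {
    for (let i = 1; i <= this.count; i++) {
      await new Promise(resolve => setTimeout(resolve, 10)); // simulate async work
      yield i;
    }
  }
}

async function collect(asyncIterable) {
  const out = [];
  for await (const value of asyncIterable) out.push(value);
  return out;
}

const ticksPromise = collect(new Ticker(3)); // resolves to [1, 2, 3]
```

The same collect helper works unchanged on any of the async iterables in the table, including a ReadableStream in environments that support async iteration over streams.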
Generator Use Cases
| Use Case | Pattern |
|---|---|
| Infinite sequence | function* counter(start=0) { while(true) yield start++; } |
| Custom range | function* range(start, end, step=1) { for(let i=start; i<end; i+=step) yield i; } |
| Lazy pipeline | Chain generators; each stage transforms values on demand |
| Paginated API | Yield each page; the caller iterates until no more pages |
| Event stream | Async generator yields each message as it arrives |
Generators are lazy: none of the body runs until next() is called. A generator function that yields 1,000,000 numbers does not compute all million values upfront. It computes each one on demand. This makes generators ideal for large datasets, infinite sequences, and expensive computations where you want to process items one at a time without loading everything into memory.

Any object with a [Symbol.asyncIterator] method can be used with for await...of. This includes ReadableStream (streaming fetch), EventSource (SSE), WebSocket message sequences, and any async generator. You can make your own classes async-iterable by implementing [Symbol.asyncIterator] as an async generator method.

When you break out of a for await...of loop, the generator’s return() method is called automatically, but underlying resources (open connections, event listeners, timers) may not be cleaned up unless you handle this in the generator’s try...finally block. Always use try/finally in generators that hold resources.

Basic Example
// ── Basic generator ──────────────────────────────────────────────────────
function* counter(start = 0) {
while (true) {
yield start++;
}
}
const gen = counter(1);
console.log(gen.next()); // { value: 1, done: false }
console.log(gen.next()); // { value: 2, done: false }
console.log(gen.next()); // { value: 3, done: false }
// ── Range generator – lazy sequence ──────────────────────────────────────
function* range(start, end, step = 1) {
for (let i = start; i < end; i += step) {
yield i;
}
}
console.log([...range(0, 10, 2)]); // [0, 2, 4, 6, 8]
for (const n of range(1, 6)) {
console.log(n); // 1, 2, 3, 4, 5
}
// ── Finite generator with return value ───────────────────────────────────
function* processItems(items) {
let processed = 0;
for (const item of items) {
yield item; // pause after each item
processed++;
}
return processed; // return value on done
}
const processor = processItems(['a', 'b', 'c']);
console.log(processor.next()); // { value: 'a', done: false }
console.log(processor.next()); // { value: 'b', done: false }
console.log(processor.next()); // { value: 'c', done: false }
console.log(processor.next()); // { value: 3, done: true } ← return value
// ── yield* – delegate to another iterable ────────────────────────────────
function* flatten(arr) {
for (const item of arr) {
if (Array.isArray(item)) yield* flatten(item); // recurse
else yield item;
}
}
console.log([...flatten([1, [2, [3, 4]], 5])]); // [1, 2, 3, 4, 5]
// ── Custom iterable object ───────────────────────────────────────────────
class InfiniteCounter {
constructor(start = 0, step = 1) {
this.current = start;
this.step = step;
}
[Symbol.iterator]() {
return {
current: this.current,
step: this.step,
next() {
return { value: (this.current += this.step), done: false }; // note: increments first, so the first value is start + step
}
};
}
*take(n) {
let count = 0;
for (const val of this) {
if (count++ >= n) return;
yield val;
}
}
}
const evens = new InfiniteCounter(0, 2);
console.log([...evens.take(5)]); // [2, 4, 6, 8, 10]
// ── Async generator – paginated API ──────────────────────────────────────
async function* fetchAllPages(baseUrl) {
let page = 1;
let hasMore = true;
while (hasMore) {
const response = await fetch(`${baseUrl}?page=${page}&limit=20`);
if (!response.ok) throw new Error(`HTTP ${response.status}`);
const { data, meta } = await response.json();
yield* data; // yield each item from this page
hasMore = meta.page < meta.totalPages;
page++;
}
}
// Consume all users across all pages, lazily
// (run inside an async function, or a module with top-level await)
for await (const user of fetchAllPages('/api/users')) {
console.log(user.name); // processes each user as pages load
if (user.role === 'admin') break; // stop early: no more fetches
}
// ── Async generator – Server-Sent Events ─────────────────────────────────
async function* streamEvents(url) {
const response = await fetch(url, {
headers: { Accept: 'text/event-stream' },
});
const reader = response.body.getReader();
const decoder = new TextDecoder();
try {
while (true) {
const { value, done } = await reader.read();
if (done) break;
const text = decoder.decode(value, { stream: true });
const lines = text.split('\n').filter(Boolean);
for (const line of lines) {
const data = line.replace(/^data: /, '').trim();
if (data === '[DONE]') return;
try { yield JSON.parse(data); } catch {}
}
}
} finally {
reader.cancel(); // cleanup when iteration stops
}
}
// Stream AI response tokens into a DOM element (output is assumed to exist)
for await (const chunk of streamEvents('/api/ai/stream')) {
output.textContent += chunk.delta;
}
How It Works
Step 1 – Calling a Generator Function Returns an Iterator
Calling counter() does not run any code; it creates and returns a generator object (which is an iterator). The body only runs when you call .next(). The generator runs until it hits a yield, pauses, and returns { value: yieldedValue, done: false }. When the function body ends (or hits return), done becomes true.
Step 2 – yield Is a Two-Way Channel
yield value sends value out to the caller. But the expression const x = yield value means that when the caller calls gen.next(someValue), x inside the generator receives someValue. This two-way communication makes generators powerful for coroutines and cooperative multitasking.
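A small sketch of this two-way flow, using a hypothetical running-total coroutine:

```javascript
function* adder() {
  let total = 0;
  while (true) {
    // yield sends total OUT; the next next(n) call sends n back IN
    const n = yield total;
    total += n;
  }
}

const sum = adder();
const first = sum.next();       // prime to the first yield → { value: 0, done: false }
const afterFive = sum.next(5);  // { value: 5, done: false }
const afterThree = sum.next(3); // { value: 8, done: false }
```

The first next() call carries no value into the generator (there is no paused yield to receive it yet), which is why coroutine-style generators need this priming call.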
Step 3 – for…of and Spread Work With Any Iterable
Any object implementing the iterator protocol (or having [Symbol.iterator]) works with for...of, spread [...gen], destructuring, Array.from, and Promise.all. Generators automatically implement this protocol; every generator function returns an iterable iterator.
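For instance, a single generator function feeds every one of these iterable-consuming constructs (the rgb generator is purely illustrative):

```javascript
function* rgb() {
  yield 'red';
  yield 'green';
  yield 'blue';
}

const spread = [...rgb()];           // spread collects all values
const [first, second] = rgb();       // destructuring takes the first two
const fromArray = Array.from(rgb()); // Array.from drains the iterator
const upper = Array.from(rgb(), c => c.toUpperCase()); // with a mapping function
```

Each call to rgb() produces a fresh iterator, so the four consumers above do not interfere with one another.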
Step 4 – Async Generators Combine Generators and Promises
An async function* can both await Promises and yield values. Each call to next() on an async generator returns a Promise of { value, done } that resolves when the value is ready. for await...of automatically awaits each of these Promises, making the consumption code read sequentially even though each iteration involves async I/O.
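A minimal sketch of this combination, where setTimeout stands in for real async I/O:

```javascript
async function* delayedNumbers() {
  for (const n of [1, 2, 3]) {
    await new Promise(resolve => setTimeout(resolve, 5)); // simulate async I/O
    yield n; // the consumer receives n only once this point is reached
  }
}

async function sumAll() {
  let total = 0;
  // each iteration transparently awaits the next { value, done } Promise
  for await (const n of delayedNumbers()) total += n;
  return total;
}

const totalPromise = sumAll(); // resolves to 6
```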
Step 5 – try/finally Ensures Resource Cleanup
When a for...of or for await...of loop exits early (via break, return, or an exception), the iterator’s return() method is called. In a generator, this triggers the finally block. This is where you close open connections, cancel pending requests, and remove event listeners, ensuring resources are always released regardless of how the loop exits.
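The following sketch traces the order of events when a loop breaks early (the log array is only there to make the sequence observable):

```javascript
const log = [];

function* withCleanup() {
  try {
    yield 1;
    yield 2;
    yield 3;
  } finally {
    // triggered by the automatic return() call when the loop exits early
    log.push('cleaned up');
  }
}

for (const n of withCleanup()) {
  log.push(n);
  if (n === 2) break; // early exit → iterator.return() → finally block
}
// log is now [1, 2, 'cleaned up']
```

In real code the finally block would hold reader.cancel(), socket.close(), or clearInterval() instead of a log entry.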
Real-World Example: Streaming Data Pipeline
// streaming-pipeline.js
// Composable lazy pipeline stages
function* map(iterable, fn) {
for (const item of iterable) yield fn(item);
}
function* filter(iterable, fn) {
for (const item of iterable) { if (fn(item)) yield item; }
}
function* take(iterable, n) {
let count = 0;
for (const item of iterable) {
if (count++ >= n) return;
yield item;
}
}
function* chunk(iterable, size) {
let batch = [];
for (const item of iterable) {
batch.push(item);
if (batch.length === size) { yield batch; batch = []; }
}
if (batch.length) yield batch;
}
// Build lazy pipelines: nothing is computed until iterated
function* naturals(start = 1) {
while (true) yield start++;
}
const pipeline = take(
filter(
map(naturals(), n => n * n), // square each number
n => n % 2 === 0 // keep even squares
),
5 // take first 5
);
console.log([...pipeline]); // [4, 16, 36, 64, 100]
// Async pipeline – transform a stream of JSON objects
async function* transformStream(asyncIterable) {
for await (const raw of asyncIterable) {
// Transform each item as it arrives
yield {
...raw,
fullName: `${raw.firstName} ${raw.lastName}`,
createdAt: new Date(raw.createdAt),
};
}
}
// Process in batches of 10 (saveBatch is a placeholder for your persistence call)
async function processBatches(iterable, batchSize) {
let batch = [];
for await (const item of iterable) {
batch.push(item);
if (batch.length === batchSize) {
await saveBatch(batch);
batch = [];
}
}
if (batch.length) await saveBatch(batch);
}
// Compose: fetch all pages → transform → batch save
await processBatches(
transformStream(fetchAllPages('/api/users')),
10
);
Common Mistakes
Mistake 1 – Spreading an infinite generator
❌ Wrong: hangs or crashes with out-of-memory:
function* naturals() { let n = 0; while(true) yield n++; }
const all = [...naturals()]; // tries to collect infinite values!
✅ Correct: use take() or a for…of with a break condition:
const first10 = [...take(naturals(), 10)]; // safe
Mistake 2 – Not using try/finally for resource cleanup
❌ Wrong: reader left open if the loop breaks early:
async function* streamData(url) {
const reader = (await fetch(url)).body.getReader();
while (true) {
const { value, done } = await reader.read();
if (done) break;
yield value;
// If caller breaks early โ reader never cancelled!
}
}
✅ Correct: always clean up in finally:
async function* streamData(url) {
const reader = (await fetch(url)).body.getReader();
try {
while (true) {
const { value, done } = await reader.read();
if (done) break;
yield value;
}
} finally {
reader.cancel(); // always runs โ even on early break
}
}
Mistake 3 – Using for…of on an async generator instead of for await…of
❌ Wrong: throws a TypeError, because async generators implement [Symbol.asyncIterator] but not [Symbol.iterator]:
for (const item of asyncGenerator()) {
console.log(item); // never runs – the for...of throws immediately
}
✅ Correct: use for await…of for async generators:
for await (const item of asyncGenerator()) {
console.log(item); // resolved value
}
Quick Reference
| Feature | Syntax |
|---|---|
| Generator function | function* gen() { yield 1; yield 2; } |
| Advance iterator | gen.next() → { value, done } |
| Delegate | yield* otherIterable |
| Custom iterable | [Symbol.iterator]() { return { next() { ... } } } |
| Async generator | async function* gen() { yield await fetch(...) } |
| Consume async | for await (const x of asyncGen()) { } |
| Lazy pipeline | Chain generator functions; each transforms on demand |
| Safe cleanup | try { yield ... } finally { resource.close() } |