A generator function is a function that uses yield instead of return. When called, it returns a generator object — an iterator — without executing any code yet. Each time next() is called on the generator, execution resumes from where it was suspended (the last yield) and runs until it hits the next yield. This lazy execution model is what makes generators memory-efficient: a generator that produces a million values never stores them all in memory — it produces them one at a time on demand. FastAPI’s StreamingResponse, SQLAlchemy’s yield_per(), and many async patterns all rely on this lazy evaluation model.
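To make the memory claim concrete, here is a quick sketch comparing a materialized list with an equivalent generator expression (the exact byte counts vary by Python version and are illustrative only):

```python
import sys

big_list = [n for n in range(1_000_000)]   # all million values materialized at once
big_gen = (n for n in range(1_000_000))    # nothing computed yet

print(sys.getsizeof(big_list))  # millions of bytes
print(sys.getsizeof(big_gen))   # a few hundred bytes, independent of range size
```

The generator object stays the same size no matter how many values it will eventually produce, because it only holds its suspended execution state, not the values themselves.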
Generator Functions with yield
# A generator function: uses yield instead of return
def count_up(start: int, stop: int):
    current = start
    while current <= stop:
        yield current    # suspend here, return value to caller
        current += 1     # resumes here on next next() call
# Calling the function returns a generator object — no code runs yet
gen = count_up(1, 5)
print(type(gen)) # <class 'generator'>
# Values produced lazily on each next() call
print(next(gen)) # 1
print(next(gen)) # 2
# Or iterate with for loop
for n in count_up(1, 5):
    print(n)  # 1, 2, 3, 4, 5
# ── Multiple yields in one function ───────────────────────────────────────────
def three_greetings():
    yield "Hello"  # suspend after yielding
    yield "Hi"     # suspend after yielding
    yield "Hey"    # last value
greets = list(three_greetings()) # ["Hello", "Hi", "Hey"]
# ── Generator with return — signals StopIteration ────────────────────────────
def limited_range(n: int):
    for i in range(n):
        if i == 3:
            return  # StopIteration — generator done
        yield i
list(limited_range(10)) # [0, 1, 2] — stops at return
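Generators need not ever return: an infinite generator keeps producing values, and the caller decides how many to consume. A minimal sketch using itertools.islice to take a bounded slice:

```python
from itertools import islice

def counter(n=0):
    """Infinite generator: never raises StopIteration on its own."""
    while True:
        yield n
        n += 1

# islice stops consuming after 10 values; the generator is simply abandoned
first_ten = list(islice(counter(), 10))
print(first_ten)  # [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
```

Calling list(counter()) directly would hang forever, so always pair an infinite generator with something that bounds consumption.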
A function becomes a generator as soon as its body contains a yield statement. The presence of yield transforms the function into a generator factory — calling it returns a generator object without running any code in the body. The body runs lazily as you call next(). This is fundamentally different from returning a list — a list is computed all at once; a generator computes each value only when requested.
Because values are produced on demand, a generator can be infinite: def counter(n=0): while True: yield n; n += 1 never runs out of values. You consume as many as you need with islice(counter(), 10) (from itertools) to take the first 10. In FastAPI, this pattern appears in streaming responses where you yield chunks of a large file or database result set without loading everything into memory.
A generator can be consumed only once. If you need the values more than once, collect them up front (items = list(my_generator())) or call the generator function again to create a fresh generator object. Never store a generator in a module-level variable and share it across requests — it will be exhausted after the first request processes it.
yield from — Delegating to Sub-generators
# yield from — delegate iteration to another iterable/generator
def flatten(nested):
    """Flatten a nested list of any depth."""
    for item in nested:
        if isinstance(item, list):
            yield from flatten(item)  # recursively delegate
        else:
            yield item
result = list(flatten([1, [2, [3, 4]], [5, 6]]))
# [1, 2, 3, 4, 5, 6]
# yield from with multiple generators
def combined_data():
    yield from [1, 2, 3]                  # yield from a list
    yield from range(4, 7)                # yield from a range
    yield from (x**2 for x in [7, 8, 9])  # yield from a generator expr
list(combined_data()) # [1, 2, 3, 4, 5, 6, 49, 64, 81]
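Besides forwarding values, yield from also captures the sub-generator's return value, which travels in the StopIteration raised when the sub-generator finishes. A small illustrative sketch (the helper names here are made up):

```python
def subtotal(values):
    total = 0
    for v in values:
        yield v          # forwarded to the outer caller by yield from
        total += v
    return total         # becomes the value of the yield from expression

def report(values):
    total = yield from subtotal(values)  # delegate, then capture the return value
    yield f"total={total}"

print(list(report([1, 2, 3])))  # [1, 2, 3, 'total=6']
```

This is the idiomatic way to get a result out of a sub-generator; as shown under Common Mistakes below, a plain caller never sees a generator's return value directly.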
# ── FastAPI streaming use case ─────────────────────────────────────────────────
def stream_csv_rows(rows: list[dict], chunk_size: int = 100):
    """Yield CSV rows in chunks for streaming response."""
    import csv, io
    # Yield header
    header_buf = io.StringIO()
    writer = csv.DictWriter(header_buf, fieldnames=rows[0].keys())
    writer.writeheader()
    yield header_buf.getvalue()
    # Yield data in chunks
    for i in range(0, len(rows), chunk_size):
        chunk = rows[i:i + chunk_size]
        buf = io.StringIO()
        writer = csv.DictWriter(buf, fieldnames=rows[0].keys())
        writer.writerows(chunk)
        yield buf.getvalue()
# FastAPI endpoint
from fastapi import FastAPI
from fastapi.responses import StreamingResponse
app = FastAPI()
@app.get("/export/csv")
async def export_csv():
    rows = get_all_users()  # large list from DB
    return StreamingResponse(
        stream_csv_rows(rows),
        media_type="text/csv",
        headers={"Content-Disposition": 'attachment; filename="users.csv"'},
    )
Generator Send — Two-Way Communication
# Generators can receive values via send() — makes them co-routines
def accumulator():
    """Accumulate values sent to the generator."""
    total = 0
    while True:
        value = yield total  # yield the running total, receive next value
        if value is None:
            break
        total += value
gen = accumulator()
next(gen)  # must prime the generator (advance to first yield) — returns 0
gen.send(10) # sends 10, returns 10
gen.send(20) # sends 20, returns 30
gen.send(5) # sends 5, returns 35
gen.close() # GeneratorExit raised inside the generator
# Note: async/await (Chapter 11) is built on this generator send mechanism
# native coroutines evolved from generator-based coroutines driven with send()
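To see the connection, note that a coroutine object created by async def exposes the same send() protocol, and you can drive a trivial one by hand (normally an event loop does this for you):

```python
async def add(a: int, b: int) -> int:
    return a + b

coro = add(2, 3)  # creates a coroutine object; no code has run yet
try:
    coro.send(None)  # start the coroutine, just like priming a generator
except StopIteration as done:
    print(done.value)  # 5: the return value travels in StopIteration
```

This is a sketch of the mechanism only; in real code you would await the coroutine or hand it to asyncio.run().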
Common Mistakes
Mistake 1 — Calling a generator function and expecting a value
❌ Wrong — calling the function just creates the generator object:
def gen_numbers():
    yield 1
    yield 2
result = gen_numbers() # NOT [1, 2] — it's a generator object!
print(result) # <generator object gen_numbers at 0x...>
✅ Correct — iterate or convert explicitly:
result = list(gen_numbers()) # [1, 2] ✓
for n in gen_numbers(): ... # iterate ✓
Mistake 2 — Using a generator after exhaustion
❌ Wrong — reusing an exhausted generator:
numbers = (x for x in range(5))
total = sum(numbers) # exhausts the generator
count = sum(1 for _ in numbers) # 0 — generator already exhausted!
✅ Correct — recreate the generator or convert to list:
numbers = list(range(5)) # list can be iterated multiple times ✓
total = sum(numbers)
count = len(numbers)
Mistake 3 — Confusing return with yield in a generator
❌ Wrong — thinking return gives a value to the caller:
def gen():
    yield 1
    return 42  # this does NOT add 42 to the generator output!
               # it raises StopIteration(42) — value goes to the exception
list(gen()) # [1] — 42 is NOT in the output
✅ Correct — use yield to produce values, return to stop the generator:
def gen():
    yield 1
    yield 42  # ✓ adds 42 to the sequence
list(gen()) # [1, 42]
Quick Reference
| Pattern | Code |
|---|---|
| Generator function | def gen(): yield value |
| Create generator | g = gen() — no code runs yet |
| Get next value | next(g) |
| Iterate all | for v in gen(): ... |
| Collect all | list(gen()) |
| Delegate to iterable | yield from other_iterable |
| Stop generator | return (raises StopIteration) |
| Infinite generator | while True: yield val; val += 1 |