Most Python developers learn generators as one-way pipes: the function yields a value, the caller receives it, repeat until exhaustion. That mental model is incomplete. Since Python 2.5, generators have had a second channel — a way to push data back in while execution is paused. That channel is .send(), and understanding it changes how you think about what a generator actually is.
The confusion around .send() almost always has the same root: people treat yield as a statement that pauses and emits, full stop. But yield in Python is an expression — it both emits a value outward and resolves to a value inward, and .send() is what controls that inward resolution. This article explains exactly how that works, where it came from, what goes wrong when you skip the priming step, and where the pattern still belongs today.
Generators Were Always One-Way — Until Python 2.5
Python introduced generators in version 2.2 via PEP 255. The design was clean and deliberate: a generator function suspends at a yield statement, returns the yielded value to the caller, and resumes from that exact point when next() is called again. All local state — variable bindings, the instruction pointer, the internal stack — is preserved across the suspension. This made generators excellent lazy iterators: you could produce sequences of values on demand without building them all in memory at once.
What generators could not do was receive input after they started. Every call to next() resumed execution, but brought nothing with it. The generator could only observe its own internal state and whatever was captured in its closure. If you wanted it to behave differently based on external input, you had to encode that into the arguments passed at construction time. The communication was strictly one direction: generator to caller.
That limitation became a real problem as developers tried to use generators for more sophisticated tasks — simulations, event-driven pipelines, cooperative multitasking. The PEP 342 authors, Guido van Rossum and Phillip J. Eby, diagnosed the gap precisely: generators could pause and produce values, but had no way to accept values or exceptions when execution resumed — making them nearly, but not quite, coroutines.
The fix required two linked changes: making yield an expression rather than a pure statement, and adding a .send() method to the generator object. Both shipped together in Python 2.5 in 2006. The cumulative effect, as PEP 342 described it, was to turn generators from one-way producers of information into both producers and consumers — and in doing so, to turn them into proper coroutines.
How yield Became an Expression
Before Python 2.5, yield was a statement. You wrote yield some_value and that was the whole story: the value was emitted and execution paused. The statement had no return value because there was nothing to return — no channel existed through which the caller could inject anything.
PEP 342 changed yield into an expression. This is a meaningful distinction. An expression in Python produces a value. Statements do not. When yield became an expression, the syntax received = yield emitted_value became legal and meaningful: emitted_value is sent out to the caller, and whatever the caller injects back becomes the value that yield evaluates to — stored in received.
The Python 2.5 release notes for PEP 342 capture the exact rule: the value passed to .send() "becomes the result of the current yield expression." If no value is sent — if next() is called instead — the yield expression evaluates to None. That asymmetry is important and is the source of most confusion about .send().
The PEP 342 authors recommend always wrapping yield expressions in parentheses when using the result: received = (yield emitted_value). The parens are not always syntactically required — received = yield emitted_value is valid at the top level of an assignment — but they make the expression boundary explicit and prevent subtle parsing errors when the yield result is used inside a larger expression.
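To see where the parentheses actually matter, here is a small sketch (averager is an illustrative name, not taken from PEP 342). The yield sits inside an augmented assignment, a position where the parentheses are mandatory:

```python
def averager():
    """Emit the running average; receive the next number via .send()."""
    total = 0.0
    count = 0
    while True:
        # yield inside a larger expression must be parenthesized:
        # `total += yield ...` without parens is a SyntaxError
        total += (yield total / count if count else 0.0)
        count += 1

avg = averager()
print(next(avg))      # prime: emits 0.0
print(avg.send(10))   # 10.0
print(avg.send(20))   # 15.0
```

The priming call advances execution into the middle of the `+=` line, suspending at the parenthesized yield; each subsequent .send() completes that pending addition.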
The practical consequence of yield-as-expression is that a single yield point in your generator now does double duty simultaneously. It emits a value on the way out and receives a value on the way in. Those two actions happen at the same suspension point, which is why the timing of .send() is so important: you can only send into a yield that is currently waiting. You cannot send into thin air.
The Mechanics of .send() Step by Step
When you call gen.send(value), the following sequence happens in order inside CPython:
First, the generator's execution is resumed from the point where it is currently suspended — at a yield expression. Second, the value argument you passed becomes the result of that yield expression inside the generator's frame. Third, execution continues forward inside the generator until the next yield expression is reached. Fourth, the value following that next yield is returned to the caller as the return value of .send(). If no further yield is reached and the generator function returns, StopIteration is raised.
Here is a minimal example that makes each of these steps visible:
def two_way():
    print("Generator started")
    received = (yield "first yield")    # emits "first yield", waits
    print(f"Received: {received}")
    received2 = (yield "second yield")  # emits "second yield", waits
    print(f"Received again: {received2}")
    # no more yields — StopIteration will be raised next

gen = two_way()

# Prime: advance to the first yield
out1 = next(gen)          # prints "Generator started"
print(out1)               # prints "first yield"

# Send a value in; generator resumes, "Hello" is the value of the yield expression
out2 = gen.send("Hello")  # prints "Received: Hello"
print(out2)               # prints "second yield"

# Send again
try:
    gen.send("World")     # prints "Received again: World"
except StopIteration:
    print("Generator exhausted")
Trace through this carefully. The first next(gen) call advances execution to the first yield "first yield". At that point the generator is suspended: it has emitted "first yield" and is waiting at the yield expression. When gen.send("Hello") is called, "Hello" is injected as the value of that waiting yield expression — so received gets the string "Hello". Execution continues until the second yield "second yield", which emits "second yield" outward to the caller (the return value of .send()). The generator is now suspended again at the second yield. The final .send("World") injects "World" into that expression, prints the message, then the function ends with no further yields, raising StopIteration.
The return value of .send() is always the value produced by the next yield the generator hits after resuming — not the value you sent in. The value you send in goes to the current yield expression. Keeping those two directions distinct in your mental model is the key to understanding .send() without confusion.
.send() is a method on generator objects specifically — objects produced by calling a generator function. It is not available on arbitrary iterables: a list, a range, or a custom class implementing __iter__ has no .send(). Generator expressions like (x*2 for x in range(5)) are a partial exception. Python compiles them into an internal generator function, so the resulting object does expose .send() — but there is no authored name = (yield ...) to capture the sent value, so anything you send after priming is silently discarded, and sending a non-None value before priming raises TypeError just as it would on any just-started generator. If you need two-way communication, the object must be an explicitly authored generator function with a yield expression designed to capture the sent value.
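Both halves of that behavior are easy to confirm at the REPL; a short illustrative sketch:

```python
g = (x * 2 for x in range(3))

# A fresh generator expression rejects non-None sends,
# like any just-started generator:
try:
    g.send(10)
except TypeError as e:
    print(e)   # can't send non-None value to a just-started generator

# After priming, .send() works mechanically, but the sent value is
# discarded — no authored yield expression captures it:
print(next(g))     # 0
print(g.send(99))  # 2 — the 99 went nowhere
print(g.send(99))  # 4
```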
Why You Must Prime a Generator First
The rule is firm: you cannot call .send(non_None_value) on a generator that has not yet been advanced to its first yield. Attempting to do so raises TypeError: can't send non-None value to a just-started generator.
The reason is structural. When a generator object is first created, its code has not executed at all. There is no yield expression currently suspended and waiting to receive anything. The .send() mechanism works by injecting a value into the current yield expression — but if the generator has never run, there is no current yield expression. Sending a non-None value into that void is an error.
def my_gen():
    value = (yield "ready")
    yield value * 2

gen = my_gen()

# This raises TypeError — generator hasn't started yet
# gen.send(10)  # TypeError: can't send non-None value to a just-started generator

# Correct approach: prime first
first = next(gen)      # advances to first yield, returns "ready"
print(first)           # "ready"
result = gen.send(10)  # injects 10, generator yields 20
print(result)          # 20
Priming means advancing the generator to its first yield so that there is a suspended yield expression ready to receive input. The two standard ways to prime are next(gen) and gen.send(None). They are precisely equivalent: as confirmed by the CPython source and PEP 342, __next__() is implemented as send(None). Sending None is allowed on a fresh generator because None is the defined neutral value — it becomes the value of the yield expression, which is fine as long as your generator checks for it (or you use next() which implies the same thing).
A common pattern for coroutine-style generators is to wrap the generator function in a decorator that auto-primes it — calling next() immediately after construction so callers never have to think about the priming step. PEP 342 itself shows this pattern using a @consumer decorator. It is clean, but make sure any generator you wrap this way is actually designed to receive values from the start, or the first None injected by the priming call will cause a silent bug if the generator's code tries to use the value of the yield expression without checking for None.
What Happens When You .send() Into an Already-Exhausted Generator?
The article has described what happens when a .send() call reaches the end of a generator: StopIteration is raised and the generator is closed. But readers often run into a different, subtler problem — calling .send() again on a generator that is already in the GEN_CLOSED state. The error is the same exception class but a different situation entirely:
def one_shot():
    yield (yield "first")

gen = one_shot()
next(gen)           # prime → yields "first", suspends at the inner yield
gen.send("second")  # injects "second" into inner yield; outer yield emits "second"
                    # returns "second" — generator is still alive, suspended at outer yield
try:
    gen.send("third")   # injects "third" into outer yield; generator returns (no more yields)
except StopIteration:
    print("Generator exhausted")
try:
    gen.send("fourth")  # StopIteration raised immediately — generator already closed
except StopIteration:
    print("Already closed")  # NOT from running out of yields — the frame is gone
Both the third and fourth calls raise StopIteration, but they mean different things. (The second call, gen.send("second"), does not raise at all: the nested yield (yield "first") contains two yield points, so the generator is still suspended after it.) The third call is the normal end-of-iteration signal: the generator ran out of yield expressions and returned. The fourth call raises StopIteration immediately because the generator is already in the GEN_CLOSED state; there is no frame left to resume. A well-written driver loop catches StopIteration on the call that exhausts the generator and stops there. If you see it immediately on what feels like the wrong call, use inspect.getgeneratorstate() to confirm the generator was already closed before you called.
Defensive driver loops should call inspect.getgeneratorstate(gen) before .send() in debug builds, or wrap every .send() in try/except StopIteration and check whether the generator is now closed before deciding whether to restart it.
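A minimal sketch of such a defensive wrapper (safe_send and doubler are hypothetical names, not stdlib APIs):

```python
import inspect

def safe_send(gen, value):
    """Send into gen only when it is suspended; return (ok, payload)."""
    state = inspect.getgeneratorstate(gen)
    if state != inspect.GEN_SUSPENDED:
        # GEN_CREATED (not primed), GEN_RUNNING, or GEN_CLOSED — refuse
        return False, state
    try:
        return True, gen.send(value)
    except StopIteration:
        # this very send exhausted the generator
        return False, inspect.GEN_CLOSED

def doubler():
    result = None
    while True:
        value = (yield result)
        result = value * 2

gen = doubler()
print(safe_send(gen, 5))   # (False, 'GEN_CREATED') — must prime first
next(gen)                  # prime
print(safe_send(gen, 5))   # (True, 10)
```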
next() vs .send() vs .send(None)
These three are closely related but not identical in intent, even when they produce the same mechanical outcome.
| Call | Value injected into yield expression | Allowed on fresh generator? | Typical use |
|---|---|---|---|
| next(gen) | None | Yes | Advancing a generator used as an iterator; priming a coroutine-style generator |
| gen.send(None) | None | Yes | Equivalent to next(gen); used explicitly when you want to signal "no input this cycle" |
| gen.send(value) | value | No — raises TypeError | Passing data into a suspended generator; driving a coroutine with meaningful input |
From CPython's perspective, next(gen) and gen.send(None) are the same operation. The distinction is one of intent and readability. Use next() when you are treating the generator as a plain iterator and do not care about two-way communication. Use .send(None) when you are explicitly driving a coroutine-style generator and want to be clear that you are choosing not to send a value this cycle. Use .send(value) when you have actual data to inject.
The Other Two: .throw() and .close()
PEP 342 added three methods to generators, not one. .send() gets most of the attention, but .throw() and .close() complete the picture — and understanding them clarifies what .send() is actually doing at the frame level.
.throw() — injecting an exception at the yield point
gen.throw(exc) resumes the generator at the currently suspended yield expression, but instead of supplying a value, it raises the specified exception there. Pass an exception instance directly — the older two-argument form gen.throw(ExcType, value) is deprecated since Python 3.12 and will be removed in a future version. The generator can catch the thrown exception with a normal try/except block and continue running, or let it propagate — in which case the exception bubbles out to the caller of .throw().
def resilient():
    while True:
        try:
            value = (yield "waiting")
            print(f"Got: {value}")
        except ValueError as e:
            print(f"Caught inside generator: {e}")
            # generator continues — does not stop

gen = resilient()
next(gen)                           # prime
gen.send("hello")                   # Got: hello
gen.throw(ValueError("bad input"))  # Caught inside generator: bad input
gen.send("back to normal")          # Got: back to normal
The generator catches ValueError internally, handles it, loops back to the yield, and resumes normally. If the generator does not catch the thrown exception, the exception propagates out to the caller and the generator is exhausted. This is the mechanism that asyncio uses internally to cancel tasks — it throws CancelledError into a coroutine's suspended yield point.
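The propagation case is worth seeing once. When the generator has no handler for the thrown exception, the caller of .throw() receives it and the generator is closed (fragile is an illustrative name):

```python
import inspect

def fragile():
    while True:
        value = (yield)
        print(f"Got: {value}")

gen = fragile()
next(gen)          # prime
gen.send("fine")   # Got: fine

try:
    gen.throw(KeyError("boom"))   # no handler inside — propagates out
except KeyError as e:
    print(f"Caller caught: {e}")  # KeyError bubbled up to the caller

# The generator did not survive the uncaught exception:
print(inspect.getgeneratorstate(gen))   # GEN_CLOSED
```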
.close() — shutting a generator down cleanly
gen.close() throws GeneratorExit into the generator at the current yield point. This gives the generator a chance to run any finally blocks and release resources before stopping. If the generator catches GeneratorExit and then yields again, Python raises RuntimeError — a generator that catches GeneratorExit must either return or re-raise it.
def with_cleanup():
    try:
        while True:
            value = (yield "running")
            print(f"Processing: {value}")
    except GeneratorExit:
        print("Generator is shutting down — cleaning up")
        # return here (or just fall off the end) — do NOT yield again

gen = with_cleanup()
next(gen)
gen.send("first")
gen.close()   # prints: Generator is shutting down — cleaning up
# gen is now exhausted — any further .send() raises StopIteration
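The RuntimeError rule is easy to trigger deliberately. Here is a sketch of a generator that illegally yields after catching GeneratorExit (misbehaving is an illustrative name):

```python
def misbehaving():
    try:
        while True:
            _ = (yield "running")
    except GeneratorExit:
        # Swallowing GeneratorExit and yielding again is illegal
        yield "refusing to die"

gen = misbehaving()
next(gen)        # prime
try:
    gen.close()  # throws GeneratorExit; generator yields anyway
except RuntimeError as e:
    print(e)     # generator ignored GeneratorExit
```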
Think of the three PEP 342 methods as three ways to resume a suspended generator: .send(value) resumes with a value, .throw(exc) resumes with an exception, and .close() resumes with GeneratorExit. All three inject something into the same waiting yield expression — the difference is what they inject.
The send/yield Handshake
The interaction between caller and generator through .send() is a back-and-forth handshake in which each party is suspended while the other runs. The caller calls .send(value) and blocks; the generator resumes at its waiting yield with value as the expression's result, runs until the next yield, and emits; the caller then resumes with that emitted value as the return value of .send(). Control ping-pongs between the two frames, one suspension point at a time.
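The handshake becomes visible with a print on each side of every suspension point; a minimal illustrative sketch:

```python
def worker():
    print("  GEN: started, about to yield 'a'")
    got = (yield "a")
    print(f"  GEN: resumed with {got!r}, about to yield 'b'")
    got = (yield "b")
    print(f"  GEN: resumed with {got!r}, returning")

gen = worker()
print("CALLER: priming")
out = next(gen)       # generator runs up to its first yield
print(f"CALLER: got {out!r}")
print("CALLER: sending 1")
out = gen.send(1)     # 1 lands in the first yield; "b" comes back
print(f"CALLER: got {out!r}")
print("CALLER: sending 2")
try:
    gen.send(2)       # 2 lands in the second yield; then StopIteration
except StopIteration:
    print("CALLER: generator finished")
```

Reading the interleaved output top to bottom shows exactly when each frame is running and when it is suspended.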
Real Patterns: Accumulators, State Machines, Pipelines
The value of .send() becomes concrete when you look at patterns where two-way communication simplifies otherwise awkward code.
Running Accumulator
A generator that maintains a running total is a classic illustration. Without .send(), you would need a class with mutable state or a closure with a nonlocal variable. With .send(), the generator's own frame is the state container:
def accumulator():
    total = 0
    while True:
        value = (yield total)   # emit current total; receive next addend
        if value is None:
            break
        total += value

acc = accumulator()
next(acc)            # prime: advances to first yield, emits 0
print(acc.send(5))   # total = 5, emits 5
print(acc.send(10))  # total = 15, emits 15
print(acc.send(3))   # total = 18, emits 18
Each call to .send() adds a number to the running total and immediately returns the updated total. The generator carries the state between calls without any external variable — exactly the kind of two-way pattern PEP 342 was designed to enable.
Resettable State Machine
State machines are another natural fit. The generator's local variables are the machine's state; .send() is the event input; the yielded value is the output or the new state label:
def traffic_light():
    states = {"red": "green", "green": "yellow", "yellow": "red"}
    current = "red"
    while True:
        override = (yield current)
        if override and override in states:
            current = override          # forced transition
        else:
            current = states[current]   # automatic cycle

light = traffic_light()
print(next(light))        # "red" (auto-advance)
print(light.send(None))   # "green" (auto-advance)
print(light.send("red"))  # "red" (forced override)
print(light.send(None))   # "green" (auto-advance from red)
Processing Pipeline
PEP 342 specifically motivated .send() with pipeline and consumer patterns: multiple coroutine-style generators can be chained so that each one receives data from upstream and passes processed results downstream, all using (yield) as the intake point:
def printer():
    """A simple sink: receives values and prints them."""
    while True:
        item = (yield)
        print(f"Processed: {item}")

def uppercaser(downstream):
    """Transforms input and forwards to downstream consumer."""
    while True:
        item = (yield)
        downstream.send(item.upper())

# Wire up the pipeline
sink = printer()
next(sink)
pipe = uppercaser(sink)
next(pipe)

pipe.send("hello")  # prints: Processed: HELLO
pipe.send("world")  # prints: Processed: WORLD
Each stage in the pipeline uses (yield) with no emitted value — it only receives. The yield expression here evaluates to whatever was sent in; nothing is emitted outward (the caller's .send() would return None). This is a valid and common pattern for pure consumer generators.
The Auto-Prime Decorator
Any generator designed to receive values with .send() must be primed before use. When you have many such generators in a codebase, the priming call scattered at every construction site is noise. PEP 342 itself demonstrates a @consumer decorator that handles priming automatically:
import functools

def consumer(func):
    """Decorator that auto-primes a generator on construction."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        gen = func(*args, **kwargs)
        next(gen)   # advance to the first yield
        return gen
    return wrapper

@consumer
def logger(prefix):
    while True:
        message = (yield)
        print(f"[{prefix}] {message}")

# No manual priming needed — the decorator handles it
log = logger("INFO")
log.send("Server started")     # [INFO] Server started
log.send("Request received")   # [INFO] Request received
The decorator replaces the generator function with a wrapper that calls next() immediately after construction and returns the already-primed generator. Callers see a clean API: construct, then send. One caution applies: only use this decorator on generators that are genuinely designed to receive input from the first iteration. If the generator yields a meaningful value at the first yield — one the caller is supposed to see — the decorator silently discards it.
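Here is the pitfall in miniature, with the decorator repeated so the sketch runs on its own (countdown is an illustrative name). The starting value emitted at the first yield is consumed by the priming call and never reaches the caller:

```python
import functools

def consumer(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        gen = func(*args, **kwargs)
        next(gen)   # priming call — its return value is thrown away
        return gen
    return wrapper

@consumer
def countdown(n):
    """The first yield emits the starting value — which the caller never sees."""
    while n > 0:
        _ = (yield n)
        n -= 1

gen = countdown(3)
# The priming call inside the decorator already consumed the 3.
print(gen.send(None))   # 2 — not 3
```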
yield from and .send() Delegation
PEP 380 (Python 3.3) introduced yield from, which delegates to a subgenerator. A key part of that delegation is that .send() calls on the outer generator are automatically forwarded to the inner one. You do not need to manually wire them:
def inner():
    x = (yield "inner waiting")
    print(f"Inner received: {x}")
    y = (yield "inner waiting again")
    print(f"Inner received: {y}")

def outer():
    print("Outer: delegating to inner")
    yield from inner()
    print("Outer: inner exhausted, continuing")

gen = outer()
print(next(gen))          # Outer: delegating to inner
                          # inner waiting
print(gen.send("alpha"))  # Inner received: alpha
                          # inner waiting again
try:
    gen.send("beta")      # Inner received: beta
                          # Outer: inner exhausted, continuing
except StopIteration:
    pass
The gen.send("alpha") call on the outer generator is transparently forwarded by yield from directly into the inner generator's suspended yield expression. The outer generator never sees the values — it simply acts as a transparent conduit. The same forwarding applies to .throw() and .close(). This is how Python's asyncio task machinery chains coroutines: await compiles to yield from at the bytecode level, so every await in an async function is a transparent .send() delegation chain all the way down to the event loop.
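The same transparency holds for .throw(). In this sketch an exception thrown at the outer generator lands inside the subgenerator's try block (the inner and outer here are illustrative, unrelated to the example above):

```python
def inner():
    try:
        while True:
            value = (yield)
            print(f"inner got {value!r}")
    except ValueError as e:
        print(f"inner caught {e}")
        # falling off the end returns; delegation finishes and outer resumes

def outer():
    yield from inner()
    yield "outer back in control"

gen = outer()
next(gen)                 # prime — suspended at inner's (yield)
gen.send("data")          # forwarded through yield from into inner
out = gen.throw(ValueError("stop"))   # also forwarded; inner handles it,
                                      # returns, and outer yields again
print(out)                # outer back in control
```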
What a Generator's return Value Actually Does
The article has shown that when a generator runs out of yield expressions, StopIteration is raised. What it has not yet covered is that a return value statement inside a generator sets StopIteration.value — and that yield from captures this value and makes it available to the outer generator as the result of the entire delegation.
This is the missing link between .send(), subgenerator delegation, and bidirectional communication across multiple generator layers:
def inner_worker():
    total = 0
    while True:
        value = (yield)
        if value is None:
            return total   # return value becomes StopIteration.value
        total += value

def outer():
    # yield from captures the return value of inner_worker:
    # when inner_worker returns, result gets that value
    result = yield from inner_worker()
    yield f"Final total: {result}"

gen = outer()
next(gen)   # prime — advances into inner_worker's first yield
gen.send(10)
gen.send(25)
gen.send(7)

# Sending None signals inner_worker to return its total;
# yield from captures the return value and assigns it to result,
# then outer hits its own yield
print(gen.send(None))   # Final total: 42
Three things happen here that are easy to miss. First, return total inside a generator does not immediately propagate as an unhandled exception — it sets StopIteration.value to total and exits the generator normally. Second, yield from specifically intercepts that StopIteration, extracts its .value, and makes it the result of the yield from expression on the left-hand side — so result = yield from inner_worker() gives outer the computed total without any extra plumbing. Third, if you were driving this without yield from — calling .send() on the inner generator directly — you would need to catch StopIteration yourself and read e.value to retrieve the return value:
# Driving inner_worker manually — no yield from
worker = inner_worker()
next(worker)
worker.send(10)
worker.send(25)
try:
    worker.send(None)   # triggers return inside generator
except StopIteration as e:
    print(e.value)      # 35 — the return value is on StopIteration.value
This pattern matters in practice whenever you write generator-based protocols or implement a custom scheduler. The return value of a subgenerator is how it communicates its final result back to its caller — and yield from is what makes that communication automatic rather than requiring manual exception handling at every delegation boundary.
Where .send() Fits Now That async/await Exists
Python 3.5 introduced native coroutines via PEP 492, with the async def and await syntax. One of PEP 492's explicit motivations was the confusion between generator-based coroutines and plain generators: they shared syntax, which made it difficult to tell at a glance whether a function was intended to be driven with next() or as a coroutine. Native coroutines removed that ambiguity by creating a distinct type with its own syntax.
PEP 492 (Yury Selivanov) noted that distinguishing coroutines from regular generators was a persistent source of confusion, particularly for developers newer to Python, given that both shared the same syntax.
As Luciano Ramalho explains in Fluent Python (O'Reilly), the .send() infrastructure arrived with PEP 342 in Python 2.5, which was when yield became a proper expression and generators gained the ability to function as coroutines. Native async/await coroutines build on the same underlying mechanism — coroutine objects still expose .send(), .throw(), and .close() — but the event loop calls those methods internally, and application code never calls .send() directly.
This means .send() on generators remains relevant in several specific situations today:
It is appropriate when you are writing a stateful generator that needs two-way communication but does not need to participate in an async event loop. The accumulator and state machine patterns above are examples where async/await would be unnecessary overhead. It is also appropriate when you are working at a low level with coroutine objects — writing a custom event loop, implementing a trampoline scheduler, or building a testing harness that drives coroutines directly. And it appears in the implementation of yield from (PEP 380, Python 3.3), which internally forwards .send() calls down to subgenerators through the delegation chain.
For ordinary asynchronous I/O and task coordination, async/await is the right tool. For synchronous stateful computation with bidirectional communication, .send() on a plain generator is still clean and precise.
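As a taste of the low-level scheduler case, here is a minimal round-robin trampoline that drives several generators with .send(None) until all are exhausted (task and round_robin are hypothetical names, not stdlib APIs):

```python
from collections import deque

def task(name, steps):
    """A toy cooperative task that yields control at every step."""
    for i in range(steps):
        yield f"{name} step {i}"

def round_robin(*gens):
    """Drive several generators fairly, collecting everything they yield."""
    ready = deque()
    outputs = []
    for g in gens:
        outputs.append(next(g))   # prime each task to its first yield
        ready.append(g)
    while ready:
        g = ready.popleft()
        try:
            outputs.append(g.send(None))   # resume; no input this cycle
            ready.append(g)                # still alive — requeue it
        except StopIteration:
            pass                           # task finished — drop it
    return outputs

print(round_robin(task("A", 2), task("B", 1)))
# ['A step 0', 'B step 0', 'A step 1']
```

This is the skeleton of what an event loop does with coroutine objects: resume each one with .send(), collect what comes back, and drop it when StopIteration signals completion.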
Since Python 3.7 (via PEP 479), any StopIteration raised inside a generator — including one triggered accidentally by an unguarded next() call several frames deep — is converted to a RuntimeError instead of silently terminating the iteration. This change affects code that uses .send() in pipelines: if a downstream generator is exhausted and your pipeline does not handle it explicitly, you will now get a visible RuntimeError rather than a silent stop. Always wrap pipeline termination in explicit try/except StopIteration blocks or use return inside the generator to signal exhaustion.
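The conversion is straightforward to demonstrate. Here is a sketch of a pipeline stage with an unguarded next() on its upstream iterator (leaky_pipeline is an illustrative name):

```python
def leaky_pipeline(upstream):
    while True:
        # An unguarded next() raises StopIteration when upstream runs dry;
        # since PEP 479 that surfaces at the boundary as RuntimeError
        item = next(upstream)
        yield item * 2

gen = leaky_pipeline(iter([1, 2]))
print(next(gen))   # 2
print(next(gen))   # 4
try:
    next(gen)      # upstream exhausted → StopIteration inside the generator
except RuntimeError as e:
    print(f"PEP 479 converted it: {e}")
```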
Inspecting Generator State
When a .send() call misbehaves — wrong value comes back, unexpected TypeError, premature StopIteration — the first question is usually: where is this generator right now? Python exposes that through the inspect module and a few attributes on the generator object itself.
import inspect

def my_gen():
    yield 1
    yield 2

gen = my_gen()
print(inspect.getgeneratorstate(gen))   # GEN_CREATED — never started

next(gen)
print(inspect.getgeneratorstate(gen))   # GEN_SUSPENDED — at a yield

# exhaust it
list(gen)
print(inspect.getgeneratorstate(gen))   # GEN_CLOSED — no more yields

gen2 = my_gen()
gen2.close()
print(inspect.getgeneratorstate(gen2))  # GEN_CLOSED — explicitly closed
The four states are GEN_CREATED (constructed, never advanced — this is why .send(non_None) fails here), GEN_RUNNING (currently executing, only visible from inside the generator itself), GEN_SUSPENDED (paused at a yield, ready to receive .send()), and GEN_CLOSED (exhausted or closed).
The generator's current local variables are accessible via gen.gi_frame.f_locals when the state is GEN_SUSPENDED. Once the generator is closed, gi_frame is None. This is useful when debugging a long-running stateful generator to verify what the accumulated state looks like without disrupting execution:
def accumulator():
    total = 0
    while True:
        value = (yield total)
        if value is None:
            break
        total += value

acc = accumulator()
next(acc)
acc.send(10)
acc.send(25)

# Peek at internal state without consuming the generator
print(acc.gi_frame.f_locals)   # {'total': 35, 'value': 25}
Accessing gi_frame.f_locals is useful for debugging but is considered an implementation detail of CPython. It is not guaranteed to be available or accurate in other Python implementations (PyPy, Jython, MicroPython). Do not rely on it in production logic — only use it in debugging and diagnostic tools.
Key Takeaways
- yield is an expression, not just a statement. Since Python 2.5 (PEP 342), yield both emits a value outward and evaluates to a value inward. The inward value is what .send() supplies.
- .send() is specific to generator objects. It is not available on lists, ranges, or custom classes implementing __iter__, and generator expressions have no authored yield expression to capture a sent value. Two-way communication requires an explicitly authored generator function with a yield expression designed to capture the sent value.
- .send(value) resumes the generator and injects value into the currently suspended yield expression. The return value of .send() is the value produced by the next yield the generator reaches after resuming — not the value you sent in.
- You must prime a generator before sending a non-None value. Call next(gen) or gen.send(None) first. Calling .send(value) on a fresh generator raises TypeError; calling it on an already-exhausted generator raises StopIteration immediately — a different error for a different situation.
- next(gen) and gen.send(None) are identical at the CPython level. Use next() when treating a generator as an iterator; use .send(None) when explicitly driving a coroutine and signalling "no input this cycle."
- PEP 342 added three methods, not one. .send(value) resumes with a value, .throw(exc) resumes by raising an exception at the yield point, and .close() resumes by throwing GeneratorExit. All three inject something into the same waiting yield expression.
- A generator's return value becomes StopIteration.value. yield from captures this automatically and assigns it as the result of the delegation expression. Without yield from, callers must catch StopIteration and read e.value manually.
- yield from forwards .send() calls transparently. Any .send(), .throw(), or .close() call on an outer generator is automatically delegated to the inner subgenerator. This is the mechanism that makes await work in async functions.
- .send() is still the right tool for synchronous stateful generators. For async I/O, prefer async/await. For in-process two-way state machines, accumulators, and pipelines that do not need an event loop, .send() on a plain generator remains a clean, precise solution. Use inspect.getgeneratorstate() and gi_frame.f_locals to debug when needed.
The generator protocol in Python has always been richer than the basic for loop version of it suggests. .send() is the part that turns a generator from a one-directional sequence producer into something closer to a function that can be paused, handed a value, and then continued — a pattern that underpins everything from simple stateful counters to the coroutine machinery that powers Python's entire async ecosystem.
Frequently Asked Questions
What does generator.send() do in Python?
generator.send(value) resumes the generator from where it is suspended and injects value into the currently waiting yield expression. The yield expression inside the generator evaluates to that value. The return value of .send() is whatever the generator yields next after resuming. If the generator reaches its end without another yield, StopIteration is raised.
Why do you have to prime a generator before calling .send()?
When a generator object is first created, its code has not executed at all — there is no yield expression currently suspended and waiting to receive a value. Calling .send(non_None_value) on a fresh generator raises TypeError because there is no active yield expression to inject into. You must first call next(gen) or gen.send(None) to advance execution to the first yield, creating a suspended yield expression that can then receive input.
What is the difference between next() and gen.send(None)?
At the CPython level, next(gen) and gen.send(None) are identical operations — __next__() is implemented as send(None). Both inject None as the value of the current yield expression and advance the generator to the next yield. The distinction is one of intent: use next() when treating a generator as a plain iterator, and use .send(None) when explicitly driving a coroutine-style generator to signal that no input is being provided this cycle.
When should you use .send() instead of async/await?
Use .send() on a plain generator when you need synchronous, stateful two-way communication without an async event loop — for example, running accumulators, state machines, or in-process pipelines. Use async/await for asynchronous I/O and task coordination. Native coroutines (async def) build on the same underlying .send() mechanism, but the event loop calls .send() internally so application code never needs to call it directly.
What PEP introduced generator.send()?
PEP 342, authored by Guido van Rossum and Phillip J. Eby, introduced generator.send() in Python 2.5 (released 2006). The PEP also made yield an expression rather than a pure statement, and added .throw() and .close() to the generator API. These changes transformed Python generators from one-way value producers into proper coroutines capable of both producing and consuming values.
What does generator.throw() do?
gen.throw(exc) resumes the generator at the currently suspended yield expression but raises the specified exception there instead of supplying a value. Pass an exception instance — e.g. gen.throw(ValueError("bad input")). The older two-argument form gen.throw(ExcType, value) is deprecated since Python 3.12. The generator can catch the exception with a normal try/except block and continue running, or let it propagate — in which case the exception bubbles out to the caller of .throw(). This is the same mechanism asyncio uses internally to cancel tasks by throwing CancelledError into a suspended coroutine.
Does yield from forward .send() calls to the subgenerator?
Yes. When a generator uses yield from to delegate to a subgenerator, any .send(), .throw(), or .close() call on the outer generator is automatically forwarded to the inner one. The outer generator acts as a transparent conduit. This delegation is how the await keyword works in async functions — at the bytecode level, await compiles to yield from, so every await is a transparent .send() chain down to the event loop.
What does a generator's return statement do, and how does it relate to StopIteration?
A return value statement inside a generator sets StopIteration.value when the generator finishes. If you drive the generator manually with .send(), you must catch StopIteration and read e.value to retrieve it. If you use yield from, Python captures that StopIteration.value automatically and assigns it as the result of the yield from expression — so result = yield from subgen() gives the outer generator the subgenerator's return value without any manual exception handling.
What error do you get when you call .send() on an already-exhausted generator?
Calling .send() on a generator that is already in the GEN_CLOSED state raises StopIteration immediately, before any code runs. This looks identical to the StopIteration raised when a generator naturally runs out of yield expressions, but it means something different: the generator has no frame left to resume. The fix is to catch StopIteration on the call that exhausts the generator and stop sending — or check inspect.getgeneratorstate(gen) before sending to confirm the generator is in GEN_SUSPENDED.