Generators are often taught as one-way streets: you call next(), they yield a value, and you repeat until they're exhausted. But there's a second, far more powerful mode of operation that many Python developers never fully explore. The send() method lets you push values into a generator at the exact point where it's paused, transforming a simple data producer into a two-way communication channel. This single method — introduced in Python 2.5 — laid the groundwork for coroutines, contextlib's @contextmanager decorator, and ultimately the entire async/await ecosystem.
This article traces send() from the PEP that created it through its mechanics, its practical applications, its role in yield from delegation, and the recent Python 3.13 change that gave its sibling method close() new powers. Every code example runs, every PEP reference is real, and nothing here is surface-level.
Before send(): Generators Were Read-Only
To understand why send() matters, you need to understand what generators couldn't do before it existed.
PEP 255, authored by Neil Schemenauer, Tim Peters, and Magnus Lie Hetland, introduced generators to Python in version 2.2. The PEP's motivation was elegant: producer functions that maintain state between calls were painful to write with callbacks, and the existing alternatives — threads, Stackless Python, or manually managing state in objects — were either heavy, non-portable, or awkward.
Generators solved this by letting a function yield a value and suspend its execution, preserving all local variables on the stack frame until the next next() call resumed it. But in PEP 255's design, yield was strictly a statement. It sent values out of the generator. There was no mechanism to send values back in.
As the "What's New in Python 2.5" documentation put it plainly: generators only produced output. (They appeared in 2.2 behind a __future__ import and were enabled by default in 2.3.) Once a generator's code was invoked to create an iterator, there was no way to pass any new information into the function when its execution resumed.
Developers who needed to pass data into a running generator had to resort to workarounds: mutating a shared global variable, passing in a mutable container that the caller would modify between next() calls, or redesigning the generator as a class with state attributes. All of these were fragile and obscured the control flow.
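To make the awkwardness concrete, here is a sketch of the mutable-container workaround. The inbox name and the averaging task are illustrative, not from any historical codebase:

```python
# Pre-2.5 workaround: share a mutable container with the generator.
# The caller mutates `inbox` between next() calls to "send" a value.
def averager(inbox):
    total = 0.0
    count = 0
    while True:
        yield (total / count if count else None)
        total += inbox[0]   # read whatever the caller left for us
        count += 1

inbox = [0]
avg = averager(inbox)
next(avg)            # advance to the first yield
inbox[0] = 10        # "send" a value by mutating shared state
print(next(avg))     # 10.0
inbox[0] = 20
print(next(avg))     # 15.0
```

The control flow is invisible at the call site: nothing about next(avg) hints that the generator is about to read inbox, which is exactly the fragility PEP 342 set out to fix.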
PEP 342: The Birth of send()
On May 10, 2005, Guido van Rossum and Phillip J. Eby authored PEP 342, titled "Coroutines via Enhanced Generators." This PEP is the single most important document in the history of Python's generator protocol, and send() is its centerpiece.
The PEP's motivation section identifies the core problem: generators were almost coroutines, but not quite. A true coroutine can both produce and consume values, pausing and resuming at multiple points. Python's generators could pause and produce, but they couldn't consume. PEP 342 set out to close that gap with a minimal set of changes.
The PEP proposed five enhancements, and the first two are directly about send():
First, yield was redefined from a statement to an expression. This meant yield now had a return value — the value that send() would inject. Second, a new send(value) method was added to generator-iterator objects. Calling send(value) resumes the generator and makes value the result of the yield expression where the generator is currently paused.
The remaining three enhancements — throw() for injecting exceptions, close() for graceful shutdown via GeneratorExit, and allowing yield inside try/finally blocks — completed the coroutine toolkit. But send() was the fundamental shift that made generators bidirectional.
PEP 342 was implemented by Phillip J. Eby and shipped with Python 2.5. The PEP also incorporated ideas from two earlier proposals: PEP 288 by Raymond Hettinger (which proposed generator attributes and exceptions) and PEP 325 by Samuele Pedroni (which proposed resource-release support for generators). PEP 342 superseded both.
How send() Works: Mechanics in Detail
The mechanics of send() are precise and sometimes counterintuitive. Here's the formal behavior:
generator.send(value) resumes the generator and "sends" value into it. That value becomes the result of the yield expression at which the generator is currently suspended. The method returns the next value that the generator yields, or raises StopIteration if the generator exits.
Calling send(None) is exactly equivalent to calling next(generator).
There's one critical constraint: you cannot send a non-None value to a just-started generator. Because the generator hasn't executed any code yet, there's no yield expression waiting to receive a value. Attempting this raises a TypeError:
def echo():
    while True:
        received = yield
        print(f"Got: {received}")

gen = echo()
# This raises TypeError:
# can't send non-None value to a just-started generator
gen.send("hello")
You must first "prime" the generator by calling next(gen) or gen.send(None) to advance it to the first yield:
gen = echo()
next(gen) # Prime: advances to the first yield
gen.send("hello") # Got: hello
gen.send("world") # Got: world
This priming requirement is so common that many developers write a decorator to handle it automatically:
from functools import wraps

def primed(gen_func):
    @wraps(gen_func)
    def wrapper(*args, **kwargs):
        gen = gen_func(*args, **kwargs)
        next(gen)          # advance to the first yield
        return gen
    return wrapper

@primed
def echo():
    while True:
        received = yield
        print(f"Got: {received}")

gen = echo()         # Already primed
gen.send("hello")    # Got: hello (no need for next() first)
The @primed decorator pattern is a lightweight, reusable way to eliminate the manual priming step across your codebase. Define it once in a utilities module and apply it to every coroutine-style generator.
yield as Expression: The Key Insight
The conceptual leap that makes send() possible is understanding yield as an expression with a return value, not merely a statement that pauses execution.
When a generator reaches a yield expression, two things happen in sequence. First, the value to the right of yield (if any) is sent out to the caller as the return value of next() or send(). Second, the generator suspends. When execution resumes (via next() or send()), the yield expression evaluates to whatever value was passed into send(), or None if next() was used.
This means a single yield expression serves as both an output port and an input port:
def accumulator():
    total = 0
    while True:
        value = yield total  # yields total OUT, receives value IN
        total += value
acc = accumulator()
print(next(acc)) # 0 (initial total, yield sends it out)
print(acc.send(10)) # 10 (total is now 0 + 10)
print(acc.send(20)) # 30 (total is now 10 + 20)
print(acc.send(5)) # 35 (total is now 30 + 5)
Follow the flow carefully. next(acc) runs the generator until yield total, which yields 0. The generator pauses. Then acc.send(10) resumes execution: the yield total expression evaluates to 10, which is assigned to value. The loop continues, total becomes 10, and the generator hits yield total again, yielding 10 to the caller. Each send() both delivers a value in and retrieves a value out.
"In effect, a yield-expression is like an inverted function call; the argument to yield is in fact returned (yielded) from the currently executing function, and the return value of yield is the argument passed in via send()." — PEP 342
Parenthesization Rules
PEP 342 established specific rules about when yield expressions need parentheses. A yield expression must be parenthesized except when it is the sole top-level expression on the right-hand side of an assignment or the sole argument of a call. So these are all legal:
x = yield 42           # top-level on the right side of an assignment
x = yield              # same, without a value
x = 12 + (yield 42)    # parenthesized when part of a larger expression
x = 12 + (yield)       # same
foo(yield 42)          # legal unparenthesized as the sole argument of a call
foo(yield)             # same
But these are illegal:
x = 12 + yield 42 # SyntaxError: needs parentheses
x = 12 + yield # SyntaxError
foo(yield 42, 12) # SyntaxError
In practice, wrapping yield in parentheses whenever you're capturing its return value is the safest habit. As the Python 2.5 documentation advised: always put parentheses around a yield expression when you're doing something with the returned value.
Pattern 1: Coroutine-Style Data Sinks
The most natural use of send() is to create data sinks — generators that consume values rather than produce them. This inverts the typical generator pattern:
def running_average():
    total = 0.0
    count = 0
    average = None
    while True:
        value = yield average
        total += value
        count += 1
        average = total / count
avg = running_average()
next(avg) # Prime: returns None (initial average)
print(avg.send(10)) # 10.0
print(avg.send(20)) # 15.0
print(avg.send(30)) # 20.0
print(avg.send(5)) # 16.25
This is cleaner than the class-based equivalent because the state (total, count, average) is implicit in the generator's local variables rather than stored as instance attributes. The control flow reads top-to-bottom, and there's no __init__ / method separation.
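For contrast, here is a sketch of what that class-based equivalent might look like; the RunningAverage name is illustrative, and its send() method is named that way only to mirror the generator API:

```python
class RunningAverage:
    """Class-based alternative: state lives in explicit instance
    attributes instead of the generator's local variables."""

    def __init__(self):
        self.total = 0.0
        self.count = 0

    def send(self, value):   # named send() only to mirror the generator version
        self.total += value
        self.count += 1
        return self.total / self.count

avg = RunningAverage()
print(avg.send(10))  # 10.0
print(avg.send(20))  # 15.0
```

The logic is the same, but the state management is spread across __init__ and the method body rather than reading top-to-bottom.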
Pattern 2: The State Machine
send() is particularly powerful for implementing state machines where external input determines transitions:
def traffic_light():
    state = "red"
    while True:
        command = yield state
        if state == "red" and command == "next":
            state = "green"
        elif state == "green" and command == "next":
            state = "yellow"
        elif state == "yellow" and command == "next":
            state = "red"
        elif command == "emergency":
            state = "red"
light = traffic_light()
print(next(light)) # red
print(light.send("next")) # green
print(light.send("next")) # yellow
print(light.send("next")) # red
print(light.send("next")) # green
print(light.send("emergency")) # red
Each send() call advances the state machine by one step, injecting the input event and receiving the resulting state. The entire state machine logic lives in a single function with readable, sequential control flow.
Pattern 3: The @contextmanager Connection
One of the most consequential uses of send() (and its companion methods throw() and close()) is the @contextmanager decorator in contextlib. PEP 343 — "The 'with' Statement" — was designed in tandem with PEP 342, and the @contextmanager decorator depends directly on generator send(), throw(), and close() semantics.
Here's how it works under the hood. When you write:
from contextlib import contextmanager

@contextmanager
def managed_resource():
    print("Acquiring resource")
    resource = acquire()   # acquire()/release() stand in for real setup/teardown
    try:
        yield resource
    finally:
        print("Releasing resource")
        release(resource)
The @contextmanager decorator wraps this generator in a class that implements the context manager protocol. When the with block begins, it calls next() on the generator to advance to the yield, which produces the resource. When the with block ends normally, it calls next() again (effectively send(None)) to resume the generator into the finally block. If the with block raises an exception, it calls throw() to inject that exception at the yield point, allowing the generator's try/except/finally to handle cleanup.
Without PEP 342's send() and throw(), the @contextmanager pattern — one of Python's most elegant abstractions — simply wouldn't work. PEP 342's specification notes explicitly that allowing yield inside try/finally blocks was necessary to implement the with statement described by PEP 343.
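To see the mechanics without contextlib's full error handling, here is a deliberately stripped-down sketch of the wrapper class. The real contextlib._GeneratorContextManager covers many more edge cases, so treat this as illustration, and note that the SimpleContextManager and simple_cm names are invented:

```python
class SimpleContextManager:
    """Stripped-down sketch of what @contextmanager builds."""

    def __init__(self, gen):
        self.gen = gen

    def __enter__(self):
        return next(self.gen)               # run the generator up to its yield

    def __exit__(self, exc_type, exc, tb):
        if exc_type is None:
            try:
                next(self.gen)              # resume into the finally block
            except StopIteration:
                return False
            raise RuntimeError("generator didn't stop")
        try:
            self.gen.throw(exc)             # inject the exception at the yield
        except StopIteration:
            return True                     # generator swallowed the exception
        except BaseException as new_exc:
            if new_exc is exc:
                return False                # generator let it propagate
            raise
        raise RuntimeError("generator didn't stop after throw()")

def simple_cm(gen_func):
    def wrapper(*args, **kwargs):
        return SimpleContextManager(gen_func(*args, **kwargs))
    return wrapper

@simple_cm
def demo():
    print("setup")
    try:
        yield "resource"
    finally:
        print("teardown")

with demo() as r:
    print(r)    # setup / resource / teardown
```

The two halves of the generator, before and after the yield, become __enter__ and __exit__, which is the whole trick.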
PEP 380: yield from and Transparent send() Delegation
In Python 3.3, PEP 380 — "Syntax for Delegating to a Subgenerator," authored by Gregory Ewing and officially accepted by Guido van Rossum on June 26, 2011 — introduced the yield from syntax. This is where send() becomes essential infrastructure rather than just a convenience.
The problem yield from solves is delegation. If you have a generator that needs to delegate to a sub-generator, and you're only yielding values out, a simple loop works:
def delegator():
    for value in sub_generator():
        yield value
But if the caller uses send(), throw(), or close(), that simple loop breaks. The sent values go to the delegating generator, not the sub-generator. You'd need to manually forward every send() and throw() call, handle StopIteration with return values, propagate GeneratorExit, and get all the edge cases right. PEP 380 itself acknowledges this directly: the necessary code is very complicated, and it is tricky to handle all the corner cases correctly.
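To appreciate what yield from absorbs, here is a partial hand-rolled version that forwards only next() and send(); the full PEP 380 expansion must also cover throw(), close(), and GeneratorExit. The counter sub-generator is an illustrative stand-in:

```python
def counter():
    # Illustrative sub-generator: accumulates sent values, returns the total.
    total = 0
    while True:
        value = yield total
        if value is None:
            return total
        total += value

def manual_delegator(sub):
    # Hand-rolled forwarding of next()/send() only.
    try:
        yielded = next(sub)           # prime the sub-generator
        while True:
            sent = yield yielded      # receive what our caller sends
            yielded = sub.send(sent)  # forward it down
    except StopIteration as exc:
        return exc.value              # surface the sub-generator's return value

gen = manual_delegator(counter())
print(next(gen))     # 0
print(gen.send(5))   # 5
print(gen.send(3))   # 8
```

Even this incomplete version is noticeably harder to get right than a one-line yield from, which is precisely PEP 380's point.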
yield from handles all of this transparently:
def sub_gen():
    total = 0
    while True:
        value = yield total
        if value is None:
            return total
        total += value

def delegator():
    result = yield from sub_gen()
    print(f"Sub-generator returned: {result}")
gen = delegator()
print(next(gen)) # 0
print(gen.send(10)) # 10
print(gen.send(20)) # 30
gen.send(None)  # Prints "Sub-generator returned: 30", then raises StopIteration
When the caller calls gen.send(10), yield from transparently forwards the 10 into sub_gen() via its send() method. When sub_gen returns a value, that becomes the value of the yield from expression in delegator. This transparent forwarding of send(), throw(), and close() is the real power of yield from — and it's only meaningful because PEP 342 gave generators send() in the first place.
The async/await Lineage
PEP 342 made generators "usable as simple coroutines" (its own words). This planted the seed that grew into Python's entire asynchronous programming model.
Before async/await existed, frameworks like Twisted, Tornado, and asyncio's precursor (Tulip) used generator-based coroutines extensively. A coroutine would yield a Future or similar object, and the event loop scheduler would send() the result back into the generator when the operation completed:
# Pre-async/await coroutine pattern (simplified)
@coroutine
def fetch_data(url):
    response = yield http_request(url)     # yield a Future, get result via send()
    data = yield parse_response(response)  # same pattern
    return data
PEP 492, introduced in Python 3.5, formalized this pattern with dedicated async def / await syntax. Under the hood, native coroutines still use the same generator-based suspension mechanism, and the send() method remains part of the coroutine object's API. When you await something in an async function, the event loop eventually calls send() on the coroutine to deliver the result.
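You can watch this handoff without an event loop by driving a native coroutine's send() by hand. The types.coroutine decorator is a real stdlib bridge that makes a plain generator awaitable; the suspend and fetch names here are illustrative:

```python
import types

# Minimal awaitable: @types.coroutine makes a generator usable with
# `await` (the pre-3.5 bridge that generator-based asyncio code used).
@types.coroutine
def suspend():
    result = yield "suspended"   # the scheduler's send() lands here
    return result

async def fetch():
    value = await suspend()      # pauses until someone send()s a result
    return value * 2

coro = fetch()
print(coro.send(None))           # "suspended" -- the coroutine pauses
try:
    coro.send(21)                # deliver the "I/O result"
except StopIteration as done:
    print(done.value)            # 42 -- the coroutine's return value
```

This is, in miniature, what an event loop does: collect what the coroutine yields, wait for the result, and send() it back in.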
PEP 525 (Python 3.6) extended this further with asynchronous generators, which support an asend() method — the async equivalent of send(). The lineage is direct and unbroken: send() in PEP 342 enabled generator-based coroutines, which enabled yield from-based coroutines, which inspired native async/await coroutines, which led to async generators with asend().
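A minimal asend() round-trip looks like this. The aecho name is illustrative; priming uses __anext__(), the async analogue of next():

```python
import asyncio

async def aecho():
    received = None
    while True:
        received = yield received   # asend() delivers its argument here

async def main():
    gen = aecho()
    await gen.__anext__()               # prime: advance to the first yield
    print(await gen.asend("hello"))     # hello
    await gen.aclose()

asyncio.run(main())
```

Every step mirrors the synchronous protocol, just awaited: asend() for send(), aclose() for close().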
Python 3.13: close() Gets a Return Value
For nearly two decades, close() was a one-way operation: it threw GeneratorExit into the generator and expected it to shut down. Any return value was silently swallowed.
Python 3.13 (released October 7, 2024) changed this. The Python 3.13 documentation for generator.close() now states: if a generator returns a value upon being closed, the value is returned by close().
This is a subtle but significant enhancement. Consider a generator that accumulates data and should produce a summary when finished:
# Python 3.13+
def collector():
    items = []
    try:
        while True:
            item = yield
            items.append(item)
    except GeneratorExit:
        return len(items)
gen = collector()
next(gen)
gen.send("a")
gen.send("b")
gen.send("c")
result = gen.close() # Returns 3 in Python 3.13+
Before 3.13, there was no clean way to extract a final value from a generator you were closing. You'd have to use a sentinel value, wrap the generator in a class, or use a shared mutable container. Now close() naturally returns whatever the generator returns when handling GeneratorExit. This makes send()-based coroutine patterns cleaner, because the generator can accumulate state via send() calls and then produce a final result when closed.
Common Pitfalls
Working with send() comes with several recurring mistakes that trip up developers.
Forgetting to prime the generator. This is the most common error. Every generator that uses send() must be advanced to its first yield before you can send non-None values. If you see TypeError: can't send non-None value to a just-started generator, this is why.
Confusing the timing of yield. Remember that yield is a suspension point. The value to the right of yield is sent out before the generator pauses. The value returned by yield (from send()) is received after the generator resumes. Many bugs come from confusing which side of this pause a particular piece of code executes on.
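A short trace generator (names illustrative) makes the ordering visible:

```python
def tracer():
    print("A: runs before anything is yielded")
    received = yield "first"    # "first" goes out, THEN the pause happens
    print(f"B: resumed, received {received!r}")
    yield "second"

gen = tracer()
print(next(gen))        # prints A, then "first"
print(gen.send("hi"))   # prints B (received 'hi'), then "second"
```

Everything before a yield runs on the way out; everything after it runs only once the next next() or send() arrives.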
Ignoring None from next(). When you call next() on a generator that uses send(), the yield expression returns None. If your generator doesn't handle the None case, it may produce incorrect results or raise an exception:
def doubler():
    while True:
        value = yield
        yield value * 2  # What if value is None?

gen = doubler()
next(gen)    # Advance to the first `value = yield`
gen.send(5)  # Returns 10
next(gen)    # Returns None: the generator loops back to `value = yield`
# next(gen) again would assign None to value, then try None * 2 -> TypeError
Mixing iteration with send(). Using a for loop over a generator that expects send() values will only ever send None (since for calls next() internally). If your generator depends on receiving non-None values, you must use send() explicitly, not a for loop.
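A quick demonstration: iterating a send()-dependent generator with for only ever delivers None (summer is an illustrative name):

```python
def summer():
    total = 0
    while True:
        value = yield total
        if value is not None:   # guard against the None that next() sends
            total += value

s = summer()
next(s)              # prime: yields 0
for observed in s:
    print(observed)  # 0 -- the for loop only ever sends None
    break
print(s.send(7))     # 7 -- explicit send() is required to deliver values
```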
When to Use send() (and When Not To)
send() is the right tool when you need genuine two-way communication with a suspended function: data sinks, interactive state machines, coroutine schedulers, and protocol implementations.
It is not the right tool for simple iteration or data transformation. If your generator only needs to produce values (not receive them), send() adds complexity without benefit. Similarly, if you find yourself building elaborate send()-based coroutine systems, consider whether async/await would express the same logic more clearly. The async syntax exists precisely because generator-based coroutines using send() were powerful but difficult to read at scale.
The Timeline
- Python 2.2 (2001): PEP 255 introduces generators and the yield statement. Generators are one-directional: they produce values but cannot receive them.
- Python 2.5 (2006): PEP 342 introduces send(), throw(), close(), and redefines yield as an expression. Generators become usable as coroutines.
- Python 3.3 (2012): PEP 380 introduces yield from, which transparently delegates send(), throw(), and close() to sub-generators.
- Python 3.5 (2015): PEP 492 introduces async def / await, formalizing the coroutine pattern that generator send() pioneered.
- Python 3.6 (2016): PEP 525 introduces asynchronous generators with asend(), the async counterpart to send().
- Python 3.13 (2024): close() now returns the generator's return value, completing the communication loop that send() started.
Conclusion
The send() method is one of those features that reveals Python's depth. On the surface, it's a simple API: resume a generator and inject a value. Underneath, it represents a fundamental shift from generators-as-iterators to generators-as-coroutines, a shift that enabled @contextmanager, yield from, and the entire async programming paradigm.
Understanding send() means understanding that yield is a two-way portal, not a one-way valve. Values flow out through yield, values flow in through send(), and the generator's local state persists across these exchanges. That mental model — a function that can pause, communicate, and resume — is the beating heart of concurrent Python, whether you're writing a simple data accumulator or an asynchronous web server.
What started as a PEP to make generators "usable as simple coroutines" turned out to be one of the most consequential API additions in Python's history. And it all began with a single method: send().