If you have written any asynchronous Python in the last decade, you have used async def and await. These two keywords feel so natural now that it is easy to forget they did not always exist. Before Python 3.5, writing concurrent code meant wrestling with generator-based coroutines, decorators that masked intent, and a syntax that made it dangerously easy to confuse an asynchronous function with a regular generator.
PEP 492 changed all of that. Proposed on April 9, 2015, accepted by Guido van Rossum on May 5, and committed to CPython on May 11 — just six days later — it moved faster than almost any major language change in Python's history. The speed was not recklessness — it was the result of years of accumulated frustration with the old way of doing things, a well-designed proposal, and a BDFL who knew exactly what he wanted Python's concurrency model to look like.
This article traces the full story: the problem PEP 492 solved, the design decisions behind the syntax, the related PEPs that built on its foundation, and how to write modern async Python that takes full advantage of what it introduced.
The Problem: When Generators and Coroutines Shared a Disguise
To understand why PEP 492 mattered, you need to understand what came before it. Python's support for coroutines was built on top of generators — a decision that was clever but created lasting confusion.
PEP 342, "Coroutines via Enhanced Generators," was accepted in 2005 for Python 2.5. It turned generators from simple producers of values into two-way communication channels by adding the send(), throw(), and close() methods. A generator could now receive values and exceptions from the caller, which meant it could function as a primitive coroutine.
PEP 380, "Syntax for Delegating to a Subgenerator," arrived in Python 3.3 with the yield from expression. This allowed one generator to delegate to another, creating chains of coroutines that could pass values and exceptions up and down the call stack.
PEP 3156, authored by Guido van Rossum himself in 2012, then introduced the asyncio module and its event loop. Finally, Python had official infrastructure for running asynchronous I/O. But the coroutines that ran on this event loop were still generator functions, decorated with @asyncio.coroutine and using yield from to suspend execution.
Here is what an asynchronous database read looked like in 2014:
```python
import asyncio

@asyncio.coroutine  # decorator required — but easy to forget
def read_data(db):   # looks like a regular function
    data = yield from db.fetch('SELECT ...')  # generator syntax
    return data

# Is this a coroutine or a generator? Hard to tell at a glance.
# Remove the yield from and it silently becomes a normal function.
```
And there was no way to use with or for asynchronously.

PEP 492's replacement makes the same function unambiguous:

```python
import asyncio

async def read_data(db):  # signature declares intent unambiguously
    data = await db.fetch('SELECT ...')  # suspension point is explicit
    return data

# Always a coroutine — even with no await in the body.
# The type system prevents accidentally iterating it.
```

async def declares a coroutine at the signature level. await marks every suspension point. Intent is visible before reading the body.

These were not minor annoyances. They were structural problems that made async Python harder to write, harder to read, and harder to maintain than it needed to be.
The Proposal: Making Coroutines a First-Class Concept
Yury Selivanov, a CPython core developer and PSF Fellow since 2013, lead maintainer of asyncio, and co-founder of MagicStack Inc. (later EdgeDB), authored PEP 492 on April 9, 2015. The proposal's opening line connected the motivation directly to the real world: the growth of Internet connectivity had triggered a proportionate need for responsive and scalable code.
But the real thesis was architectural. The PEP's goal was to make coroutines a proper standalone concept in Python and introduce new supporting syntax, with the ultimate aim of establishing a common, easily approachable mental model of asynchronous programming — one as close to synchronous programming as possible.
That phrase — "as close to synchronous programming as possible" — was the key design insight. The best async syntax is the one that looks almost identical to the synchronous code developers already know how to write.
Selivanov described the personal frustration behind the proposal in a 2021 interview with Mouse Vs Python: his team was using asyncio heavily, but Python had no asynchronous with block, and yield from felt syntactically wrong for the purpose. That friction drove him to propose async/await directly. (Mouse Vs Python, October 2021)
The PEP introduced four new syntactic constructs:
- async def declares a native coroutine function. Unlike generator-based coroutines, a function declared with async def is always a coroutine, even if its body contains no await expressions.
- await suspends execution of the enclosing coroutine until the awaited object completes. It replaces yield from for coroutine delegation and explicitly validates that its argument is an awaitable.
- async with enables asynchronous context managers with __aenter__ and __aexit__ methods that can themselves be coroutines.
- async for enables asynchronous iteration over objects that implement __aiter__ and __anext__ as coroutines.
But cataloging the syntax misses the deeper architecture of what each construct solved. These were not cosmetic changes to existing machinery — each one targeted a specific class of failure that the generator-based approach could not address.
The async def solution goes beyond readability. The generator-based model made coroutine-ness a property of the function body: remove the yield from, and the function silently became synchronous. This meant that refactoring, linting, and introspection tools could not reliably identify async functions without executing or deeply parsing them. async def moves coroutine-ness to the signature — a structural property that the parser, the type system, and every static analysis tool can detect immediately. The practical consequence is that an entire category of refactoring bugs becomes impossible: you cannot accidentally strip asyncness from a function without changing its declaration.
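That signature-level property is directly observable with the standard inspect module. A minimal sketch (no_awaits and plain are hypothetical names):

```python
import inspect

async def no_awaits():
    # No await anywhere in the body, yet this is still a
    # native coroutine function — the declaration decides.
    return 42

def plain():
    return 42

print(inspect.iscoroutinefunction(no_awaits))  # True
print(inspect.iscoroutinefunction(plain))      # False
```

This is exactly the hook that linters, type checkers, and IDEs rely on: the async-ness of a function is decidable without executing or parsing its body.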
The await solution goes beyond replacing yield from. The deeper problem with yield from was that it was syntactically invisible to anyone reading code for concurrency behavior. A yield from db.fetch() looked like a generator delegation, not a scheduler hand-off. await encodes semantic intent: every occurrence is an explicit promise that execution may pause here and other work may proceed. This makes reasoning about concurrency local — you do not need to trace call chains to understand where a function can be interrupted. Van Rossum's insistence on syntactic suspension points was precisely this insight: the syntax should carry the concurrency contract, not hide it.
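The validation half of this is easy to demonstrate: awaiting a non-awaitable fails immediately with a TypeError instead of silently misbehaving the way a misused yield from could. A small sketch:

```python
import asyncio

async def main():
    try:
        await 42  # an int is not an awaitable
    except TypeError as exc:
        # CPython reports something like:
        # "object int can't be used in 'await' expression"
        print(f"rejected: {exc}")

asyncio.run(main())
```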
The async with solution addressed a structural impossibility, not just a missing feature. Before PEP 492, there was genuinely no way to acquire and release a resource asynchronously inside a with block. The synchronous __enter__ and __exit__ protocol cannot call yield from. This forced async code into patterns of manual try/finally cleanup that could not benefit from the guarantees that context managers provide — that cleanup runs even when exceptions are raised. async with restored those guarantees for async resources, which is why it became the foundation for database connection pools, HTTP session management, distributed locks, and nearly every async resource abstraction in the ecosystem.
The async for solution unlocked a programming model that was previously unavailable. Asynchronous data sources — paginated APIs, streaming database cursors, server-sent event streams, message queue consumers — require an I/O round-trip per iteration step. The synchronous __iter__/__next__ protocol blocks the event loop on each step, defeating the purpose of async I/O entirely. The manual alternative using __aiter__ and __anext__ as raw methods was technically possible but so verbose that it was rarely used correctly in practice. async for made asynchronous iteration a first-class idiom, which is why every mature async library now exposes its streaming interfaces through it.
The same database read from earlier became:
```python
import asyncio

async def read_data(db):
    data = await db.fetch('SELECT ...')
    return data
```
Two things changed: @asyncio.coroutine became async def, and yield from became await. The result was immediately clearer, and — critically — the intent was now visible in the function signature rather than buried in the body.
The Debate: Why New Syntax Was the Whole Point
PEP 492 moved fast, but it was not without controversy. The ideas were first raised on the python-ideas mailing list in mid-April 2015, enthusiastically embraced by many contributors, and accepted by Guido van Rossum by May 5.
One significant objection came from Mark Shannon, who argued that new syntax was unnecessary since the existing language could already express everything PEP 492 proposed. Selivanov did not dispute this — the existing approach was technically functional. The argument for PEP 492 was not about capability but about clarity.
Van Rossum settled this decisively on the python-dev mailing list in May 2015. For him, new syntax was the entire point: he wanted suspension points to be syntactically identifiable — not inferred from a decorator or the presence of yield from, but declared in the code itself.
This statement reveals the deeper philosophy behind PEP 492. Van Rossum wanted suspension points to be syntactically visible. When you read an async def function and see await, you know immediately that this is a point where execution may pause and other tasks may run. You cannot alias it, hide it behind a function call, or abstract it away. The concurrency behavior is declared in the syntax itself.
There was also bikeshedding around keyword ordering — some community members preferred def async over async def. The precedence of await was debated at length. Unlike yield and yield from, which have the lowest operator precedence in Python, await was given high precedence, sitting between exponentiation and subscripting/calls. This means await expressions do not need parentheses in the common cases, making the code cleaner. Van Rossum himself requested this adjustment explicitly during the review period, noting it would make the most common usages writable without extra parentheses.
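The practical effect of that precedence choice shows up in how await composes with operators. A small sketch (get_value is a stand-in coroutine):

```python
import asyncio

async def get_value():
    return 3

async def main():
    # Calls bind tighter than await, so this awaits the result of
    # get_value(). Exponentiation binds looser, so the expression
    # parses as (await get_value()) ** 2 — no parentheses needed.
    print(await get_value() ** 2)  # 9

asyncio.run(main())
```

Had await been given the rock-bottom precedence of yield, the same expression would have required `(await get_value()) ** 2` everywhere.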
The competing PEP 3152, Greg Ewing's "Cofunctions" proposal, was considered as an alternative. It would have introduced codef and cocall keywords. Van Rossum rejected this approach on the python-dev list, stating that he could not accept PEP 3152's design and that await should be understood as a refinement of yield from rather than a separate mechanism. (LWN.net, May 2015)
Under the Hood: Native Coroutines as a Distinct Type
A critical technical decision in PEP 492 was making native coroutines their own type, completely separate from generators. This was not the original plan. The initial implementation treated native coroutines as a special kind of generator, but feedback from the Python 3.5 beta — specifically from the Tornado web framework team — revealed that this approach created integration problems.
The PEP documents the redesign: rather than being a new kind of generator, native coroutines became their own completely distinct type. This separation had important consequences:
```python
import asyncio
import inspect

async def my_coroutine():
    await asyncio.sleep(1)

def my_generator():
    yield 1

coro = my_coroutine()
gen = my_generator()

print(inspect.iscoroutine(coro))  # True
print(inspect.isgenerator(coro))  # False
print(inspect.iscoroutine(gen))   # False
print(inspect.isgenerator(gen))   # True

# You cannot accidentally iterate a coroutine:
# list(coro)  # TypeError: 'coroutine' object is not iterable

coro.close()  # tidy up the demo coroutine we never awaited
```
Native coroutine objects raise TypeError when you try to call __iter__ or __next__ on them. You cannot pass them to iter(), tuple(), list(), or use them in a for loop. The type system itself now prevents an entire class of bugs.
Internally, two new code object flags were introduced: CO_COROUTINE marks native coroutines defined with async def, and CO_ITERABLE_COROUTINE marks generator-based coroutines decorated with types.coroutine(). This allowed backward compatibility with existing asyncio code while clearly distinguishing the two paradigms.
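Both flags are exposed through the inspect module, so the distinction is visible on the code object itself. A quick sketch:

```python
import inspect

async def native():
    pass

def gen():
    yield

# CO_COROUTINE is set only on code objects compiled from async def.
print(bool(native.__code__.co_flags & inspect.CO_COROUTINE))  # True
print(bool(gen.__code__.co_flags & inspect.CO_COROUTINE))     # False
```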
If you see a RuntimeWarning: coroutine 'X' was never awaited in your code, it means you called an async def function without await. The function returned a coroutine object that was never scheduled to run. Always await native coroutines, or pass them to asyncio.create_task().
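Both remedies look like this in practice (work is a hypothetical coroutine):

```python
import asyncio

async def work():
    return 42

async def main():
    value = await work()                # option 1: await it directly
    task = asyncio.create_task(work())  # option 2: schedule it as a task
    print(value, await task)            # 42 42

asyncio.run(main())
```

Calling `work()` bare, with neither await nor create_task, is exactly the mistake the RuntimeWarning exists to catch.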
async with: Asynchronous Context Managers in Practice
One of PEP 492's most immediately useful features was async with, which enabled asynchronous context managers. Before PEP 492, there was simply no way to perform asynchronous setup and teardown in a with block.
The async context manager protocol requires two methods:
```python
class AsyncDatabaseConnection:
    def __init__(self, dsn: str) -> None:
        self.dsn = dsn
        self._conn = None

    async def __aenter__(self):
        # connect() stands in for your database driver's
        # async connection function.
        self._conn = await connect(self.dsn)
        return self._conn

    async def __aexit__(self, exc_type, exc_val, exc_tb):
        await self._conn.close()
        return False  # do not suppress exceptions
```
Usage looks virtually identical to synchronous context managers:
```python
async def get_user(user_id: int) -> dict:
    async with AsyncDatabaseConnection("postgresql://localhost/mydb") as conn:
        row = await conn.fetchrow(
            "SELECT * FROM users WHERE id = $1", user_id
        )
        return dict(row)
```
The contextlib module caught up in Python 3.7: contextlib.asynccontextmanager lets you write async context managers as generator functions, which is the preferred pattern for managing database transactions, HTTP sessions, file handles, and any other resource that requires asynchronous acquisition and release:
```python
from contextlib import asynccontextmanager

@asynccontextmanager
async def managed_transaction(conn):
    tx = conn.transaction()
    await tx.start()
    try:
        yield tx
        await tx.commit()
    except Exception:
        await tx.rollback()
        raise
```
async for: Asynchronous Iteration
The async for syntax enabled asynchronous iteration protocols through __aiter__ and __anext__. This unlocked a pattern that was essentially impossible with generator-based coroutines: iterating over a data source where each step requires an asynchronous operation.
```python
class AsyncPaginator:
    """Asynchronously iterate through paginated API results."""

    def __init__(self, client, endpoint: str, page_size: int = 100):
        self.client = client
        self.endpoint = endpoint
        self.page_size = page_size
        self._page = 0
        self._buffer: list = []
        self._exhausted = False

    def __aiter__(self):
        return self

    async def __anext__(self):
        if not self._buffer:
            if self._exhausted:
                raise StopAsyncIteration
            response = await self.client.get(
                self.endpoint,
                params={"page": self._page, "size": self.page_size},
            )
            data = response.json()
            if not data["results"]:
                raise StopAsyncIteration
            self._buffer = data["results"]
            self._exhausted = not data.get("has_next", False)
            self._page += 1
        return self._buffer.pop(0)

async def process_all_users(client):
    async for user in AsyncPaginator(client, "/api/users"):
        await process_user(user)
```
The PEP introduced StopAsyncIteration as a new built-in exception, deliberately distinct from StopIteration. A StopIteration raised inside a coroutine could be confused with one being used to signal the end of a generator, creating subtle and hard-to-debug interactions. A separate exception type eliminates this ambiguity entirely.
The PEPs That Built on PEP 492
PEP 492 was not an endpoint — it was a foundation. Several subsequent PEPs extended the async ecosystem it created.
The lineage, in brief:

- PEP 342 (Python 2.5): send(), throw(), close() — generators became two-way channels.
- PEP 380 (Python 3.3): yield from for chaining coroutines.
- PEP 3156 (Python 3.4): the asyncio event loop, with coroutines marked by @asyncio.coroutine.
- PEP 492 (Python 3.5): async def, await, async with, async for. Native coroutines as a distinct type. Authored by Yury Selivanov; accepted in 26 days.
- PEP 567 (Python 3.7): context variables (contextvars). Task-local storage that works correctly in async code where thread-locals are meaningless.
- PEP 654 (Python 3.11): except*. Multiple concurrent exceptions; prerequisite for asyncio.TaskGroup.

PEP 525 — Asynchronous Generators (Yury Selivanov, Python 3.6): PEP 492 explicitly excluded asynchronous generators from its scope, noting them as an advanced concept requiring careful design. PEP 525 addressed this, enabling functions that use both async def and yield:
```python
import asyncio

async def ticker(delay: float, to: int):
    """Yield incrementing integers with async delays."""
    for i in range(to):
        yield i
        await asyncio.sleep(delay)

async def main():
    async for value in ticker(0.5, 10):
        print(value)
```
The PEP reported that asynchronous generators were 2x faster than equivalent implementations using the __aiter__/__anext__ class-based protocol, making them the preferred approach for async data producers.
PEP 530 — Asynchronous Comprehensions (Yury Selivanov, Python 3.6): This extended Python's comprehension syntax to support async for and await expressions inside list, set, and dict comprehensions:
```python
# These examples must appear inside an async def function.

# Async comprehension with async for
results = [row async for row in cursor.execute("SELECT * FROM users")]

# Await inside a comprehension
statuses = [await check_health(s) for s in servers]

# Combined
data = {
    key: value
    async for key, value in async_key_value_source()
    if value is not None
}
```
PEP 567 — Context Variables (Yury Selivanov, Python 3.7): Introduced contextvars, providing task-local storage that works correctly with async code. This solved the problem of thread-local storage being meaningless in an async context where many tasks run on a single thread.
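A minimal sketch of how this plays out, using a hypothetical request_id variable: each task runs in its own copy of the context, so concurrent tasks cannot clobber each other's value even though they share a thread:

```python
import asyncio
import contextvars

request_id = contextvars.ContextVar("request_id", default="-")

async def handler(rid: str) -> str:
    request_id.set(rid)
    await asyncio.sleep(0)   # another task runs at this suspension point
    return request_id.get()  # still our own value, not the other task's

async def main():
    print(await asyncio.gather(handler("a"), handler("b")))  # ['a', 'b']

asyncio.run(main())
```

A thread-local would fail here: both handlers run on the same thread, so the second set() would overwrite the first.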
PEP 654 — Exception Groups and except* (Irit Katriel, with Yury Selivanov and Guido van Rossum, Python 3.11): Enabled structured handling of multiple concurrent exceptions. This was essential for making asyncio.TaskGroup possible, as concurrent tasks can fail simultaneously and the existing exception model could only propagate one exception at a time.
Modern async Python: TaskGroup and Structured Concurrency
The culmination of the async evolution that PEP 492 started arrived in Python 3.11 with asyncio.TaskGroup. Built on PEP 654's exception groups, TaskGroup brings structured concurrency — a concept popularized by Nathaniel J. Smith's Trio library — into the standard library.
The old approach with asyncio.gather() was fragile. If one task failed, the others could leak or behave unpredictably. TaskGroup guarantees that all tasks complete or are cancelled before the block exits:
```python
import asyncio

import httpx

async def fetch_url(client: httpx.AsyncClient, url: str) -> dict:
    response = await client.get(url)
    return {"url": url, "status": response.status_code}

async def fetch_all(urls: list[str]) -> list[dict]:
    async with httpx.AsyncClient() as client:
        async with asyncio.TaskGroup() as tg:
            tasks = [
                tg.create_task(fetch_url(client, url))
                for url in urls
            ]
        # We only reach here if ALL tasks succeeded.
        # If any task raised, TaskGroup cancels the rest
        # and raises an ExceptionGroup.
        return [task.result() for task in tasks]

async def main():
    urls = [
        "https://httpbin.org/get",
        "https://httpbin.org/delay/1",
        "https://httpbin.org/status/200",
    ]
    try:
        results = await fetch_all(urls)
        for r in results:
            print(f"{r['url']}: {r['status']}")
    except* httpx.HTTPError as eg:
        for exc in eg.exceptions:
            print(f"Request failed: {exc}")

asyncio.run(main())
```
The structured concurrency guarantee here is significant: no child task can outlive the TaskGroup scope. If one task raises an exception, siblings are cancelled, exceptions are aggregated into an ExceptionGroup, and the except* syntax lets you handle them cleanly by type.
Common Patterns and Practical Idioms
Here are the patterns that emerge from mature async Python codebases, all built on the foundation PEP 492 established.
Bounded concurrency with a semaphore
```python
import asyncio

import httpx

async def bounded_fetch(
    urls: list[str],
    max_concurrent: int = 10,
) -> list[httpx.Response]:
    semaphore = asyncio.Semaphore(max_concurrent)

    async def fetch_one(url: str) -> httpx.Response:
        async with semaphore:
            async with httpx.AsyncClient() as client:
                return await client.get(url)

    async with asyncio.TaskGroup() as tg:
        tasks = [tg.create_task(fetch_one(url)) for url in urls]
    # All tasks are complete once the TaskGroup block exits.
    return [t.result() for t in tasks]
```
Async generator pipeline
```python
import json

import aiofiles  # third-party async file I/O

async def read_lines(path: str):
    """Async generator that reads lines from a file."""
    async with aiofiles.open(path) as f:
        async for line in f:
            yield line.strip()

async def parse_records(lines):
    """Transform raw lines into structured records."""
    async for line in lines:
        if line and not line.startswith("#"):
            yield json.loads(line)

async def process_file(path: str):
    """Compose async generators into a pipeline."""
    lines = read_lines(path)
    records = parse_records(lines)
    async for record in records:
        await save_to_database(record)  # assumed to be defined elsewhere
```
Timeout management
```python
import asyncio

import httpx

async def resilient_request(url: str, timeout_seconds: float = 5.0):
    """Make a request with structured timeout handling."""
    try:
        async with asyncio.timeout(timeout_seconds):  # Python 3.11+
            async with httpx.AsyncClient() as client:
                return await client.get(url)
    except TimeoutError:
        return None  # Or retry, log, raise a domain exception, etc.
```
The Ongoing Evolution
PEP 492's influence continues to shape Python's development, with several active proposals that trace directly back to the decisions made in 2015.
PEP 789, co-authored by Zac Hatfield-Dodds and Nathaniel Smith and presented at the 2024 Python Language Summit, addresses a subtle but serious hazard: using yield to suspend a frame inside a cancel scope such as TaskGroup or asyncio.timeout. When a generator yields inside one of these scopes, the wrong task can be cancelled, timeouts can be silently ignored, and exceptions can be misrouted. The proposal introduces a new sys.prevent_yields() context manager that cancel-scope implementations can use to make such misuse a RuntimeError at runtime. The problem does not exist solely in async generators — Trio's synchronous cancel scopes are affected too — which makes a simple deprecation of async generators insufficient. (PEP 789)
PEP 806, proposed during the PyCon 2025 sprints, targets a different friction point: the verbosity of mixing synchronous and asynchronous context managers. Today, code that needs one async with and two plain with blocks must nest them, or use AsyncExitStack. PEP 806 proposes syntactic sugar that lets individual context managers within a single with statement be marked async, desugaring to the same nested form with zero runtime overhead. (PEP 806)
PEP 828, targeting Python 3.15, proposes adding support for yield from inside asynchronous generator functions, an omission from PEP 525 that has forced async generator authors into manual workarounds since 2016. (PEP 828)
Meanwhile, the free-threading work that began with PEP 703 has moved from proposal to reality faster than many expected. Python 3.13 (released October 2024) shipped an experimental free-threaded build with the GIL disabled. In June 2025, the Steering Council accepted PEP 779, advancing the free-threaded interpreter to officially supported — though non-default — status starting with Python 3.14. The community discussion at the 2025 Language Summit suggests the path to making free-threading the default involves further ecosystem adoption, not a fundamental change of direction. (PEP 703)
Does widespread free-threading make cooperative async concurrency obsolete? The answer emerging from the community is no. The explicit suspension points that van Rossum insisted on in 2015 provide guarantees that implicit thread scheduling cannot: you can see every point where a coroutine might pause, reason about shared state between those points, and use structured concurrency to enforce task lifetimes. These properties are complementary to, not replaced by, the ability to run threads in true parallel.
Van Rossum himself, in a 2025 ODBMS Industry Watch interview, described async IO as one of the areas he personally remains focused on, alongside typing and interpreter performance. The syntax he argued for a decade ago has become the foundation for an entire ecosystem of async frameworks, from FastAPI and Starlette to aiohttp and the standard library's own evolving capabilities.
What PEP 492 Really Gave Us
The technical contributions of PEP 492 — async def, await, async with, async for — are documented in the specification. But the deeper contribution was philosophical. By giving asynchronous code its own syntax, Selivanov and van Rossum made a statement about how Python developers should think about concurrency: not as something hidden behind decorators and generators, but as something explicit, visible, and structurally distinct from synchronous code.
- Suspension points are syntactically visible. Every await in an async def function is a place where execution may pause. You cannot hide this, alias it away, or abstract it into invisibility.
- Native coroutines are their own type. The type system enforces the distinction between coroutines and generators, preventing an entire class of refactoring bugs that plagued the yield from era.
- The async protocol is extensible. async with and async for work on any object that implements the corresponding dunder methods, making the pattern composable across the entire ecosystem.
- Structured concurrency became possible. The explicit nature of async Python made it possible to reason about task lifetimes, which is the foundation for TaskGroup and the broader structured concurrency movement.
That is the same principle that runs through the Zen of Python: explicit is better than implicit. PEP 492 is the Zen of Python applied to concurrency, and a decade on, it remains one of the most consequential PEPs in the language's history.
PEPs Referenced in This Article:
- PEP 342 — Coroutines via Enhanced Generators (Python 2.5)
- PEP 380 — Syntax for Delegating to a Subgenerator (Python 3.3)
- PEP 3152 — Cofunctions (rejected; Greg Ewing)
- PEP 3156 — Asynchronous IO Support: the "asyncio" Module (Python 3.4; Guido van Rossum)
- PEP 492 — Coroutines with async and await syntax (Python 3.5; Yury Selivanov)
- PEP 525 — Asynchronous Generators (Python 3.6; Yury Selivanov)
- PEP 530 — Asynchronous Comprehensions (Python 3.6; Yury Selivanov)
- PEP 567 — Context Variables (Python 3.7; Yury Selivanov)
- PEP 654 — Exception Groups and except* (Python 3.11)
- PEP 703 — Making the Global Interpreter Lock Optional (experimental in Python 3.13; officially supported in Python 3.14 via PEP 779)
- PEP 779 — Criteria for supported status for free-threaded Python (accepted June 2025)
- PEP 789 — Preventing task-cancellation bugs by limiting yield in async generators (Zac Hatfield-Dodds & Nathaniel Smith; presented at 2024 Language Summit)
- PEP 806 — Mixed sync/async context managers with precise async marking (proposed PyCon 2025)
- PEP 828 — Supporting 'yield from' in asynchronous generators (targeting Python 3.15)