When Python 3.5 introduced async and await in September 2015, it transformed the way developers wrote concurrent code. But something was conspicuously missing. You could write async for loops, use async with context managers, and await coroutines -- yet none of that worked inside a list comprehension. PEP 530 fixed that gap.
The Problem PEP 530 Solved
Consider the following pattern that async Python developers hit constantly before Python 3.6:
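A minimal sketch of that before/after (the async_range helper here is invented for illustration; note that async generators themselves also require Python 3.6's PEP 525): collecting filtered results from an async iterator meant an explicit loop-and-append, which PEP 530 collapses into a single expression.

```python
import asyncio

async def async_range(n):
    # Illustrative stand-in for any async data source.
    for i in range(n):
        await asyncio.sleep(0)
        yield i

async def main():
    # Before PEP 530: explicit loop, conditional, and append.
    results = []
    async for i in async_range(10):
        if i % 2 == 0:
            results.append(i * 10)

    # With PEP 530: the same logic as one comprehension.
    results2 = [i * 10 async for i in async_range(10) if i % 2 == 0]

    assert results == results2
    print(results2)  # [0, 20, 40, 60, 80]
    return results2

asyncio.run(main())
```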
The Backstory: How We Got Here
Why did Python need three separate PEPs over two years to get here -- and why did it take this long?
PEP 530 did not arrive in isolation. It was the culmination of a deliberate, multi-year effort to build first-class async support into Python's syntax, driven almost entirely by one person: Yury Selivanov, a CPython core developer and co-founder of MagicStack (the company behind uvloop, asyncpg, and what is now known as Gel, formerly EdgeDB).
The timeline of async PEPs tells the story:
PEP 492 -- Coroutines with async and await syntax (Python 3.5, 2015). This was the foundational PEP. Created by Selivanov on April 9, 2015, PEP 492 introduced async def, await, async for, and async with as first-class language constructs. Before PEP 492, Python coroutines were built on top of generators using yield from, which was confusing and error-prone. The rationale section of PEP 492 frames the goal as making async programming in Python share a mental model with synchronous programming -- familiar, approachable, and as structurally similar as the language would allow. PEP 492 was accepted by Guido van Rossum on May 5, 2015.
PEP 525 -- Asynchronous Generators (Python 3.6, 2016). PEP 492 explicitly left async generators out of scope, deferring them to a separate PEP. PEP 525, also authored by Selivanov, filled that gap by allowing yield inside async def functions. This was a prerequisite for PEP 530, since asynchronous comprehensions needed async generators to iterate over. Performance was a strong motivator: PEP 525 documents that in testing of the reference implementation, "asynchronous generators are 2x faster than an equivalent implemented as an asynchronous iterator."
PEP 530 -- Asynchronous Comprehensions (Python 3.6, 2016). The natural conclusion. If you have async for and async generators, you should be able to combine them with comprehension syntax.
In a 2021 Q&A published on Mouse Vs Python, Selivanov explained how his practical work drove these contributions. He described how using asyncio heavily in production -- specifically while building what became Gel -- revealed friction points: Python lacked async context managers, and yield from felt unnatural, which led him to propose async/await. Each subsequent PEP solved a real problem his team encountered while building production async software. The acknowledgments section of PEP 530 itself names three people who shaped the PEP: Guido van Rossum, Victor Stinner, and Elvis Pranskevichus -- Selivanov's MagicStack co-founder, who reviewed the code and contributed to the discussions around the PEP.
The Sprint That Made It Happen
What conditions allowed a PEP to go from proposal to accepted implementation in under a week?
PEP 530 was written and accepted with remarkable speed. Created on September 3, 2016, it was accepted by Guido van Rossum just three days later on September 6 -- during the CPython core developer sprint held at Instagram's offices in California.
Guido's acceptance came with a notable caveat. In his message to the python-dev mailing list on September 6, 2016, he acknowledged the tight timeline -- the proposal arrived close to the beta 1 feature freeze -- but noted the ideas were a natural continuation of the async/await design already accepted in 3.5, which made them easy to evaluate quickly. He accepted PEP 530 provisionally, requiring a working implementation signed off by at least one other core developer.
Victor Stinner, a fellow core developer who participated in that same sprint, wrote on his blog that the week was "the most productive CPython week ever," crediting having Guido van Rossum in the room as a key factor in getting PEPs accepted and implementations merged. The official Python Software Foundation blog post about the sprint confirms it was sponsored by Instagram, Microsoft, and the PSF, and notes that the week of September 4th saw more commits than the preceding seven weeks combined. Stinner personally reviewed the C implementation alongside Selivanov at the sprint. The implementation was completed and merged within days, landing in Python 3.6.0 beta 1 on September 12, 2016 -- just six days after acceptance. Python 3.6.0 final was released on December 23, 2016.
The Specification: What PEP 530 Actually Added
Specifically, what two things changed -- and why does the distinction between them matter in production code?
PEP 530 introduced two distinct capabilities. Understanding the difference between them matters.
Capability 1: Async Comprehensions with async for
You can use async for inside list, set, dict comprehensions, and generator expressions. The iterable must be an asynchronous iterable -- an object implementing __aiter__ and __anext__.
import asyncio

async def async_range(n):
    """An async generator that yields numbers 0 through n-1."""
    for i in range(n):
        await asyncio.sleep(0.01)  # simulate async work
        yield i

async def main():
    # Async list comprehension
    squares = [i ** 2 async for i in async_range(10)]
    print(squares)
    # Output: [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]

    # Async set comprehension
    even_set = {i async for i in async_range(10) if i % 2 == 0}
    print(even_set)
    # Output: {0, 2, 4, 6, 8}

    # Async dict comprehension
    mapping = {i: i ** 2 async for i in async_range(5)}
    print(mapping)
    # Output: {0: 0, 1: 1, 2: 4, 3: 9, 4: 16}

    # Async generator expression
    gen = (i ** 2 async for i in async_range(5))
    async for value in gen:
        print(value, end=" ")
    # Output: 0 1 4 9 16

asyncio.run(main())
Each of these forms iterates asynchronously. The event loop can run other tasks during each await within the async iterator. This is not parallelism -- each item is still processed sequentially -- but it does allow the program to avoid blocking while waiting on I/O.
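That non-blocking claim is easy to verify with a small experiment (the heartbeat task here is invented for the demonstration): a background task keeps getting scheduled while the comprehension is suspended at each await.

```python
import asyncio

async def async_range(n):
    for i in range(n):
        await asyncio.sleep(0.01)  # each item suspends the coroutine
        yield i

async def heartbeat(log):
    # Background task: it only gets CPU time when the comprehension suspends.
    while True:
        log.append("tick")
        await asyncio.sleep(0.005)

async def main():
    log = []
    hb = asyncio.create_task(heartbeat(log))
    squares = [i ** 2 async for i in async_range(5)]
    hb.cancel()
    print(squares, f"-- heartbeat ran {len(log)} times")
    return squares, log

asyncio.run(main())
```

The heartbeat runs several times during the comprehension: the event loop interleaved other work, even though the items themselves were produced strictly one after another.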
Capability 2: await in Comprehensions
Separately from async for, PEP 530 allows the await keyword inside any comprehension, as long as the comprehension exists within an async def function:
import asyncio

async def fetch_value(x):
    """Simulate fetching a value asynchronously."""
    await asyncio.sleep(0.01)
    return x * 10

async def main():
    funcs = [fetch_value(1), fetch_value(2), fetch_value(3)]
    # Await each coroutine sequentially inside a list comprehension
    results = [await f for f in funcs]
    print(results)
    # Output: [10, 20, 30]

asyncio.run(main())
Note that this uses a regular for, not async for. The await keyword handles the async part by suspending the coroutine for each item. The iterable (funcs) is a regular synchronous list.
Combining Both
You can combine async for with await, and mix them with regular if and for clauses:
import asyncio

async def async_range(n):
    for i in range(n):
        await asyncio.sleep(0.01)
        yield i

async def process(value):
    await asyncio.sleep(0.01)
    return value * 100

async def main():
    # async for + await + if filter
    results = [await process(i) async for i in async_range(10) if i % 3 == 0]
    print(results)
    # Output: [0, 300, 600, 900]

asyncio.run(main())
PEP 530 also allows nested clauses that mix synchronous and asynchronous iteration:
dataset = {data for line in aiter()
           async for data in line
           if check(data)}
This structure iterates line synchronously, then iterates data asynchronously within each line, filtering with check().
Which Clause Makes It Async?
PEP 530's grammar leaves a classification rule implicit that is worth stating directly: a comprehension is an asynchronous comprehension if it contains at least one async for clause (or an await expression) -- in any position, outermost or inner. The clauses mix freely: an async for can enclose a regular for, and a regular for can enclose an async for, as the nested dataset example above shows.
What the classification controls is where the comprehension is allowed to appear. Python runs comprehensions in an implicitly created scope, and an asynchronous comprehension compiles to an async-capable scope that the enclosing code must be able to await. Inside an async def function that works; in a synchronous context Python has nowhere to send the suspension, and you get a SyntaxError.
The rule in practice:
import asyncio

async def async_range(n):
    for i in range(n):
        await asyncio.sleep(0)
        yield i

async def main():
    # A single async for clause -- asynchronous comprehension. Works.
    result = [i async for i in async_range(5)]

    # async for outermost, regular for inner -- fine.
    result2 = [
        j
        async for i in async_range(3)
        for j in range(i)
    ]

    # Regular for outermost, async for inner -- also fine.
    # Any async clause makes the whole comprehension asynchronous.
    result3 = [
        j
        for i in range(3)
        async for j in async_range(i)
    ]

    print(result)   # [0, 1, 2, 3, 4]
    print(result2)  # [0, 0, 1]
    print(result3)  # [0, 0, 1]

asyncio.run(main())
The pattern that catches people with mixed clauses is ordering, not an async/sync restriction. Clauses nest left to right: the leftmost for (async or not) is the outermost loop, and each clause can only use variables bound by the clauses to its left. Writing [j async for j in async_range(i) for i in range(3)] fails -- typically with a NameError -- because async_range(i) is evaluated before any for i clause has bound i. Flip the order so the clause that produces i comes first and it works. This is why PEP 530's grammar examples are explicit about clause order in the nested mixed case.
The Grammar Change
The actual grammar change was minimal. PEP 530 modified a single production rule in Python's grammar:
comp_for: [ASYNC] 'for' exprlist 'in' or_test [comp_iter]
The addition of the optional ASYNC token before for was the entire syntax change. The comprehension AST node gained a new is_async argument to track whether a comprehension clause was asynchronous. The small size of the grammar change reflects how naturally async comprehensions fit into Python's existing syntax -- which was, of course, the point.
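The is_async field is visible directly through the standard ast module -- each comprehension node in a ListComp's generators list carries it (1 for an async for clause, 0 for a regular for). A quick inspection:

```python
import ast

source = """
async def f(src):
    return [i async for i in src]
"""

tree = ast.parse(source)
listcomp = tree.body[0].body[0].value  # the ListComp inside the return statement
clause = listcomp.generators[0]        # its single comprehension clause
print(type(clause).__name__, clause.is_async)  # comprehension 1
```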
A Critical Restriction
Asynchronous comprehensions (those using async for) are only valid inside async def functions. If you try to write one at the top level or inside a regular function, Python will raise a SyntaxError:
# This will NOT work
result = [i async for i in some_aiter()] # SyntaxError
In Python 3.6 specifically, there was an additional nuance: because async and await were still "soft keywords" -- not fully reserved -- asynchronous generator expressions were also restricted to async def bodies. PEP 530 noted that this restriction would be removed once async and await became reserved keywords in Python 3.7.
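A sketch of what that lifted restriction means in practice (helper names invented for the demo): from Python 3.7 on, an async generator expression can be created inside a plain synchronous function, as long as it is ultimately consumed inside a coroutine.

```python
import asyncio

async def async_range(n):
    for i in range(n):
        await asyncio.sleep(0)
        yield i

def make_doubles():
    # Python 3.7+: creating the async genexp in a sync function is legal.
    # In Python 3.6 this line was a SyntaxError outside an async def body.
    return (i * 2 async for i in async_range(3))

async def main():
    # Consuming it still requires an async context.
    return [x async for x in make_doubles()]

print(asyncio.run(main()))  # [0, 2, 4]
```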
Reading the Error Messages
The SyntaxError shown above is just one of the failure modes you'll hit with async comprehensions. Knowing what each error means saves a lot of searching.
These are the four error patterns you're most likely to encounter when first working with async comprehensions.
SyntaxError: asynchronous comprehension outside of an asynchronous function
You used async for inside a comprehension, but the comprehension is not inside an async def function. The fix is always to move the comprehension inside a coroutine.
import asyncio

async def async_range(n):
    for i in range(n):
        await asyncio.sleep(0)
        yield i

# SyntaxError -- module level, no async def:
# result = [i async for i in async_range(5)]

# Fix: wrap in a coroutine
async def main():
    result = [i async for i in async_range(5)]
    print(result)

asyncio.run(main())
TypeError: 'async_generator' object is not iterable
You used a regular for to iterate over an async generator. Python's synchronous iteration protocol cannot drive __anext__. The fix is to iterate with async for inside a coroutine.
import asyncio

async def async_range(n):
    for i in range(n):
        await asyncio.sleep(0)
        yield i

async def main():
    # TypeError -- an async generator needs async for:
    # result = [i for i in async_range(5)]  # wrong

    # Fix
    result = [i async for i in async_range(5)]  # correct
    print(result)

asyncio.run(main())
TypeError: object int can't be used in 'await' expression
You passed a non-awaitable into an await expression inside a comprehension; the message names the offending type (int here, but list, str, and so on are equally common). await requires a coroutine, a Future, or another object with __await__. A plain value or the result of a synchronous function call does not qualify.
import asyncio

def sync_double(x):
    return x * 2  # not a coroutine

async def async_double(x):
    await asyncio.sleep(0)
    return x * 2

async def main():
    items = [1, 2, 3]

    # TypeError -- sync_double returns an int, not an awaitable:
    # result = [await sync_double(i) for i in items]  # wrong

    # Fix -- use the async version
    result = [await async_double(i) for i in items]
    print(result)  # [2, 4, 6]

asyncio.run(main())
TypeError: 'async for' requires an object with __aiter__ method
You tried to use async for with an object that does not implement the asynchronous iterator protocol. This often happens when you pass a regular list or generator where an async generator is expected. The object needs to define __aiter__ (whose result supplies __anext__) -- or be produced by an async def function that uses yield.
import asyncio

async def main():
    regular_list = [1, 2, 3]

    # TypeError: 'async for' requires an object with __aiter__ method, got list
    # result = [i async for i in regular_list]  # wrong

    # Fix -- use regular for with a synchronous iterable
    result = [i for i in regular_list]
    print(result)

asyncio.run(main())
Real-World Usage: Where This Actually Matters
The value of async comprehensions becomes apparent in I/O-bound scenarios where you're pulling data from an external source that delivers results incrementally.
Processing Database Results
import asyncio

async def fetch_rows(query):
    """Simulate an async database cursor."""
    fake_data = [
        {"id": 1, "name": "Alice", "active": True},
        {"id": 2, "name": "Bob", "active": False},
        {"id": 3, "name": "Charlie", "active": True},
    ]
    for row in fake_data:
        await asyncio.sleep(0.01)  # simulate network latency
        yield row

async def main():
    active_users = [
        row["name"] async for row in fetch_rows("SELECT * FROM users")
        if row["active"]
    ]
    print(active_users)
    # Output: ['Alice', 'Charlie']

asyncio.run(main())
Without PEP 530, that one-liner becomes five or six lines of loop, conditional, and append logic. In a codebase with hundreds of such patterns, the reduction in visual noise is substantial.
Collecting API Responses
import asyncio

async def paginated_api(endpoint, pages=3):
    """Simulate paginated API responses."""
    for page in range(1, pages + 1):
        await asyncio.sleep(0.05)
        yield {"page": page, "items": [f"item_{page}_{i}" for i in range(3)]}

async def main():
    all_items = [
        item
        async for response in paginated_api("/api/data")
        for item in response["items"]
    ]
    print(all_items)
    # Output: ['item_1_0', 'item_1_1', 'item_1_2', 'item_2_0', ...]

asyncio.run(main())
This pattern -- iterating asynchronously over pages, then synchronously over items within each page -- is exactly the kind of mixed iteration that PEP 530's grammar supports cleanly.
Typing Async Iterables
Production code rarely leaves types unannotated, and async iterables have a specific set of types in the standard library. Knowing which one to reach for avoids subtle errors and makes function signatures self-documenting.
The three types that matter most are in collections.abc (and mirrored in typing for compatibility with older Python):
| Type | What it represents | Methods required |
|---|---|---|
| `AsyncIterable[T]` | Anything you can use with `async for` | `__aiter__` |
| `AsyncIterator[T]` | An iterable that also manages its own iteration state | `__aiter__`, `__anext__` |
| `AsyncGenerator[YieldType, SendType]` | What an `async def` function using `yield` returns | `__aiter__`, `__anext__`, `asend`, `athrow`, `aclose` |
For function parameters, AsyncIterable[T] is the right annotation when your function only needs to iterate -- it's the most permissive type and accepts both AsyncIterator and AsyncGenerator. Use AsyncGenerator[T, None] when annotating the return type of an async def function that yields values (the second type parameter is the send type; None means the generator doesn't accept values via asend()).
import asyncio
from collections.abc import AsyncIterable, AsyncGenerator

# Return type: AsyncGenerator[int, None]
async def async_range(n: int) -> AsyncGenerator[int, None]:
    for i in range(n):
        await asyncio.sleep(0)
        yield i

# Parameter type: AsyncIterable[int] -- accepts any async-iterable source
async def collect_evens(source: AsyncIterable[int]) -> list[int]:
    return [i async for i in source if i % 2 == 0]

async def main():
    result = await collect_evens(async_range(10))
    print(result)  # [0, 2, 4, 6, 8]

asyncio.run(main())
If you're building a class that should work with async for -- rather than using an async generator function -- you need to implement __aiter__ and __anext__ directly. __aiter__ returns self, and __anext__ is an async def method that raises StopAsyncIteration when the sequence is exhausted:
import asyncio
from collections.abc import AsyncIterator

class AsyncCounter(AsyncIterator[int]):
    def __init__(self, stop: int) -> None:
        self._current = 0
        self._stop = stop

    def __aiter__(self) -> "AsyncCounter":
        return self

    async def __anext__(self) -> int:
        if self._current >= self._stop:
            raise StopAsyncIteration
        await asyncio.sleep(0)
        value = self._current
        self._current += 1
        return value

async def main():
    squares = [i ** 2 async for i in AsyncCounter(6)]
    print(squares)  # [0, 1, 4, 9, 16, 25]

asyncio.run(main())
In practice, writing a class that implements the protocol directly is far less common than using an async generator function -- async def with yield handles the vast majority of cases. You'd reach for the class form when you need to wrap an existing object that manages its own async state, or when you're building a library type that others will subclass.
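One sketch of that "existing object with its own async state" case (the QueueDrain class and sentinel convention are invented for the example): an asyncio.Queue is not async-iterable by itself, but a small protocol class can expose it to async for and, therefore, to comprehensions.

```python
import asyncio
from collections.abc import AsyncIterator

class QueueDrain(AsyncIterator[int]):
    """Async-iterate an asyncio.Queue until a sentinel value appears."""

    _SENTINEL = object()

    def __init__(self, queue: asyncio.Queue) -> None:
        self._queue = queue

    def __aiter__(self) -> "QueueDrain":
        return self

    async def __anext__(self) -> int:
        item = await self._queue.get()  # suspends until an item is available
        if item is self._SENTINEL:
            raise StopAsyncIteration
        return item

async def main():
    q: asyncio.Queue = asyncio.Queue()
    for n in (1, 2, 3):
        q.put_nowait(n)
    q.put_nowait(QueueDrain._SENTINEL)

    doubled = [n * 2 async for n in QueueDrain(q)]
    print(doubled)  # [2, 4, 6]
    return doubled

asyncio.run(main())
```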
A Common Misunderstanding: Sequential, Not Concurrent
If async means concurrent, why does an async comprehension still take N seconds to process N one-second operations?
One of the frequent mistakes with async comprehensions is assuming they provide concurrency. They do not. An async comprehension processes items sequentially, awaiting each one before moving to the next. If you need true multi-core parallelism rather than cooperative concurrency, that is a different problem entirely -- one addressed at the interpreter level by PEP 703 and free threading.
import asyncio
import time

async def slow_fetch(n):
    await asyncio.sleep(1)
    return n * 10

async def main():
    start = time.time()
    # This takes ~5 seconds, NOT ~1 second
    results = [await slow_fetch(i) for i in range(5)]
    elapsed = time.time() - start
    print(f"Results: {results}")
    print(f"Time: {elapsed:.1f}s")
    # Output: Time: 5.0s

asyncio.run(main())
If you need true concurrency, schedule the coroutines as tasks rather than awaiting them in a comprehension: asyncio.gather() works on every Python version with asyncio, and Python 3.11 added asyncio.TaskGroup. The comprehension form is for readability and convenience when you specifically want sequential processing -- each item awaited before the next begins.
async def main():
    start = time.time()
    # This takes ~1 second -- truly concurrent
    results = await asyncio.gather(*(slow_fetch(i) for i in range(5)))
    elapsed = time.time() - start
    print(f"Results: {list(results)}")
    print(f"Time: {elapsed:.1f}s")
    # Output: Time: 1.0s

asyncio.run(main())
If you are on Python 3.11 or later, asyncio.TaskGroup is the preferred approach. It handles cancellation and exception propagation more cleanly than gather():
import asyncio
import time

async def slow_fetch(n):
    await asyncio.sleep(1)
    return n * 10

async def main():
    start = time.time()

    # Python 3.11+ -- TaskGroup for concurrent execution
    async with asyncio.TaskGroup() as tg:
        tasks = [tg.create_task(slow_fetch(i)) for i in range(5)]

    # All tasks are guaranteed to be done once the async with block exits.
    results = [t.result() for t in tasks]

    elapsed = time.time() - start
    print(f"Results: {results}")
    print(f"Time: {elapsed:.1f}s")
    # Output: Time: 1.0s

asyncio.run(main())
What Happens When an Exception Is Raised
If the comprehension stops early due to an exception, does the async generator's cleanup code run -- and when?
The article so far has treated async comprehensions as if they always run to completion. But what happens if an exception is raised partway through?
When an exception escapes an async comprehension, Python abandons the iteration. The async generator that was supplying items does not automatically receive a signal to clean up. Async generators implement an aclose() method -- the asynchronous equivalent of close() on a synchronous generator -- which throws a GeneratorExit into the generator body so that any async with blocks or try/finally clauses inside it can run. If that aclose() is never awaited, cleanup code never runs.
Python's asyncio event loop mitigates this in most cases. Since Python 3.6, the garbage collector hooks for async generators call aclose() as part of collection, and asyncio registers a finalizer that schedules the call on the event loop. But "eventually collected" is not the same as "immediately cleaned up." If your async generator is holding a database connection, a file handle, or a network socket, relying on the garbage collector to release it is a bad practice.
The pattern that makes cleanup deterministic is wrapping the async generator in an async with block when it manages resources:
import asyncio

async def risky_source():
    """An async generator that holds a resource."""
    resource = "open"
    try:
        for i in range(10):
            await asyncio.sleep(0)
            yield i
    finally:
        resource = "closed"
        print(f"Resource: {resource}")

async def main():
    # 10 / (5 - i) raises ZeroDivisionError at i == 5, abandoning the generator.
    # Its finally block runs when the generator is eventually garbage-collected,
    # not immediately when the exception occurs.
    try:
        result = [i async for i in risky_source() if 10 / (5 - i) > 0]
    except ZeroDivisionError:
        print("Caught exception -- generator may still be open")
    # Force cleanup by driving the generator manually with try/finally, or use
    # a context manager wrapper if your generator is built to support one.

asyncio.run(main())
This is one of the reasons PEP 525 (asynchronous generators) included an aclose() method and specified the finalizer hooks. Async comprehensions compose with async generators cleanly -- but the same resource-management discipline that applies to async for loops applies to the comprehensions that use them.
The practical rule: if the async iterable you're consuming in a comprehension owns a resource, make sure its finally block or async with teardown is guaranteed to run. That guarantee comes from either the event loop finalizer (acceptable for short-lived scripts) or from explicit aclose() calls in your own exception handling (required for production code).
When a Comprehension Is Too Much
The opening section of this article showed the before-and-after that made PEP 530 compelling: four lines of loop-and-append replaced by one line of comprehension. That readability argument is real. It is also possible to take it too far.
There is no hard rule about when a comprehension becomes unreadable. But a few signals are reliable:
The first is multiple async for clauses. One outer async for with an inner synchronous for is still readable -- the paginated API example in the real-world section above is a good illustration of this. Two or more async for clauses in the same expression is a strong signal that the logic should be extracted into an explicit loop.
The second is an await call that does non-trivial work. Awaiting a simple fetch is fine. Awaiting a function that itself contains branching logic obscures what the comprehension is doing. When the awaitables are doing real work, an explicit loop with a comment reads more honestly.
The third is filtering with a complex condition. An if clause in a comprehension is idiomatic. An if clause that calls another awaitable is technically valid but visually dense enough to warrant a loop.
As a guideline: if the comprehension would require a comment to explain what it does, write the loop instead. Comprehensions are for cases where the logic is self-evident from structure. Async code is often doing enough that the "self-evident" threshold arrives earlier than it does with synchronous code.
import asyncio

async def is_valid(item):
    await asyncio.sleep(0)
    return item % 2 == 0

async def transform(item):
    await asyncio.sleep(0)
    return item * 100

async def async_source(n):
    for i in range(n):
        await asyncio.sleep(0)
        yield i

async def main():
    # Readable: one async for, one simple transformation
    squares = [i ** 2 async for i in async_source(6)]

    # Harder to scan: await in the filter AND in the expression
    # results = [await transform(i) async for i in async_source(10) if await is_valid(i)]

    # Clearer as a loop when both the filter and expression are awaited
    results = []
    async for i in async_source(10):
        if await is_valid(i):
            results.append(await transform(i))

    print(squares)  # [0, 1, 4, 9, 16, 25]
    print(results)  # [0, 200, 400, 600, 800]

asyncio.run(main())
Note that the commented-out one-liner is valid Python -- PEP 530 permits await in both the expression and the filter clause. The argument against it is not correctness, it is the cognitive overhead of parsing two suspension points in a single expression. Whether that tradeoff is acceptable depends on how often the code is read and by whom.
What This Looks Like with Real Libraries
Every code example in this article uses asyncio.sleep() to simulate async work. That is unavoidable in a tutorial -- it removes the dependency on any particular library. But it is worth being direct about the gap between the simulated pattern and what the pattern looks like in production.
Yury Selivanov wrote PEP 530 while building asyncpg and what became Gel (formerly EdgeDB). The friction he was solving was real: retrieving rows from a PostgreSQL cursor, streaming HTTP responses, reading lines from an async file handle. The async for in a comprehension is at its most useful when the iterable represents a live I/O stream -- something where each item genuinely suspends the coroutine waiting on the network or disk.
The following shows what the database example from the real-world section looks like when the simulated cursor is replaced with the actual asyncpg API:
# asyncpg -- PostgreSQL async driver
import asyncio
import asyncpg

async def main():
    conn = await asyncpg.connect("postgresql://user:pass@localhost/mydb")
    try:
        # asyncpg cursors can only be iterated inside a transaction block
        async with conn.transaction():
            active_users = [
                row["name"]
                async for row in conn.cursor("SELECT name, active FROM users")
                if row["active"]
            ]
        print(active_users)
    finally:
        await conn.close()

asyncio.run(main())
# aiofiles -- async file I/O
import asyncio
import aiofiles

async def main():
    async with aiofiles.open("data.log", "r") as f:
        error_lines = [
            line.strip()
            async for line in f
            if "ERROR" in line
        ]
    print(error_lines)

asyncio.run(main())
# aiohttp -- async HTTP client
import asyncio
import aiohttp

async def fetch_all(urls):
    async with aiohttp.ClientSession() as session:
        responses = [
            await session.get(url)
            for url in urls
        ]
        # await each response body sequentially
        bodies = [await r.text() for r in responses]
        return bodies

asyncio.run(fetch_all(["https://example.com", "https://httpbin.org/get"]))
The aiofiles example shows the natural pairing of async with and an async comprehension: the context manager owns the file handle, and the comprehension iterates the file's lines asynchronously. This is the pattern PEP 492 and PEP 530 were designed to make expressible in a single, readable structure -- a resource managed by async with, iterated by an async comprehension inside it.
One note on the aiohttp example: notice that the comprehension uses await session.get(url) with a regular for, not async for. urls is a regular synchronous list. The await handles the async work for each item; the outer iteration is still synchronous. This is the Capability 2 pattern from the specification section -- and it is how await-in-comprehensions typically appears with HTTP client libraries, since you are driving a list of coroutines rather than consuming an async stream.
Related PEPs: The Full Async Ecosystem
PEP 530 is part of a broader family of PEPs that built Python's async capabilities piece by piece. Understanding the lineage helps clarify why PEP 530 was possible when it was.
| PEP | Python Version | What It Added |
|---|---|---|
| PEP 255 | 2.2 (2001) | yield and generators -- the ancestor of all generator-based async patterns |
| PEP 342 | 2.5 (2006) | send(), throw(), and close() on generators, enabling coroutine patterns |
| PEP 380 | 3.3 (2012) | yield from for generator delegation and pre-await coroutine chaining |
| PEP 3156 | 3.4 (2014) | The asyncio module itself -- event loop, transports, and protocols |
| PEP 492 | 3.5 (2015) | Dedicated async def, await, async for, async with syntax |
| PEP 525 | 3.6 (2016) | yield inside async def, enabling async generators |
| PEP 530 | 3.6 (2016) | async for and await inside comprehensions |
| PEP 567 | 3.7 (2018) | contextvars -- async-safe context state without manual plumbing |
Together, these PEPs represent a coherent vision: async Python should look and feel as natural as synchronous Python. Each PEP extended the async syntax to cover one more area where the synchronous language had an established pattern that the async world lacked.
Key Takeaways
- PEP 530 completed comprehension syntax for async code. It allowed `async for` and `await` inside list, set, and dict comprehensions and generator expressions -- eliminating a gap that had existed since Python 3.5.
- Two distinct features, not one. `async for` in a comprehension requires an async iterable. `await` in a comprehension requires only an `async def` context and works with a regular synchronous iterable.
- Any `async for` clause (or `await`) makes a comprehension asynchronous. Asynchronous comprehensions are only valid inside `async def` functions, and clause order follows the usual left-to-right nesting rule: the leftmost clause is the outermost loop, and each clause can only use variables bound to its left.
- There are four distinct error patterns. Async comprehension outside a coroutine, regular `for` over an async generator, `await` on a non-awaitable, and `async for` over a regular iterable -- each has a clear cause and a clear fix.
- Use `AsyncIterable[T]` for parameters, `AsyncGenerator[T, None]` for return types. Both live in `collections.abc`. For classes that need to support `async for`, implement `__aiter__` and `__anext__` -- but prefer an async generator function when possible.
- Async comprehensions are sequential, not concurrent. Each item is awaited before the next begins. For concurrent execution, use `asyncio.gather()` or `asyncio.TaskGroup`.
- The grammar change was one line. Adding the optional `ASYNC` token to `comp_for` was the entire syntax change, which shows how well async comprehensions fit Python's existing design.
- PEP 530 was accepted in three days. The speed of acceptance -- at a CPython sprint at Instagram's offices in September 2016 -- reflects how uncontroversial and natural the addition was.
- Exception safety requires attention. If an exception escapes an async comprehension, the underlying async generator is abandoned. Its cleanup code runs when it is garbage-collected -- not immediately. Production code that owns resources should ensure `aclose()` is called explicitly.
- Know when to write the loop instead. Async comprehensions are most readable with one async clause and a simple filter or transformation. When both the filter and the expression involve `await`, an explicit loop communicates intent more clearly.
- In production, the async iterable is a library type. asyncpg cursors, aiofiles file handles, and aiohttp response streams are the real-world sources async comprehensions were designed for. The idiomatic combination is `async with` owning the resource and an async comprehension consuming it inside.
The arc from PEP 342's generator-based coroutines in 2005 to PEP 530's async comprehensions in 2016 represents over a decade of iteration on how Python handles concurrent programming. Each step made async code look a little more like the synchronous code Python developers already knew. PEP 530 was the step that brought comprehensions along for the ride.