Across Python tutorials covering resource management, one pattern comes up repeatedly: acquiring something before a block of code runs and releasing it afterward. Python's standard solution is the with statement, which delegates setup and teardown to a context manager. Writing a context manager the traditional way requires a class with __enter__ and __exit__ methods. For simple setup/teardown patterns, that is more structure than the problem demands. The @contextmanager decorator from contextlib lets you write the same thing as a generator function with a single yield. Code before the yield is the setup. Code after the yield is the teardown. The yielded value is what the as clause receives. This article covers the full contract of that yield, how exceptions flow through it, how to use the result as both a context manager and a function decorator, and when the class approach is still the better choice.
Think of @contextmanager as a pause button. When Python enters your with block, it runs your generator up to the yield — then pauses. Your code runs. When the block exits (normally or by exception), Python resumes the generator from the yield and runs the rest. The yield is the exact dividing line between setup and teardown.
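This pause-button sequencing can be observed with a minimal sketch. The `pause_demo` name and the `events` list are illustrative, used only to record the order in which the three phases run:

```python
from contextlib import contextmanager

events: list[str] = []

@contextmanager
def pause_demo():
    events.append("setup")      # runs when the with block is entered
    yield "resource"            # generator pauses here while the body runs
    events.append("teardown")   # runs when the with block exits

with pause_demo() as value:
    events.append(f"body sees {value}")

print(events)  # ['setup', 'body sees resource', 'teardown']
```

The recorded order confirms the dividing line: everything before the yield ran first, the block body ran while the generator was paused, and everything after the yield ran last.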
Lifecycle at a glance: setup code executes up to the yield, and __enter__ returns the yielded value; the with block body runs while the generator is paused at the yield; on exit the generator resumes past the yield and __exit__ completes.
How to Create a with Statement Context Manager Using @contextmanager
- Import contextmanager from contextlib. Add from contextlib import contextmanager and from collections.abc import Iterator to your imports. These are both part of the Python standard library — no installation required.
- Write a generator function that yields exactly once. Define a regular def function (or async def for async context managers). All setup code — acquiring resources, opening connections, saving state — goes before the yield. The value you yield becomes the as variable in the with statement. All teardown code goes after the yield.
- Wrap the yield in try/finally. Place the yield inside a try block and put resource cleanup in the finally clause. This guarantees teardown runs whether the with block exits normally or raises an exception.
- Handle exceptions if needed. To suppress exceptions from the with block, catch them in an except clause and do not re-raise. To propagate them, re-raise explicitly with raise or do not catch them at all.
- Test all three paths. Write tests verifying that setup runs, teardown runs on a clean exit, and teardown still runs when the with block raises an exception. Use pytest.raises for the exception path to confirm that cleanup executes regardless of the error.
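The steps above can be sketched end to end in one small context manager. The `managed_list` function is a made-up example whose "resource" is just a list, kept deliberately trivial so the setup/yield/teardown shape stands out:

```python
from collections.abc import Iterator
from contextlib import contextmanager

@contextmanager
def managed_list() -> Iterator[list[int]]:
    data: list[int] = []      # setup: create the resource
    try:
        yield data            # the 'as' variable receives this list
    finally:
        data.clear()          # teardown: always runs, even on error

with managed_list() as items:
    items.append(1)
    held = items              # keep a reference to inspect after exit

print(held)  # [] -- emptied by the teardown
```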
The Class Approach vs the Generator Approach
A traditional class-based context manager requires defining a class with two dunder methods. Here is a context manager that temporarily changes the working directory and restores it on exit:
import os
from types import TracebackType
class ChangeDirectory:
def __init__(self, new_dir: str) -> None:
self.new_dir = new_dir
self.old_dir: str | None = None
def __enter__(self) -> str:
self.old_dir = os.getcwd()
os.chdir(self.new_dir)
return self.new_dir
def __exit__(
self,
exc_type: type[BaseException] | None,
exc_val: BaseException | None,
exc_tb: TracebackType | None,
) -> None:
os.chdir(self.old_dir) # type: ignore[arg-type]
with ChangeDirectory("/tmp") as path:
print(os.getcwd()) # /tmp
print(os.getcwd()) # original directory
The same behavior expressed with @contextmanager:
from collections.abc import Iterator
from contextlib import contextmanager
import os
@contextmanager
def change_directory(new_dir: str) -> Iterator[str]:
old_dir = os.getcwd()
os.chdir(new_dir)
try:
yield new_dir
finally:
os.chdir(old_dir)
with change_directory("/tmp") as path:
print(os.getcwd()) # /tmp
print(os.getcwd()) # original directory
The generator version is significantly shorter. The setup (os.chdir(new_dir)) appears before the yield. The teardown (os.chdir(old_dir)) appears after the yield in the finally block. The yielded value (new_dir) is what the as clause receives. No class, no self, no exc_type/exc_val/exc_tb parameters to manage.
The Yield Contract
The generator function decorated with @contextmanager must yield exactly once. The yield divides the function into two phases:
@contextmanager
def blueprint():
# ---- SETUP PHASE (__enter__) ----
# Acquire resources, configure state, open connections.
# This runs when the 'with' block is entered.
yield value # 'value' is bound to the 'as' variable.
# The with block body executes here.
# ---- TEARDOWN PHASE (__exit__) ----
# Release resources, restore state, close connections.
# This runs when the 'with' block exits.
If the generator yields zero times, @contextmanager raises RuntimeError("generator didn't yield"). If it yields more than once, it raises RuntimeError("generator didn't stop"). This is a strict contract: one yield, always.
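Both violations of the contract can be triggered directly. This is a sketch with two deliberately broken generators (`no_yield` and `two_yields` are illustrative names):

```python
from contextlib import contextmanager

@contextmanager
def no_yield():
    if False:
        yield  # present only to make this a generator; never executes

@contextmanager
def two_yields():
    yield 1
    yield 2  # second yield violates the contract

try:
    with no_yield():
        pass
except RuntimeError as exc:
    print(exc)  # generator didn't yield

try:
    with two_yields():
        pass
except RuntimeError as exc:
    print(exc)  # generator didn't stop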
The yield must appear inside the try block, not before it. If you write the yield outside the try and an exception occurs in the with block, @contextmanager throws that exception into the generator at the yield point — but if the yield is already past, there is no except or finally to catch it, and your teardown code will not run. Placing the yield inside try is what gives finally its guarantee.
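The failure mode is easy to reproduce. In this sketch (`leaky_cm` is a hypothetical name), the yield sits outside any try, so the line after it is silently skipped when the block raises:

```python
from contextlib import contextmanager

cleaned_up = False

@contextmanager
def leaky_cm():
    global cleaned_up
    yield              # BUG: not wrapped in try/finally
    cleaned_up = True  # never reached if the with block raises

try:
    with leaky_cm():
        raise ValueError("boom")
except ValueError:
    pass

print(cleaned_up)  # False: the teardown line was skipped
```

Moving the yield inside a try with the cleanup in finally is the entire fix.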
What you yield determines what as receives. If you yield nothing (yield with no value), the as variable receives None. This is fine when the context manager manages state but does not produce a resource for the caller to use:
import sys
from collections.abc import Iterator
from contextlib import contextmanager
from io import StringIO
@contextmanager
def suppress_stdout() -> Iterator[None]:
"""Suppress stdout output for the duration of the block."""
# Note: contextlib.redirect_stdout(StringIO()) covers this same pattern
# since Python 3.4. This example demonstrates yield with no value.
old_stdout = sys.stdout
sys.stdout = StringIO()
try:
yield # nothing to give the caller
finally:
sys.stdout = old_stdout
with suppress_stdout():
print("This will not appear")
print("This will appear")
Under the hood, @contextmanager creates a _GeneratorContextManager object. Its __enter__ calls next() on the generator to advance it to the yield. Its __exit__ either calls next() again (if no exception) or generator.throw(exception) (if an exception occurred in the with block). The generator.throw() method itself was introduced in PEP 342 (Python 2.5) specifically to enable this pattern. The @contextmanager decorator was formalised in PEP 343 — the original PEP even uses a @contextmanager generator example as the primary motivation for the with statement design. Both PEPs landed in Python 2.5 (2006); PEP 342 was authored by Guido van Rossum and Phillip J. Eby, and PEP 343 by Guido van Rossum and Alyssa (Nick) Coghlan.
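What _GeneratorContextManager does can be reproduced by driving the generator by hand. This sketch relies on the fact that @contextmanager applies functools.wraps, which exposes the raw generator function as __wrapped__ (an implementation detail, shown here only to make the mechanics visible, not for production use):

```python
from contextlib import contextmanager

log: list[str] = []

@contextmanager
def tracked():
    log.append("enter")
    try:
        yield "res"
    finally:
        log.append("exit")

# Clean-exit path: mimic __enter__ and __exit__ manually.
gen = tracked.__wrapped__()   # the raw generator, via functools.wraps
value = next(gen)             # __enter__: run setup, stop at the yield
try:
    next(gen)                 # __exit__ (no exception): resume past the yield
except StopIteration:
    pass                      # generator finished normally

# Exception path: __exit__ uses generator.throw() instead.
gen = tracked.__wrapped__()
next(gen)
try:
    gen.throw(ValueError("boom"))  # raise inside the generator at the yield
except ValueError:
    pass                           # finally ran, then the exception escaped

print(value, log)  # res ['enter', 'exit', 'enter', 'exit']
```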
Exception Handling
When an exception occurs inside the with block, @contextmanager throws it into the generator at the yield point. This means you can catch, log, suppress, or re-raise exceptions using standard try/except syntax. For a broader look at how Python manages errors, see the guide to Python exception handling.
Guaranteed Cleanup With try/finally
Wrapping the yield in try/finally ensures teardown runs regardless of whether an exception occurs:
from collections.abc import Iterator
from contextlib import contextmanager
from typing import Any
@contextmanager
def database_transaction(connection: Any) -> Iterator[Any]:
cursor = connection.cursor()
try:
yield cursor
connection.commit()
except Exception:
connection.rollback()
raise
finally:
cursor.close()
If the with block completes without error, the code after yield calls commit(). If an exception occurs, the except block calls rollback() and re-raises the exception. Either way, the finally block closes the cursor.
Exceptions in Setup Code
The exception handling discussion above covers exceptions that occur inside the with block — after the yield. A separate case worth understanding is an exception that occurs in the setup code, before the yield.
If setup raises, the generator never reaches the yield. Because __enter__ never returns normally, Python never enters the with block and never calls __exit__. The exception propagates directly to the caller, just as it would from any ordinary function call. Code in the finally block does not run because the generator never yielded. This behaviour is consistent and predictable: anything you allocated before the point of failure needs to be cleaned up within the setup code itself, typically with its own try/finally:
from collections.abc import Iterator
from contextlib import contextmanager
@contextmanager
def two_step_setup(resource_a, resource_b) -> Iterator[tuple]:
a = resource_a.acquire() # if this raises, nothing to clean up yet
try:
b = resource_b.acquire() # if this raises, a must be released here
except Exception:
resource_a.release(a) # clean up a before propagating
raise
try:
yield (a, b)
finally:
resource_b.release(b)
resource_a.release(a)
The inner try/except around the second acquisition ensures that a partial setup never silently leaks a resource. The outer try/finally around the yield handles cleanup once both resources are live.
Suppressing Exceptions
If the generator catches an exception and does not re-raise it, the exception is suppressed. This is the generator equivalent of returning True from __exit__:
import logging
from contextlib import contextmanager
logger = logging.getLogger(__name__)
@contextmanager
def log_and_suppress(*exception_types):
"""Log exceptions of the given types and suppress them."""
try:
yield
except exception_types as e:
logger.error("Suppressed %s: %s", type(e).__name__, e)
# Not re-raising => exception is suppressed
with log_and_suppress(ValueError, TypeError):
int("not a number") # ValueError is logged and suppressed
print("Execution continues here")
If you write a bare yield without a try/finally and an exception occurs in the with block, the teardown code after yield will not execute. Always wrap yield in try/finally when cleanup must be guaranteed.
Practical Patterns
Timing a Code Block
import time
from collections.abc import Iterator
from contextlib import contextmanager
@contextmanager
def timer(label: str = "block") -> Iterator[None]:
start = time.perf_counter()
try:
yield
finally:
elapsed = time.perf_counter() - start
print(f"[{label}] {elapsed:.6f}s")
with timer("sorting"):
data = sorted(range(1_000_000, 0, -1))
# [sorting] 0.072345s
Temporarily Overriding an Environment Variable
import os
from collections.abc import Iterator
from contextlib import contextmanager
@contextmanager
def env_var(name: str, value: str) -> Iterator[str]:
"""Temporarily set an environment variable, restoring on exit."""
original = os.environ.get(name)
os.environ[name] = value
try:
yield value
finally:
if original is None:
del os.environ[name]
else:
os.environ[name] = original
with env_var("API_KEY", "test-key-123") as key:
print(os.environ["API_KEY"]) # test-key-123
# API_KEY is restored to its original value (or removed)
Managed Database Connection
import sqlite3
from collections.abc import Iterator
from contextlib import contextmanager
@contextmanager
def db_connection(db_path: str) -> Iterator[sqlite3.Connection]:
conn = sqlite3.connect(db_path)
try:
yield conn
except Exception:
conn.rollback()
raise
else:
conn.commit()
finally:
conn.close()
with db_connection(":memory:") as conn:
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'Alice')")
row = conn.execute("SELECT * FROM users").fetchone()
print(row) # (1, 'Alice')
# Connection is committed and closed automatically
Using @contextmanager as a Function Decorator
Since Python 3.2, context managers created with @contextmanager inherit from ContextDecorator, which means they can be applied directly as function decorators. When used this way, the entire function body runs inside the context manager's with block:
import time
from collections.abc import Iterator
from contextlib import contextmanager
@contextmanager
def timer(label: str = "function") -> Iterator[None]:
start = time.perf_counter()
try:
yield
finally:
print(f"[{label}] {time.perf_counter() - start:.6f}s")
# Used as a context manager
with timer("inline"):
sum(range(1_000_000))
# Used as a function decorator
@timer("decorated")
def compute() -> int:
return sum(range(1_000_000))
compute()
# [inline] 0.012345s
# [decorated] 0.012678s
When used as a decorator, the yielded value is not accessible because there is no as clause. This makes the decorator form ideal for context managers that manage state (like timing or logging) but do not need to hand a resource to the caller.
Async Context Managers
For async with statements, contextlib provides @asynccontextmanager, which works identically to @contextmanager but with an async generator:
from collections.abc import AsyncIterator
from contextlib import asynccontextmanager
from typing import Any
@asynccontextmanager
async def managed_session(url: str) -> AsyncIterator[Any]:
# session type depends on your HTTP library (e.g. aiohttp.ClientSession)
session = await create_session(url)
try:
yield session
finally:
await session.close()
# Usage:
async def fetch_data() -> Any:
async with managed_session("https://api.example.com") as session:
return await session.get("/data")
The async version was added in Python 3.7 and gained decorator support in Python 3.10. The async generator protocol that makes it possible was defined in PEP 525 (Python 3.6). All version facts are verifiable in the Python contextlib documentation.
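The pattern can be exercised without any third-party dependency by swapping in a stand-in session class. `DummySession` and its methods are hypothetical, standing in for a real HTTP client such as aiohttp.ClientSession:

```python
import asyncio
from collections.abc import AsyncIterator
from contextlib import asynccontextmanager

class DummySession:
    """Stand-in for a real HTTP session (hypothetical)."""
    def __init__(self) -> None:
        self.closed = False
    async def get(self, path: str) -> str:
        return f"data from {path}"
    async def close(self) -> None:
        self.closed = True

@asynccontextmanager
async def managed_session() -> AsyncIterator[DummySession]:
    session = DummySession()   # setup runs before the yield
    try:
        yield session
    finally:
        await session.close()  # teardown awaits the async cleanup

async def main() -> tuple[str, bool]:
    async with managed_session() as session:
        body = await session.get("/data")
    return body, session.closed

result = asyncio.run(main())
print(result)  # ('data from /data', True)
```

The True in the output confirms the finally block awaited close() after the async with block exited.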
Composing Context Managers
When a block of code needs more than one context manager active at the same time, Python provides two ways to compose them.
Multiple Managers on One Line
The with statement accepts a comma-separated list of context managers. They are entered left to right and exited right to left, giving the same nesting guarantee as two indented with statements but without the extra indentation:
import os
from collections.abc import Iterator
from contextlib import contextmanager
@contextmanager
def change_directory(path: str) -> Iterator[str]:
old = os.getcwd()
os.chdir(path)
try:
yield path
finally:
os.chdir(old)
@contextmanager
def env_override(key: str, value: str) -> Iterator[None]:
original = os.environ.get(key)
os.environ[key] = value
try:
yield
finally:
if original is None:
del os.environ[key]
else:
os.environ[key] = original
# Both active simultaneously — entered left to right, exited right to left
with change_directory("/tmp"), env_override("DEBUG", "1"):
print(os.getcwd()) # /tmp
print(os.environ["DEBUG"]) # 1
ExitStack for Dynamic Composition
When the number of context managers is not known until runtime — for example, opening a variable list of files — contextlib.ExitStack (added in Python 3.3) lets you register them programmatically. Each registered manager's __exit__ is called in LIFO order when the stack exits, regardless of exceptions:
from contextlib import ExitStack
file_paths = ["a.txt", "b.txt", "c.txt"]
with ExitStack() as stack:
handles = [stack.enter_context(open(p)) for p in file_paths]
for fh in handles:
print(fh.read())
# All three files are closed automatically, even if one read() raises
ExitStack also accepts plain callables via stack.callback(fn), which is useful for registering cleanup functions that are not context managers themselves.
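A short sketch of stack.callback using a temporary file (the `remove_file` helper is illustrative):

```python
import os
import tempfile
from contextlib import ExitStack

def remove_file(path: str) -> None:
    os.unlink(path)  # plain cleanup function, not a context manager

with ExitStack() as stack:
    fd, path = tempfile.mkstemp()
    os.close(fd)
    stack.callback(remove_file, path)  # registered for LIFO cleanup
    assert os.path.exists(path)        # file exists inside the block

print(os.path.exists(path))  # False: the callback ran on exit
```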
Testing a @contextmanager Function
A context manager has three distinct paths to verify: the setup runs, the teardown runs on a clean exit, and the teardown runs when the with block raises. Pytest makes all three straightforward:
import os
import pytest
from collections.abc import Iterator
from contextlib import contextmanager
@contextmanager
def managed_env(key: str, value: str) -> Iterator[str]:
original = os.environ.get(key)
os.environ[key] = value
try:
yield value
finally:
if original is None:
del os.environ[key]
else:
os.environ[key] = original
def test_setup_sets_variable():
with managed_env("MY_VAR", "hello") as val:
assert val == "hello"
assert os.environ["MY_VAR"] == "hello"
def test_teardown_restores_on_clean_exit():
os.environ.pop("MY_VAR", None)
with managed_env("MY_VAR", "hello"):
pass
assert "MY_VAR" not in os.environ
def test_teardown_restores_after_exception():
os.environ.pop("MY_VAR", None)
with pytest.raises(RuntimeError):
with managed_env("MY_VAR", "hello"):
raise RuntimeError("something went wrong")
assert "MY_VAR" not in os.environ
The three test functions map directly to the three paths through any @contextmanager function. If your context manager yields a value, test the yielded object directly as shown in test_setup_sets_variable. If it suppresses exceptions, add a fourth test that verifies the expected exception type does not propagate.
When to Use a Class Instead
The @contextmanager decorator excels at simple setup/teardown pairs. A class-based context manager is the better choice when:
| Scenario | Use @contextmanager | Use a Class |
|---|---|---|
| Simple setup/teardown pair | Yes | Overkill |
| Need to return self from __enter__ | Awkward | Natural |
| Need state accessible after the with block | Requires a separate data object | Store on self |
| Complex __exit__ logic with exc_type checks | Less clear | Explicit parameters |
| Context manager is reusable across multiple with blocks | No — each with use exhausts the generator. As a decorator, a new instance is created per call. | Yes (if designed for it) |
| Want dual use as decorator | Built-in since 3.2 | Inherit from ContextDecorator |
Context managers created with @contextmanager are single-use when entered directly in a with statement. If you try to enter the same instance a second time, the generator is already exhausted and Python raises RuntimeError. When used as a function decorator, however, @contextmanager implicitly creates a fresh generator instance on every call — this is precisely why decorator mode is reusable. Class-based context managers can be designed for reuse by resetting state in __enter__.
Key Takeaways
- @contextmanager converts a generator function into a context manager. Code before yield is the setup (__enter__). Code after yield is the teardown (__exit__). The yielded value becomes the as variable.
- The generator must yield exactly once. Zero yields produces RuntimeError("generator didn't yield"). More than one yield produces RuntimeError("generator didn't stop").
- Always wrap yield in try/finally for guaranteed cleanup. Without try/finally, an exception in the with block prevents the teardown code from executing. The finally block ensures cleanup runs regardless.
- Exception handling uses standard try/except syntax. If the generator catches an exception and does not re-raise it, the exception is suppressed. If it re-raises or does not catch it, the exception propagates normally.
- Since Python 3.2, the result works as both a context manager and a function decorator. When used as a @decorator, the entire function body runs inside the context. The yielded value is not accessible in decorator mode.
- Use a class when the context manager needs persistent state, reuse, or complex exit logic. @contextmanager produces single-use context managers — entering the same instance a second time in a with statement raises RuntimeError because the generator is exhausted. When used as a decorator, however, a new generator instance is created on each function call, so decorator-mode use is effectively reusable. If you need the same object to be reusable across with statements, write a class.
- An exception in the setup code (before the yield) propagates directly — teardown does not run. If setup acquires multiple resources, guard each acquisition with its own try/except to release any partially acquired resources before re-raising.
- Compose multiple context managers on one with line using a comma-separated list, or use ExitStack for a dynamic number. Managers are entered left to right and exited right to left. ExitStack covers cases where the set of managers is not known until runtime.
- Test the three paths through every context manager: setup, clean teardown, and teardown after exception. Use pytest.raises to confirm that teardown runs even when the with block raises, and verify that any suppression behaviour is intentional.
@contextmanager is one of the Python standard library's most useful abstractions. It turns a pattern that requires a class with two dunder methods into a single generator function with a yield, usable directly in any with statement. For the majority of context management tasks — temporary state changes, resource acquisition/release, timing, logging wrappers — the generator approach is shorter, clearer, and sufficient.