Every Python developer has written with open('file.txt') as f: at some point. Many treat it as a convenient shortcut for file handling. But few understand why this one small keyword fundamentally changed how Python manages resources — or that it took over two years of heated debate, five competing proposals, and a blog post about C macros to get it into the language.
This article goes from the basics of with open() all the way to writing your own context managers, covering the PEP history behind the design decisions and the modern syntax improvements that arrived over a decade later. Real code, real explanations, real sources.
The Problem the with Statement Was Built to Solve
Before Python 2.5, every file operation required manual cleanup. If you opened a file, you were responsible for closing it. And if an exception fired between the open and the close, your file handle leaked.
Here's what "correct" file handling looked like in 2004:
f = open("/etc/passwd", "r")
try:
data = f.read()
process(data)
finally:
f.close()
This works. But it's boilerplate-heavy, and developers frequently got it wrong. A common mistake was putting the open() call inside the try block, which meant f.close() could blow up with a NameError if the file didn't exist. Another was simply forgetting the finally block altogether.
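That first mistake is easy to reproduce. A minimal sketch (the path here is made up for illustration): with open() inside the try block, a failed open leaves f unbound, and the finally clause itself crashes with a NameError that masks the original error.

```python
def broken_read(path):
    """The classic pre-2.5 mistake: open() inside try, cleanup in finally."""
    try:
        f = open(path, "r")   # if open() fails, 'f' is never bound...
        return f.read()
    finally:
        f.close()             # ...so this raises NameError, masking the real error

try:
    broken_read("/no/such/file")
except NameError as e:
    print(f"cleanup crashed instead of reporting the real error: {e}")
```

(Inside a function the exact exception is UnboundLocalError, a subclass of NameError — either way, the FileNotFoundError you actually care about gets buried.)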
The same pattern repeated everywhere: acquiring a lock, opening a database connection, setting up a temporary directory. Every resource that needed deterministic cleanup required this same try/finally dance.
As Michael Hudson and Paul Moore put it in PEP 310 (created December 18, 2002), Python's exception handling philosophy generally prevented developers from doing the wrong thing — but that principle did not yet extend to resource cleanup. Writing correct cleanup code required more effort than writing incorrect code. (Source: PEP 310, peps.python.org)
This wasn't just an inconvenience. It was a language-level gap between philosophy and practice. Python made it hard to write bad exception-handling code, but made it easy to leak file handles, forget to release locks, and leave database connections dangling. The with statement was designed to close that gap.
How with open() Actually Works
The with statement introduced in Python 2.5 collapses that entire pattern into two lines:
with open("/etc/passwd", "r") as f:
data = f.read()
process(data)
When Python encounters this code, here is what actually executes under the hood:
# What Python does internally
manager = open("/etc/passwd", "r") # Evaluate the expression
f = manager.__enter__() # Call __enter__, bind result to 'f'
try:
data = f.read()
process(data)
except BaseException as exc: # the real expansion catches BaseException, not just Exception
# If an exception occurred, pass it to __exit__
if not manager.__exit__(type(exc), exc, exc.__traceback__):
raise # Re-raise if __exit__ returns False/None
else:
manager.__exit__(None, None, None) # Clean exit, no exception
There are two critical details here that trip people up.
First, the variable after as does not receive the result of the expression after with. It receives the return value of __enter__(). For file objects, __enter__() returns self, so the distinction doesn't matter. But for other context managers, the value bound by as can be something entirely different from the context manager object.
Second, __exit__() always runs. Whether the block completes normally, raises an exception, or hits a return, break, or continue — the exit method is guaranteed to execute. This is the entire point: deterministic cleanup without relying on the programmer to remember it.
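Both details can be verified with a toy manager (the class and names here are illustrative): __enter__ returns a plain string rather than the manager itself, and __exit__ fires even when the block is left via return.

```python
events = []

class Greeter:
    """A toy manager: 'as' binds __enter__'s return value, not the Greeter."""
    def __enter__(self):
        return "hello"              # this string is what 'as' binds
    def __exit__(self, exc_type, exc_val, exc_tb):
        events.append("exit")       # record that cleanup ran
        return False                # don't suppress anything

def early_return():
    with Greeter() as g:
        return g                    # leaving via return still triggers __exit__

result = early_return()
print(result, events)               # hello ['exit']
```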
The Python documentation for compound statements specifies the exact desugaring of the with statement. The pseudocode above is simplified — the actual implementation also handles the case where __exit__ itself raises an exception, and uses additional internal variables to avoid name conflicts.
A Deeper Look: The open() Context Manager
Let's see this protocol in action with a real example that demonstrates the guarantee:
def risky_file_operation(path):
"""Demonstrates that __exit__ runs even when exceptions occur."""
with open(path, "w") as f:
f.write("First line\n")
raise ValueError("Something went wrong mid-write")
f.write("This line never executes\n")
# f is now closed, even though we raised an exception
try:
risky_file_operation("demo.txt")
except ValueError:
pass
# Verify the file was properly closed
with open("demo.txt", "r") as f:
print(f.read()) # Prints: "First line\n"
The file is closed. The data that was written before the exception is flushed. No resource leak. Compare that to the manual approach where a single missing finally clause would leave the file handle dangling.
Here's another common pattern — reading a file and processing it line by line:
word_counts = {}
with open("novel.txt", "r", encoding="utf-8") as f:
for line in f:
for word in line.strip().split():
word_counts[word.lower()] = word_counts.get(word.lower(), 0) + 1
# File is already closed here. word_counts is ready to use.
top_10 = sorted(word_counts.items(), key=lambda x: x[1], reverse=True)[:10]
for word, count in top_10:
print(f"{word}: {count}")
The with block defines a clear scope: "inside this block, the file is open and available. Outside it, the file is closed." This makes resource lifetimes visible in the code structure itself — a concept sometimes called scoped resource management. The resource's lifetime is tied to a lexical scope, not to garbage collection timing.
The PEP History: Five Proposals and a Blog Post
The with statement didn't arrive easily. Understanding its history explains why it works the way it does, and why certain alternative designs were rejected.
PEP 310 — Reliable Acquisition/Release Pairs (December 2002)
Michael Hudson and Paul Moore proposed the first version. It was simple: an __enter__ method called before the block, an __exit__ method called after. No exception handling in __exit__, no ability to suppress exceptions. PEP 310 was clean and minimal, but ultimately rejected in favor of the more powerful PEP 343. (Source: PEP 310, peps.python.org)
PEP 340 — Anonymous Block Statements (April 2005)
Guido van Rossum wrote a far more ambitious proposal. PEP 340 combined generators as block templates with exception handling and finalization. It was powerful, but it had a fatal flaw: under the hood, it was a looping construct. The break and continue keywords would operate on the block statement, even when the block was being used as a non-looping resource manager. This made the control flow confusing and unpredictable.
The decisive moment came when Guido read a blog post by Raymond Chen of Microsoft, published January 6, 2005 on The Old New Thing, titled "A rant against flow control macros." Chen argued that concealing flow control inside macros makes code impossible to audit. Guido recognized the same reasoning applied to PEP 340. As he wrote in PEP 343, he found Chen's case persuasive: that hiding flow control makes code hard to follow, and that PEP 340 templates could conceal all kinds of control flow. (Source: PEP 343, peps.python.org; Raymond Chen, devblogs.microsoft.com, January 6, 2005)
PEP 340 was rejected in favor of PEP 343.
PEP 346 — User Defined ("with") Statements (May 2005)
Alyssa (Nick) Coghlan proposed a middle ground, blending the best parts of PEP 310 and PEP 340. But by this point, Guido was already writing PEP 343, and PEP 346 was voluntarily withdrawn. (Source: PEP 346, peps.python.org)
PEP 343 — The "with" Statement (May 2005)
This is the one that shipped. Authored by Guido van Rossum and later updated by Alyssa Coghlan, PEP 343 took PEP 310's simplicity and added exception-aware cleanup. The key innovation was that __exit__() receives exception information, allowing context managers to handle (or suppress) exceptions if appropriate.
The PEP went through extensive discussion on python-dev. One notable debate was about what exception to raise when a generator-based context manager misbehaves. Guido explained his choice of RuntimeError: he preferred using an existing exception that would appear in a traceback for the developer to fix, rather than creating a new exception class that someone might try to catch. (Source: PEP 343, "Resolved Issues" section, peps.python.org)
PEP 343 was implemented by Mike Bland, Guido van Rossum, and Neal Norwitz. It shipped in Python 2.5 (September 2006), initially behind a from __future__ import with_statement directive, becoming a standard keyword in Python 2.6. (Source: "What's New in Python 2.5," docs.python.org)
PEP 319 — Python Synchronize/Asynchronize Block
Authored by Michel Pelletier and also rejected in favor of PEP 343. Its use cases for synchronization and asynchronization could be handled by providing appropriate context managers — for example, using a locking context manager for synchronized blocks. (Source: PEP 319, peps.python.org)
Reading PEP 343's motivation section alongside PEP 310 and PEP 340 is one of the best ways to understand why Python's design decisions end up the way they do. The rejected proposals reveal how language designers weigh simplicity against power, and why "good enough" sometimes beats "everything."
Writing Your Own Context Managers
Understanding the protocol means you can build your own. There are two approaches: class-based and generator-based.
Class-Based Context Managers
A context manager is any object with __enter__() and __exit__() methods:
import time
class Timer:
"""A context manager that measures execution time of a code block."""
def __enter__(self):
self.start = time.perf_counter()
return self # This is what gets bound by 'as'
def __exit__(self, exc_type, exc_val, exc_tb):
self.elapsed = time.perf_counter() - self.start
print(f"Block executed in {self.elapsed:.4f} seconds")
return False # Don't suppress exceptions
# Usage
with Timer() as t:
total = sum(range(10_000_000))
print(f"Result: {total}, took {t.elapsed:.4f}s")
The __exit__ method signature is important. It receives three arguments describing any exception that occurred: the exception type, the exception value, and the traceback. If no exception occurred, all three are None. Returning a truthy value from __exit__ suppresses the exception; returning False or None lets it propagate.
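A throwaway manager (sketch; the class name is made up) that just reports what __exit__ receives makes the signature concrete:

```python
class ExitInspector:
    """Reports exactly what __exit__ receives, then lets exceptions propagate."""
    def __enter__(self):
        return self
    def __exit__(self, exc_type, exc_val, exc_tb):
        print(f"exc_type={exc_type}, exc_val={exc_val!r}")
        return False                # falsy: never suppress

with ExitInspector():
    pass                            # clean exit: exc_type=None, exc_val=None

try:
    with ExitInspector():
        raise KeyError("boom")      # error exit: exc_type=<class 'KeyError'>, ...
except KeyError:
    print("KeyError propagated, as expected")
```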
Here's a more practical example — a database transaction manager:
import sqlite3
class Transaction:
"""Context manager for SQLite transactions with automatic rollback."""
def __init__(self, db_path):
self.db_path = db_path
self.conn = None
def __enter__(self):
self.conn = sqlite3.connect(self.db_path)
self.conn.execute("BEGIN")
return self.conn
def __exit__(self, exc_type, exc_val, exc_tb):
if exc_type is None:
self.conn.commit()
else:
self.conn.rollback()
self.conn.close()
return False # Always propagate exceptions
# Usage: either everything commits, or nothing does
with Transaction("app.db") as conn:
conn.execute("INSERT INTO users (name) VALUES (?)", ("Alice",))
conn.execute("INSERT INTO users (name) VALUES (?)", ("Bob",))
# If any INSERT fails, both are rolled back
Generator-Based Context Managers with contextlib
Python's contextlib module provides a decorator that turns a generator function into a context manager. Everything before the yield is the setup (__enter__), and everything after is the teardown (__exit__):
from contextlib import contextmanager
import os
@contextmanager
def working_directory(path):
"""Temporarily change the working directory."""
original = os.getcwd()
try:
os.chdir(path)
yield path
finally:
os.chdir(original)
# Usage
print(f"Before: {os.getcwd()}")
with working_directory("/tmp") as p:
print(f"Inside: {os.getcwd()}")
# Do work in /tmp...
print(f"After: {os.getcwd()}") # Back to original
The try/finally around the yield is critical. Without it, if the body of the with block raises an exception, the cleanup code after yield won't run. This is one of the more common bugs in hand-written generator-based context managers.
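Here's the buggy variant, deliberately wrong, to show exactly what the try/finally protects against:

```python
from contextlib import contextmanager

cleanups = []

@contextmanager
def leaky():
    cleanups.append("setup")
    yield                            # no try/finally around the yield...
    cleanups.append("teardown")      # ...so this never runs if the body raises

try:
    with leaky():
        raise RuntimeError("body failed")
except RuntimeError:
    pass

print(cleanups)                      # ['setup'] -- the teardown was skipped
```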
Here's a practical example that suppresses specific exceptions — something you'd normally need a full class to implement:
from contextlib import contextmanager
@contextmanager
def suppress_and_log(*exceptions):
"""Suppress specific exceptions and log them instead of crashing."""
try:
yield
except exceptions as e:
print(f"[SUPPRESSED] {type(e).__name__}: {e}")
# Usage
with suppress_and_log(FileNotFoundError, PermissionError):
data = open("nonexistent_file.txt").read()
print("Execution continues normally")
# Output:
# [SUPPRESSED] FileNotFoundError: [Errno 2] No such file or directory: 'nonexistent_file.txt'
# Execution continues normally
For reference, the standard library already ships contextlib.suppress for the basic case, but this extended version shows how generator-based context managers can implement arbitrary exception handling logic.
Using ContextDecorator for Dual-Use Managers
Python 3.2 introduced contextlib.ContextDecorator, which lets a context manager double as a function decorator. This is a pattern that many developers overlook but is tremendously useful for cross-cutting concerns like logging and timing:
from contextlib import ContextDecorator
import time
class track_performance(ContextDecorator):
"""Use as a context manager OR a decorator to track execution time."""
def __init__(self, label="block"):
self.label = label
def __enter__(self):
self.start = time.perf_counter()
return self
def __exit__(self, *exc):
elapsed = time.perf_counter() - self.start
print(f"[PERF] {self.label}: {elapsed:.4f}s")
return False
# As a context manager:
with track_performance("data load"):
data = list(range(1_000_000))
# As a decorator:
@track_performance("sort operation")
def sort_data(data):
return sorted(data)
sort_data(data)
The ContextDecorator base class handles the wrapping logic automatically. One caveat: by default it re-enters the same instance for each decorated call, so the manager must support repeated use — as track_performance above does, since each __enter__ resets self.start. Generator-based managers created with @contextmanager instead recreate their underlying generator on every call, which is why their single-use nature doesn't bite in decorator form. (Source: contextlib documentation, docs.python.org)
When to Build a Context Manager (Decision Framework)
Knowing how to write context managers is only half the battle. Knowing when to reach for one is where engineering judgment comes in. Here is a decision framework:
Build a context manager when: you have a resource with paired setup/teardown operations, and forgetting the teardown would cause a bug, a leak, or data corruption. The classic examples are file handles, locks, database transactions, and network connections. But the principle extends further: any time you find yourself writing try/finally to ensure cleanup, that's a context manager waiting to happen.
Consider a context manager when: you want to temporarily change state and guarantee restoration. Changing the working directory, modifying environment variables, setting decimal precision, redirecting stdout, or patching objects in tests — these are all "temporary state change" patterns that fit the protocol perfectly.
Don't force a context manager when: there's no meaningful teardown. If you're just constructing an object that doesn't need cleanup, a plain function or constructor is simpler and more readable. Not every resource needs a with block.
A useful heuristic: if you can describe the operation as "do X, let the caller work, then undo X no matter what," it's a context manager. If you can describe it as "do X," it's probably just a function.
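The "temporarily change state" heuristic fits, for example, an environment-variable patch. A minimal sketch (the variable name APP_MODE is made up):

```python
import os
from contextlib import contextmanager

@contextmanager
def temp_env(name, value):
    """Temporarily set an environment variable, restoring the old state."""
    missing = object()
    old = os.environ.get(name, missing)
    os.environ[name] = value
    try:
        yield
    finally:
        if old is missing:
            del os.environ[name]     # it wasn't set before: unset it again
        else:
            os.environ[name] = old   # it was set before: restore the old value

with temp_env("APP_MODE", "test"):   # set only inside the block
    print(os.environ["APP_MODE"])    # prints: test
# Outside, APP_MODE is back to whatever it was (or unset again)
```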
Beyond Files: Where Context Managers Shine
The with statement isn't just for files. The standard library is full of context managers, and the pattern works for any resource that needs setup and teardown:
import threading
import tempfile
import decimal
# Thread locks
lock = threading.Lock()
with lock:
# Critical section - lock is held
shared_resource.update(new_data)
# Lock is released
# Temporary files that auto-delete
with tempfile.NamedTemporaryFile(mode="w", suffix=".csv", delete=True) as tmp:
tmp.write("header1,header2\n")
tmp.write("value1,value2\n")
tmp.flush()
# Process tmp.name...
# File is deleted
# Decimal precision contexts
with decimal.localcontext() as ctx:
ctx.prec = 50
result = decimal.Decimal(1) / decimal.Decimal(7)
print(result) # 50 digits of precision
# Original precision is restored
What ties all of these together is a single idea: the with statement makes resource lifetimes visible in the structure of the code. You don't have to trace through function calls or read documentation to understand when a lock is released or a file is deleted. The indentation tells you.
Nesting Multiple Context Managers
A common need is managing multiple resources simultaneously. Python has evolved its syntax for this over time.
The old way (pre-3.1) — nested statements:
with open("input.txt", "r") as src:
with open("output.txt", "w") as dst:
for line in src:
dst.write(line.upper())
Python 3.1+ — comma-separated:
with open("input.txt", "r") as src, open("output.txt", "w") as dst:
for line in src:
dst.write(line.upper())
Python 3.10+ — parenthesized context managers:
This is a significant quality-of-life improvement that became possible thanks to PEP 617's new PEG parser for CPython. The old LL(1) parser couldn't handle parenthesized with statements because the opening parenthesis was ambiguous — it could be the start of a grouping or the start of an expression being passed to with. As PEP 617 documented, the issue (tracked as bpo-12782) was not solvable under LL(1) parsing constraints. (Source: PEP 617, peps.python.org)
The new PEG parser, contributed by Guido van Rossum, Pablo Galindo, and Lysandros Nikolaou, removed this limitation. Starting in Python 3.10:
with (
open("input.txt", "r") as src,
open("output.txt", "w") as dst,
open("log.txt", "a") as log,
):
for line in src:
processed = line.strip().upper()
dst.write(processed + "\n")
log.write(f"Processed: {processed[:50]}...\n")
This is especially valuable in testing, where mocking multiple dependencies was previously awkward:
from unittest import mock
# Before Python 3.10: backslash continuation or deep nesting
with mock.patch("myapp.db.connect") as mock_db, \
mock.patch("myapp.cache.get") as mock_cache, \
mock.patch("myapp.email.send") as mock_email:
run_tests()
# Python 3.10+: clean parenthesized grouping
with (
mock.patch("myapp.db.connect") as mock_db,
mock.patch("myapp.cache.get") as mock_cache,
mock.patch("myapp.email.send") as mock_email,
):
run_tests()
The trailing comma is also supported in parenthesized with statements, matching the convention used in imports, function arguments, and collection literals.
async with: Context Managers for Asynchronous Code
PEP 492 (accepted May 5, 2015, authored by Yury Selivanov) extended the context manager protocol to asynchronous code. Where synchronous context managers implement __enter__ and __exit__, asynchronous context managers implement __aenter__ and __aexit__ — both of which are coroutines that can be awaited. (Source: PEP 492, peps.python.org)
import asyncio
from contextlib import asynccontextmanager
@asynccontextmanager
async def async_db_connection(dsn):
"""Async context manager for database connections."""
print(f"Connecting to {dsn}...")
await asyncio.sleep(0.1) # Simulating async connect
try:
yield {"connection": dsn, "status": "open"}
finally:
print(f"Closing connection to {dsn}...")
await asyncio.sleep(0.05) # Simulating async cleanup
async def main():
async with async_db_connection("postgresql://localhost/mydb") as conn:
print(f"Using connection: {conn}")
# Do async database work...
asyncio.run(main())
The asynccontextmanager decorator was added to contextlib in Python 3.7. Since Python 3.10, async context managers created with this decorator can also be used as decorators themselves. (Source: contextlib documentation, docs.python.org)
PEP 806 (Draft, authored by Zac Hatfield-Dodds, created September 5, 2025) proposes allowing mixed sync and async context managers in a single with statement. Individual entries could be marked with async while others remain synchronous, eliminating the deep nesting that currently results from combining synchronous locks or files with asynchronous database connections. (Source: PEP 806, peps.python.org)
contextlib Power Tools
The contextlib module has grown well beyond @contextmanager. Here are tools worth knowing:
from contextlib import ExitStack, suppress, redirect_stdout, closing, nullcontext
import io
# ExitStack: manage a dynamic number of context managers
def process_files(file_paths):
with ExitStack() as stack:
files = [stack.enter_context(open(fp)) for fp in file_paths]
# All files are open; all will be closed when the block exits
for f in files:
print(f.readline())
# suppress: cleanly ignore expected exceptions
import os
with suppress(FileNotFoundError):
os.remove("might_not_exist.tmp")
# No try/except needed
# redirect_stdout: capture printed output
f = io.StringIO()
with redirect_stdout(f):
print("This goes to the StringIO buffer, not the console")
captured = f.getvalue()
# closing: add context manager protocol to objects that have .close()
from urllib.request import urlopen
with closing(urlopen("https://www.python.org")) as page:
html = page.read()
# nullcontext: a no-op context manager for conditional resource use
def process_data(file_or_path):
if isinstance(file_or_path, str):
cm = open(file_or_path)
else:
cm = nullcontext(file_or_path) # Already a file object
with cm as f:
return f.read()
ExitStack is particularly powerful for cases where you don't know at write-time how many resources you'll need to manage. It handles cleanup in LIFO (last in, first out) order, just as nested with statements would.
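The LIFO guarantee is easy to observe with ExitStack.callback, which registers an arbitrary cleanup function on the stack:

```python
from contextlib import ExitStack

order = []

with ExitStack() as stack:
    for name in ("outer", "middle", "inner"):
        stack.callback(order.append, name)   # register cleanup callbacks

print(order)   # ['inner', 'middle', 'outer'] -- LIFO, like nested with blocks
```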
nullcontext (added in Python 3.7) is the context manager equivalent of a no-op: it enters and exits without doing anything. This sounds trivial, but it's extremely useful for writing functions that optionally manage a resource. Instead of branching your logic with if/else around two different with blocks, you can select the context manager conditionally and use a single with statement. It also works as a stand-in for async context managers. (Source: contextlib documentation, docs.python.org)
Edge Cases and Defensive Patterns
Context managers handle the common cases elegantly, but the edges reveal important details that can save you from subtle bugs.
What Happens When __enter__ Fails?
If __enter__ raises an exception, __exit__ is not called. This is important: the context manager protocol only guarantees cleanup if __enter__ succeeds. For open(), this means if the file doesn't exist, no file handle is created, and no cleanup is needed. But for custom context managers that acquire multiple resources in __enter__, a failure partway through can leak the ones that were already acquired.
from contextlib import ExitStack
class MultiResourceManager:
"""Safely acquire multiple resources, cleaning up on partial failure."""
def __init__(self, paths):
self.paths = paths
def __enter__(self):
self._stack = ExitStack()
try:
self._stack.__enter__()
self.files = [
self._stack.enter_context(open(p)) for p in self.paths
]
return self.files
except Exception:
self._stack.__exit__(None, None, None)
raise
def __exit__(self, *exc):
return self._stack.__exit__(*exc)
# If "missing.txt" doesn't exist, any files that WERE opened
# get cleaned up properly before the exception propagates
try:
with MultiResourceManager(["exists.txt", "missing.txt"]) as files:
pass
except FileNotFoundError:
print("Partial failure handled cleanly")
This pattern — using ExitStack internally to manage partial acquisition — is recommended in the Python documentation for exactly this scenario.
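The contextlib documentation also shows an equivalent idiom built on ExitStack.pop_all(), which transfers the registered cleanups to a fresh stack only once every acquisition has succeeded. A sketch of the same hypothetical manager rewritten that way:

```python
from contextlib import ExitStack

class MultiResourcePopAll:
    """Same idea as before, using the pop_all() idiom from the contextlib docs."""
    def __init__(self, paths):
        self.paths = paths

    def __enter__(self):
        with ExitStack() as stack:
            self.files = [stack.enter_context(open(p)) for p in self.paths]
            # Everything opened: take ownership of the cleanup callbacks.
            # If any open() had failed, the enclosing 'with' would have
            # closed the already-opened files on the way out.
            self._stack = stack.pop_all()
        return self.files

    def __exit__(self, *exc):
        return self._stack.__exit__(*exc)
```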
Reusable vs. Single-Use Context Managers
A subtle distinction: context managers created with @contextmanager are single-use by default. Once the generator is exhausted, you can't reuse the same instance. Class-based context managers can be designed for reuse if __enter__ and __exit__ properly reset state:
from contextlib import contextmanager
@contextmanager
def single_use():
print("enter")
yield
print("exit")
cm = single_use()
with cm:
pass # Works fine
# with cm: # RuntimeError: generator didn't yield
# pass
# Class-based: can be designed for reuse
class ReusableTimer:
def __enter__(self):
import time
self.start = time.perf_counter()
return self
def __exit__(self, *exc):
import time
self.elapsed = time.perf_counter() - self.start
return False
timer = ReusableTimer()
with timer:
pass # First use
with timer:
pass # Second use - works fine
When using @contextmanager as a decorator (available since Python 3.2), this single-use limitation is handled automatically — each function invocation creates a fresh generator instance.
Common Mistakes and Misconceptions
Mistake 1: Using the file outside the with block.
with open("data.txt") as f:
pass
# f still exists as a variable, but the file is closed
f.read() # Raises ValueError: I/O operation on closed file
The variable f remains in scope after the with block, but the underlying file handle is closed. This is a source of subtle bugs, particularly when the with block is long and the later code sits far below the end of the block.
Mistake 2: Assuming as is required.
# The 'as' clause is optional — this is perfectly valid
with open("log.txt", "a"):
pass # Maybe you just want to ensure the file exists
# Common real-world example without 'as'
from contextlib import suppress
import os
with suppress(FileNotFoundError):
os.remove("temp.txt")
Mistake 3: Thinking with replaces all exception handling.
The with statement guarantees cleanup, not error handling. If you need to catch and handle specific exceptions from the block body, you still need try/except — either inside or outside the with:
import json
with open("config.json") as f:
try:
config = json.load(f)
except json.JSONDecodeError as e:
print(f"Invalid JSON: {e}")
config = {}
Mistake 4: Suppressing exceptions unintentionally.
If __exit__ returns True (or any truthy value), the exception is swallowed. This is almost never what you want unless you're building something like contextlib.suppress. Always return False or None from __exit__ unless you have a specific reason to suppress exceptions — and document that reason clearly.
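A short demonstration of the footgun (the class name is contrived):

```python
class Swallow:
    """DANGEROUS: a truthy return from __exit__ silences every exception."""
    def __enter__(self):
        return self
    def __exit__(self, exc_type, exc_val, exc_tb):
        return True                  # truthy: the exception simply vanishes

with Swallow():
    raise ValueError("you will never see this")

print("still running -- the ValueError was silently discarded")
```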
Mistake 5: Confusing with and garbage collection.
In CPython, objects are garbage-collected via reference counting, and file objects close themselves in __del__. This can create the illusion that with isn't necessary. But relying on garbage collection for cleanup is fragile: circular references can delay collection, other Python implementations (PyPy, Jython) use different GC strategies, and __del__ timing is never guaranteed. The with statement gives you deterministic cleanup, regardless of the runtime.
Quick Reference: PEPs That Shaped the with Statement
| PEP | Title | Author(s) | Status | Contribution |
|---|---|---|---|---|
| 310 | Reliable Acquisition/Release Pairs | Michael Hudson, Paul Moore | Rejected | Original with proposal (2002) |
| 319 | Python Synchronize/Asynchronize Block | Michel Pelletier | Rejected | Sync/async blocks; subsumed by PEP 343 |
| 340 | Anonymous Block Statements | Guido van Rossum | Rejected | Ambitious but too complex; looping semantics |
| 343 | The "with" Statement | Guido van Rossum, Alyssa Coghlan | Final | The accepted design (Python 2.5, 2006) |
| 346 | User Defined ("with") Statements | Alyssa Coghlan | Withdrawn | Middle ground between 310 and 340 |
| 492 | Coroutines with async and await | Yury Selivanov | Final | Added async with (Python 3.5, 2015) |
| 617 | New PEG Parser for CPython | Guido van Rossum, Pablo Galindo, Lysandros Nikolaou | Final | Enabled parenthesized context managers (Python 3.10) |
| 806 | Mixed sync/async context managers with precise async marking | Zac Hatfield-Dodds | Draft | Proposed per-item async marking in with |
Final Thoughts
The with statement is one of Python's best-designed features. It took years of proposals, rejections, and refinement to get right, and the result is a construct that makes correct resource management the path of least resistance rather than something you have to fight for.
What makes the with statement powerful isn't just what it does — it's how it changes the way you think about code. Without it, resource management is an afterthought, something you bolt on with try/finally after the fact. With it, resource lifetimes become part of your code's structure. The indentation itself communicates intent: "this block owns this resource."
That shift in thinking — from manual cleanup to structural ownership — is the real lesson. It echoes a principle that shows up in language design over and over: make the correct thing easy to write, and the incorrect thing hard to write without noticing.
If you take one thing from this article, let it be this: with open() isn't just a shortcut for opening files. It's a protocol. Any object that implements __enter__ and __exit__ can participate in it, and understanding that protocol gives you the tools to write cleaner, safer code for any resource that needs deterministic cleanup.
Start building your own context managers. Once you internalize the pattern, you'll find uses for it everywhere — timing blocks, database transactions, temporary state changes, test fixtures, connection pools, conditional resource acquisition. The with statement is the Pythonic way to say: "set this up, let me work, and clean it up no matter what happens."