Examples of Good Python Code to Learn From

Reading good code is one of the fastest ways to become a better programmer. Tutorials teach you syntax. Projects teach you problem-solving. But studying well-written code from experienced developers teaches you craft—the difference between code that merely works and code that is clean, maintainable, and a genuine pleasure to read. This article walks through concrete examples of high-quality Python code, drawn from the standard library, popular open-source projects, and real-world patterns that professionals rely on daily.

There is a well-known principle in Python culture that comes straight from PEP 20, the Zen of Python: "Readability counts." That is not just a platitude. It shapes how the language was designed, how the standard library was written, and how the broader community evaluates code quality. Good Python code expresses its intent clearly, uses the language's built-in features instead of fighting against them, and treats the next person who reads it—including your future self—as someone who deserves clarity.

The examples below are not hypothetical. They come from real projects, real patterns, and real decisions that practicing developers make every day. Study them, adapt them, and let them reshape the way you think about writing Python.

What Makes Python Code "Good"

Before looking at specific examples, it helps to establish what separates good Python code from code that simply runs without errors. The Python community has converged on several principles over the years, formalized in PEP 8 (the style guide) and PEP 20 (the Zen of Python), and reinforced through decades of open-source collaboration.

Good Python code is readable, meaning a new developer can understand what a function does within seconds. It is explicit, meaning it does not hide behavior behind clever tricks or obscure one-liners. It is modular, meaning each function or class handles one responsibility well. And it is idiomatic, meaning it uses Python's built-in tools—list comprehensions, context managers, generators, unpacking—rather than porting patterns from other languages.

Note

PEP 8 is the official Python style guide, covering naming conventions, indentation, line length, and formatting. PEP 20 contains the Zen of Python, a collection of 19 guiding principles for writing Python. You can view the Zen at any time by running import this in a Python interpreter.

Here is a quick example of the difference between code that works and code that communicates clearly:

# Unclear: what does this do?
d = {k: v for k, v in sorted(x.items(), key=lambda i: i[1], reverse=True)[:5]}

# Clear: intent is obvious at a glance
def top_items(scores, limit=5):
    """Return the highest-scoring items as a dictionary."""
    sorted_pairs = sorted(scores.items(), key=lambda pair: pair[1], reverse=True)
    return dict(sorted_pairs[:limit])

Both versions produce identical output. But the second version names the operation, documents its purpose, and breaks the logic into readable steps. A teammate can understand it instantly. That is the standard to aim for.
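To see the clear version in action, here is a quick usage check. The function is repeated so the snippet runs on its own, and the sample scores are made up for illustration:

```python
def top_items(scores, limit=5):
    """Return the highest-scoring items as a dictionary."""
    sorted_pairs = sorted(scores.items(), key=lambda pair: pair[1], reverse=True)
    return dict(sorted_pairs[:limit])

scores = {"ads": 12, "search": 47, "email": 31, "social": 5}
print(top_items(scores, limit=2))  # {'search': 47, 'email': 31}
```

Because dictionaries preserve insertion order, the result reads highest-first, which is exactly what the function name promises.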

The Standard Library: Python's Built-In Masterclass

The Python standard library is one of the best places to study well-written Python. It is maintained by core developers, reviewed extensively, and written to serve as a reference for how the language should be used. Several modules stand out as particularly instructive.

pathlib: Object-Oriented File Paths

The pathlib module, introduced in Python 3.4, offers an object-oriented alternative to the older os.path functions. It is an excellent example of how good API design can make code more readable and less error-prone.

from pathlib import Path

# Instead of messy string concatenation with os.path.join
# pathlib gives you clean, chainable operations

project_root = Path("/home/user/project")
config_file = project_root / "config" / "settings.json"

# Check existence, read content, and handle paths naturally
if config_file.exists():
    content = config_file.read_text(encoding="utf-8")
    print(f"Loaded config from {config_file.name}")

# Iterate over all Python files in a directory tree
python_files = list(project_root.rglob("*.py"))
print(f"Found {len(python_files)} Python files")

Notice how pathlib uses the division operator / to join paths. This is not a gimmick. It is a deliberate design choice that makes path construction read like a natural directory structure. The module also consolidates operations that used to require importing from multiple places—os.path, os, glob, and shutil—into a single coherent interface.
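A side-by-side sketch makes that consolidation concrete. The paths here are illustrative, and the comparison assumes a POSIX-style filesystem:

```python
import os.path
from pathlib import Path

# Legacy style: one task spread across several os.path helpers.
legacy = os.path.join("/home/user/project", "config", "settings.json")
name = os.path.basename(legacy)      # "settings.json"
stem, ext = os.path.splitext(name)   # ("settings", ".json")

# pathlib style: the same information as attributes on one object.
modern = Path("/home/user/project") / "config" / "settings.json"
print(modern.name, modern.stem, modern.suffix)  # settings.json settings .json
```

Each os.path call takes and returns plain strings, so the reader must track what each string holds; the Path object carries that context with it.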

collections: Purpose-Built Data Structures

The collections module is a masterclass in providing the right tool for the job. Rather than forcing developers to hack together behavior using basic dictionaries and lists, it offers specialized containers that handle common patterns cleanly.

from collections import Counter, defaultdict, namedtuple

# Counter: count occurrences without manual loops
words = ["apple", "banana", "apple", "cherry", "banana", "apple"]
word_counts = Counter(words)
print(word_counts.most_common(2))  # [('apple', 3), ('banana', 2)]

# defaultdict: eliminate key-existence checks
groups = defaultdict(list)
students = [("math", "Alice"), ("science", "Bob"), ("math", "Charlie")]
for subject, name in students:
    groups[subject].append(name)
# No need for: if subject not in groups: groups[subject] = []

# namedtuple: self-documenting data containers
Point = namedtuple("Point", ["x", "y"])
origin = Point(x=0, y=0)
print(f"Origin is at ({origin.x}, {origin.y})")

Pro Tip

In modern Python (3.7+), the dataclasses module often replaces namedtuple for more complex data containers. Dataclasses provide mutable fields, default values, and a cleaner syntax. Use namedtuple when you need an immutable, lightweight record; use a dataclass when you need flexibility.
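As a rough illustration of that trade-off, here is the same Point written as a frozen dataclass. This is a sketch, not a drop-in replacement for every namedtuple use:

```python
from dataclasses import dataclass

# Frozen keeps the record immutable, like a namedtuple, but leaves room
# to grow: defaults, methods, and type hints.
@dataclass(frozen=True)
class Point:
    x: float = 0.0
    y: float = 0.0

    def distance_from_origin(self) -> float:
        return (self.x ** 2 + self.y ** 2) ** 0.5

origin = Point()
print(origin == Point(0.0, 0.0))               # True: __eq__ is generated
print(Point(3.0, 4.0).distance_from_origin())  # 5.0
```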

contextlib: Clean Resource Management

The contextlib module demonstrates how Python handles resource management elegantly. The contextmanager decorator lets you write context managers as simple generator functions instead of full classes with __enter__ and __exit__ methods.

from contextlib import contextmanager
import time

@contextmanager
def timer(label):
    """Measure and print execution time for a block of code."""
    start = time.perf_counter()
    try:
        yield
    finally:
        elapsed = time.perf_counter() - start
        print(f"{label}: {elapsed:.4f} seconds")

# Usage is clean and readable
with timer("Data processing"):
    results = [x ** 2 for x in range(1_000_000)]

This pattern shows up everywhere in professional Python code. Database connections, file locks, temporary directories, and API sessions all benefit from context managers. The contextmanager decorator keeps the implementation short while ensuring cleanup always runs, even when exceptions occur.
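As one more sketch of the pattern, here is a hypothetical temporary-working-directory manager built from the same decorator. The name temporary_workdir is made up for illustration:

```python
import os
import tempfile
from contextlib import contextmanager

@contextmanager
def temporary_workdir():
    """Run a block inside a throwaway directory, restoring the cwd afterwards."""
    previous = os.getcwd()
    with tempfile.TemporaryDirectory() as tmp:
        os.chdir(tmp)
        try:
            yield tmp
        finally:
            # Restore first so TemporaryDirectory can delete the directory we left.
            os.chdir(previous)

before = os.getcwd()
with temporary_workdir() as tmp:
    with open("scratch.txt", "w") as f:
        f.write("temp data")
print(os.getcwd() == before)  # True: original directory restored
```

The caller never has to remember the restore step or the deletion step; the context manager guarantees both.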

Open-Source Projects Worth Studying

Beyond the standard library, several open-source projects are widely recognized for their code quality. These are not just useful tools—they are educational resources for anyone who wants to write better Python.

Requests: HTTP for Humans

The requests library, created by Kenneth Reitz, became famous partly because of how clean its API is. It prioritized developer experience above all else, and its source code reflects that philosophy. Consider how it handles a common task—sending an authenticated request with custom headers:

import requests

response = requests.get(
    "https://api.example.com/data",
    headers={"Accept": "application/json"},
    auth=("username", "api_key"),
    timeout=30,
)
response.raise_for_status()
data = response.json()

Every parameter name reads like plain English. There is no ambiguity about what auth, headers, or timeout do. The raise_for_status() method replaces verbose status-code checking with a single clear call. This design philosophy—making the common case easy and the complex case possible—is worth internalizing.
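To illustrate the design idea without a network call, here is a toy stand-in, not the real requests internals, showing what raise_for_status saves callers from repeating:

```python
class HTTPError(Exception):
    pass

class Response:
    """Stand-in for illustration only; not the real requests.Response."""
    def __init__(self, status_code):
        self.status_code = status_code

    def raise_for_status(self):
        # One call replaces an if/elif status check at every call site.
        if 400 <= self.status_code < 600:
            raise HTTPError(f"HTTP {self.status_code}")

Response(200).raise_for_status()      # success: returns silently
try:
    Response(404).raise_for_status()
except HTTPError as exc:
    print(exc)                        # HTTP 404
```

Moving the check into the object means every caller gets correct error handling by writing one short, readable line.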

FastAPI: Type Hints as Documentation

FastAPI, created by Sebastián Ramírez, demonstrates how Python's type hint system can do real work beyond static analysis. Type annotations drive automatic validation, serialization, and interactive API documentation.

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class UserCreate(BaseModel):
    username: str
    email: str
    age: int | None = None

@app.post("/users/")
async def create_user(user: UserCreate):
    # FastAPI automatically:
    # - Validates the request body against the model
    # - Returns 422 errors for invalid data
    # - Generates OpenAPI documentation
    return {"username": user.username, "status": "created"}

The code is remarkably concise, yet it handles input validation, error responses, and documentation generation without a single line of boilerplate. Studying FastAPI's source teaches you how decorators, type hints, and dependency injection can work together to eliminate repetitive code.
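As a standard-library-only sketch of the underlying idea, annotations doing runtime work, here is a toy validator. The names are hypothetical, and FastAPI with pydantic does far more than this:

```python
from typing import get_type_hints

class UserCreate:
    """Toy model: fields declared as class-level annotations."""
    username: str
    email: str

def validate(payload: dict, model: type) -> dict:
    """Check that payload has each annotated field with the right type."""
    for name, expected in get_type_hints(model).items():
        if name not in payload:
            raise ValueError(f"missing field: {name}")
        if not isinstance(payload[name], expected):
            raise TypeError(f"{name} must be {expected.__name__}")
    return payload

user = validate({"username": "alice", "email": "a@example.com"}, UserCreate)
print(user["username"])  # alice
```

Even this toy version shows the principle: the type hints are written once and serve both the human reader and the validation machinery.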

Rich: Beautiful Terminal Output

The rich library, created by Will McGugan, is worth studying for its use of Python's protocol system. It implements __rich_console__ and __rich_repr__ protocols, allowing any object to define how it should be displayed. The library's own code is also exceptionally well-organized and documented.

from rich.console import Console
from rich.table import Table

console = Console()

# Build a formatted table with minimal code
table = Table(title="Server Status")
table.add_column("Host", style="cyan")
table.add_column("Status", style="green")
table.add_column("Latency", justify="right")

table.add_row("web-01", "Running", "12ms")
table.add_row("web-02", "Running", "8ms")
table.add_row("db-01", "Degraded", "145ms")

console.print(table)

The pattern here—a builder-style API where you construct objects step by step—is clear and composable. Each method call adds one piece of information, and the final print renders everything. This is far more maintainable than assembling format strings by hand.
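The same builder idea can be sketched in a few lines of plain Python. ReportBuilder is a hypothetical name for illustration, not part of rich:

```python
class ReportBuilder:
    """Minimal builder sketch: accumulate rows, render once at the end."""
    def __init__(self, title):
        self.title = title
        self.rows = []

    def add_row(self, *cells):
        self.rows.append(cells)
        return self  # returning self makes chained calls possible

    def render(self) -> str:
        lines = [self.title]
        lines += [" | ".join(cells) for cells in self.rows]
        return "\n".join(lines)

report = ReportBuilder("Server Status")
report.add_row("web-01", "Running").add_row("db-01", "Degraded")
print(report.render())
```

Because construction is separated from rendering, callers can add rows conditionally, in loops, or from different functions, and the output stays consistent.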

Patterns From Production Codebases

Beyond specific libraries, there are coding patterns that show up repeatedly in well-maintained production code. These patterns are not flashy. They are practical, battle-tested approaches to problems that every Python developer eventually encounters.

Dataclasses for Structured Data

from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class LogEntry:
    timestamp: datetime
    level: str
    message: str
    source: str = "unknown"
    tags: list[str] = field(default_factory=list)

    @property
    def is_error(self) -> bool:
        return self.level in ("ERROR", "CRITICAL")

    def __str__(self) -> str:
        return f"[{self.timestamp:%H:%M:%S}] {self.level}: {self.message}"

# Clean instantiation with named parameters
entry = LogEntry(
    timestamp=datetime.now(),
    level="ERROR",
    message="Connection refused",
    source="database",
    tags=["db", "connectivity"],
)
print(entry)  # [14:32:07] ERROR: Connection refused

This pattern eliminates the need to write repetitive __init__, __repr__, and __eq__ methods by hand. The @dataclass decorator generates them automatically. The field(default_factory=list) call avoids the classic mutable-default-argument bug. And the @property decorator adds computed attributes without changing the interface.
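To see why default_factory matters, here is the classic shared-default bug next to the dataclass-safe version, as a minimal sketch:

```python
from dataclasses import dataclass, field

# The classic bug: the default list is created once and shared by every call.
def buggy_append(item, bucket=[]):
    bucket.append(item)
    return bucket

first = buggy_append("a")
second = buggy_append("b")
print(second)  # ['a', 'b'] -- both calls mutated the same list

# field(default_factory=list) builds a fresh list per instance instead.
# (A bare mutable default like tags: list[str] = [] is rejected outright.)
@dataclass
class Tagged:
    tags: list[str] = field(default_factory=list)

one, two = Tagged(), Tagged()
one.tags.append("x")
print(two.tags)  # [] -- each instance has its own list
```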

Generators for Memory-Efficient Processing

def read_large_file(filepath, chunk_size=8192):
    """Read a large file in chunks without loading it entirely into memory."""
    with open(filepath, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            yield chunk

def find_matching_lines(filepath, pattern):
    """Yield lines from a text file that contain a given pattern."""
    with open(filepath, "r", encoding="utf-8") as f:
        for line_number, line in enumerate(f, start=1):
            if pattern in line:
                yield line_number, line.rstrip()

# Process results lazily: no need to store everything in a list
for num, line in find_matching_lines("server.log", "ERROR"):
    print(f"Line {num}: {line}")

Generators are one of Python's most powerful features for writing memory-efficient code. Instead of building a complete list and returning it, a generator yields one result at a time. The caller processes each item as it arrives, which means you can handle files that are gigabytes in size without running out of memory. This pattern is standard practice in data pipelines, log analysis, and stream processing.
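The same laziness lets you chain stages into a pipeline. Each stage in this toy example pulls one item at a time from the stage before it:

```python
def numbers(limit):
    for n in range(limit):
        yield n

def squares(values):
    for v in values:
        yield v * v

def evens(values):
    for v in values:
        if v % 2 == 0:
            yield v

# Nothing is computed until the final consumer asks for items.
pipeline = evens(squares(numbers(10)))
print(list(pipeline))  # [0, 4, 16, 36, 64]
```

Swapping, adding, or removing a stage changes one line, and memory use stays constant no matter how long the input stream is.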

Enums for Meaningful Constants

from enum import Enum, auto

class TaskStatus(Enum):
    PENDING = auto()
    IN_PROGRESS = auto()
    COMPLETED = auto()
    FAILED = auto()

    @property
    def is_terminal(self) -> bool:
        return self in (TaskStatus.COMPLETED, TaskStatus.FAILED)

def process_task(status: TaskStatus) -> str:
    match status:
        case TaskStatus.PENDING:
            return "Waiting to start"
        case TaskStatus.IN_PROGRESS:
            return "Currently running"
        case TaskStatus.COMPLETED:
            return "Finished successfully"
        case TaskStatus.FAILED:
            return "Encountered an error"

current = TaskStatus.IN_PROGRESS
print(process_task(current))       # Currently running
print(current.is_terminal)         # False

Note

The match statement (structural pattern matching) was introduced in Python 3.10. It pairs naturally with enums for clean branching logic. If you are supporting older Python versions, a dictionary dispatch or if/elif chain achieves the same result.

Enums prevent an entire category of bugs that come from using raw strings or integers as status codes. When you use TaskStatus.COMPLETED instead of the string "completed", your IDE can autocomplete it, your type checker can verify it, and a typo becomes a clear error instead of a silent bug that surfaces in production.
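A minimal demonstration of that failure mode, redefining a stripped-down TaskStatus so the snippet stands alone:

```python
from enum import Enum, auto

class TaskStatus(Enum):
    PENDING = auto()
    COMPLETED = auto()

status = "complated"             # silent typo in a raw string: no error at all
print(status == "completed")     # False -- the bug hides until some check misfires

try:
    TaskStatus.COMPLATED         # the same typo against the enum
except AttributeError:
    print("caught immediately")  # fails loudly at the exact line that is wrong
```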

Before and After: Refactoring Toward Clarity

Seeing the finished product is useful, but seeing the transformation from rough code to clean code can be even more instructive. Here are three refactoring examples that illustrate common improvements.

Replace Manual Loops With Comprehensions

# Before: imperative loop with manual append
filtered = []
for user in users:
    if user.is_active and user.age >= 18:
        filtered.append(user.name)

# After: list comprehension makes intent clear
active_adult_names = [
    user.name
    for user in users
    if user.is_active and user.age >= 18
]

The comprehension version is not just shorter. It communicates that the operation is a filtered transformation—take a collection, apply a condition, and extract a value. That pattern is immediately recognizable to experienced Python developers. When the filtering logic gets more complex, though, move it into a separate function rather than cramming everything into one comprehension.
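One way to do that split, sketched with a hypothetical User model:

```python
from dataclasses import dataclass

@dataclass
class User:
    name: str
    age: int
    is_active: bool = True
    is_suspended: bool = False

def is_eligible(user: User) -> bool:
    """Name the combined rule once; callers read intent, not boolean soup."""
    return user.is_active and user.age >= 18 and not user.is_suspended

users = [User("Ana", 34), User("Ben", 15), User("Cy", 40, is_suspended=True)]
eligible_names = [u.name for u in users if is_eligible(u)]
print(eligible_names)  # ['Ana']
```

The comprehension stays one readable line, and the rule itself now has a name, a docstring, and a natural place to grow.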

Replace Nested Try/Except With Early Returns

# Before: deeply nested validation
def process_order(order):
    try:
        if order is not None:
            if order.items:
                if order.customer.has_valid_payment():
                    total = sum(item.price for item in order.items)
                    return {"status": "success", "total": total}
                else:
                    return {"status": "error", "reason": "Invalid payment"}
            else:
                return {"status": "error", "reason": "No items"}
        else:
            return {"status": "error", "reason": "No order"}
    except Exception as e:
        return {"status": "error", "reason": str(e)}

# After: guard clauses with early returns
def process_order(order):
    if order is None:
        return {"status": "error", "reason": "No order"}

    if not order.items:
        return {"status": "error", "reason": "No items"}

    if not order.customer.has_valid_payment():
        return {"status": "error", "reason": "Invalid payment"}

    total = sum(item.price for item in order.items)
    return {"status": "success", "total": total}

The refactored version reads from top to bottom. Each guard clause handles one failure case and exits early. The "happy path"—the successful outcome—lives at the end of the function, unindented and prominent. This flat structure is far easier to debug, test, and extend with new validation rules.

Replace String Formatting Patchwork With F-Strings

# Before: mixed formatting approaches
message = "User %s logged in from %s at %s" % (username, ip_address, timestamp)
log_line = "Status: {} | Duration: {:.2f}s".format(status, duration)
path = base_dir + "/" + subdir + "/" + filename

# After: consistent f-strings and pathlib
message = f"User {username} logged in from {ip_address} at {timestamp}"
log_line = f"Status: {status} | Duration: {duration:.2f}s"
path = Path(base_dir) / subdir / filename

F-strings, introduced in Python 3.6, are now the standard approach for string formatting. They are faster than .format(), more readable than % formatting, and they keep the variable right next to where it appears in the string. Consistency matters—mixing three different formatting styles in one codebase makes reading harder than necessary.

Key Takeaways

  1. Read the standard library: Modules like pathlib, collections, and contextlib are written by Python's core developers and serve as a reference for idiomatic code. Studying them teaches patterns you can apply immediately.
  2. Study popular open-source projects: Libraries like requests, FastAPI, and rich are celebrated not just for what they do but for how their code is written. Cloning their repositories and reading through the source is one of the most effective ways to level up.
  3. Use the right data structure: Python provides dataclasses, namedtuple, Enum, Counter, defaultdict, and other purpose-built tools. Using them instead of raw dictionaries and strings prevents entire categories of bugs and makes your intent visible.
  4. Favor flat over nested: Guard clauses with early returns, list comprehensions, and generator expressions all reduce nesting and make the flow of your code easier to follow.
  5. Be consistent: Pick one formatting style, one naming convention, and one error-handling approach. Consistency within a codebase matters more than any individual style choice.

The path to writing great Python code is not about memorizing rules. It is about building an instinct for clarity by exposing yourself to well-crafted examples. Read code from developers you admire. Contribute to open-source projects. Refactor your own old code with fresh eyes. Every time you make a piece of code easier for someone else to understand, you are writing good Python.
