Python Build Control

Every Python project eventually needs to be shared, installed, or deployed. The build system is what makes that possible. It transforms your source code and metadata into distributable packages that other developers can install with pip install. Understanding how Python build control works gives you full command over how your project is packaged, versioned, and delivered to the world.

When you write a Python script or library, the code sitting in your local directory is only useful to you. The moment you want to share it with a colleague, upload it to PyPI, or deploy it to a production server, you need a structured way to package it. That structured process is your build system, and Python's approach to it has evolved significantly over the years.

For years, setup.py was the centerpiece of Python packaging. It worked, but it was messy. Because setup.py is an executable Python script, building a package required running arbitrary code before you even knew what dependencies were needed. This created a chicken-and-egg problem: you needed to install build tools to read the file that told you which build tools to install.

Modern Python packaging solves this with a declarative, standards-based approach centered on the pyproject.toml file and a clear separation between build frontends and backends.

What Is a Build System?

A build system in Python is the machinery that takes your source code, metadata, and configuration and produces installable distribution files. These distribution files can then be uploaded to a package index like PyPI or installed directly by other developers.

The build system handles several responsibilities: it discovers which Python files belong in the package, it collects and writes metadata like the package name, version, and dependencies, it creates the distribution archive files, and it manages any compilation steps if the project includes C or Rust extensions.

Two standards define how modern build systems operate. PEP 518 introduced the pyproject.toml file and the [build-system] table so that build tools know what they need before they start working. PEP 517 then defined a standard interface that separates the tool you run (the frontend) from the engine that actually constructs the package (the backend).

Note

If your project does not include a [build-system] table in pyproject.toml, tools like pip will fall back to a legacy setuptools behavior. While this still works, it is strongly recommended to always declare your build system explicitly.
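For reference, pip's documented legacy fallback is roughly equivalent to declaring the following explicitly (the exact pins come from PEP 517's fallback rules; treat this as an approximation, not a table you should copy into new projects):

```toml
# What pip assumes when no [build-system] table is present
[build-system]
requires = ["setuptools>=40.8.0", "wheel"]
build-backend = "setuptools.build_meta:__legacy__"
```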

The pyproject.toml File

The pyproject.toml file is the single source of truth for your Python project's build configuration, metadata, and tool settings. Written in TOML (Tom's Obvious, Minimal Language), it replaces the older combination of setup.py, setup.cfg, and MANIFEST.in for many projects.

The file is organized into three primary tables:

[build-system] declares the build backend and its dependencies. This tells any frontend tool exactly what it needs to install before it can build your package.

[project] contains your project's metadata, defined by the PEP 621 standard. This includes the package name, version, description, author information, dependencies, Python version requirements, and entry points for command-line scripts.

[tool] holds configuration for various development tools. Each tool gets its own sub-table, such as [tool.ruff] for the Ruff linter or [tool.pytest.ini_options] for pytest settings.
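As an illustration, a [tool] section might look like this (the specific settings are examples, not recommendations):

```toml
[tool.ruff]
line-length = 100
target-version = "py310"

[tool.pytest.ini_options]
testpaths = ["tests"]
```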

Here is a complete, real-world example of a pyproject.toml file:

[build-system]
requires = ["hatchling>=1.27"]
build-backend = "hatchling.build"

[project]
name = "my-package"
version = "1.0.0"
description = "A utility library for data processing"
readme = "README.md"
requires-python = ">=3.10"
license = "MIT"
authors = [
    { name = "Your Name", email = "you@example.com" }
]
dependencies = [
    "requests>=2.31.0",
    "pydantic>=2.0",
]

[project.optional-dependencies]
dev = ["pytest>=8.0", "ruff>=0.4"]

[project.scripts]
my-tool = "my_package.cli:main"

[project.urls]
Homepage = "https://github.com/you/my-package"
Issues = "https://github.com/you/my-package/issues"

The requires key under [build-system] lists the packages needed to build the project. The build-backend key identifies the Python object that the frontend will call to produce distribution files. Together, these two fields give the frontend everything it needs to set up an isolated build environment and run the build.

Pro Tip

The [project.scripts] section is how you create command-line tools. When someone installs your package, pip will generate an executable that calls the specified function. In the example above, running my-tool in the terminal calls the main() function from my_package/cli.py.
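A minimal sketch of what that entry-point module might contain (the my_package/cli.py module and its argument are hypothetical, invented for this example):

```python
# my_package/cli.py -- hypothetical module matching the
# "my-tool = my_package.cli:main" entry point above
import argparse


def main() -> None:
    # pip-generated executables call this function with no arguments;
    # it reads its inputs from sys.argv via argparse.
    parser = argparse.ArgumentParser(prog="my-tool")
    parser.add_argument("name", nargs="?", default="world")
    args = parser.parse_args()
    print(f"Hello, {args.name}!")
```

Because the generated executable calls `main()` directly, the function should take no parameters and do its own argument parsing.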

Frontends vs. Backends

The PEP 517 standard creates a clean division between two types of tools. Understanding this separation is key to understanding how Python builds work.

A build frontend is the command-line tool you interact with. It reads the [build-system] table from your pyproject.toml, creates a fresh virtual environment, installs the declared backend into that environment, and then calls the backend to produce distribution files. Common frontends include pip, uv build, python -m build, hatch build, and poetry build.

A build backend is the engine that does the actual work of constructing wheel and sdist files. It implements a standard set of functions that any frontend can call. Common backends include setuptools, hatchling, flit-core, poetry-core, uv_build, scikit-build-core, and maturin.
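The standard interface is small. A toy backend sketch showing the two mandatory PEP 517 hooks plus one optional one looks like this; real backends implement these same names, but the bodies here are illustrative stubs, not a working packager:

```python
# A toy PEP 517 backend module. A frontend imports the object named
# by build-backend and calls these functions by name.


def build_wheel(wheel_directory, config_settings=None, metadata_directory=None):
    """Mandatory hook: build a .whl into wheel_directory, return its filename."""
    # ... real backends collect files and write the wheel archive here ...
    return "my_package-0.1.0-py3-none-any.whl"


def build_sdist(sdist_directory, config_settings=None):
    """Mandatory hook: build a .tar.gz sdist into sdist_directory, return its filename."""
    # ... real backends write the source archive here ...
    return "my_package-0.1.0.tar.gz"


def get_requires_for_build_wheel(config_settings=None):
    """Optional hook: extra build requirements discovered at build time."""
    return []
```

Any frontend that speaks PEP 517 can drive any backend exposing these hooks, which is what makes the mix-and-match described below possible.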

This separation means you can mix and match frontends and backends. You could use uv build as your frontend while using hatchling as your backend. Or you could use pip wheel with flit-core behind the scenes. The frontend handles environment management and orchestration while the backend handles the actual packaging logic.

# These all call the same backend defined in pyproject.toml
python -m build          # PyPA's build tool
uv build                 # Astral's uv
hatch build              # Hatch CLI
pip wheel .              # pip as a frontend

Each of these commands reads the [build-system] table, installs the backend into an isolated environment, and asks the backend to produce distribution files. The output is the same regardless of which frontend you use.

Choosing a Build Backend

For a simple pure-Python project, the choice of backend does not dramatically change the resulting package. All compliant backends produce the same standard distribution formats. Where they differ is in their extra features, speed, and integration with broader toolchains.

setuptools

setuptools is the oldest and most widely used backend. According to analyses of top PyPI packages, it powers roughly 79% of them, though many of those rely on the legacy fallback rather than an explicit declaration. It supports C and C++ extensions, has an enormous ecosystem of plugins like setuptools-scm for version management, and is well understood across the community.

[build-system]
requires = ["setuptools>=77.0"]
build-backend = "setuptools.build_meta"

The downside is complexity: setuptools carries decades of accumulated features, legacy behaviors, and configuration options. For new pure-Python projects, a more modern backend is usually a better fit.

hatchling

hatchling is a modern, extensible backend that powers the Hatch project management tool. It supports build hooks, granular file selection, and plugins like hatch-vcs for deriving version numbers from Git tags. It is also the default backend recommended by the PyPA packaging tutorial.

[build-system]
requires = ["hatchling>=1.27"]
build-backend = "hatchling.build"

flit-core

flit-core is the minimalist option. It has few dependencies, requires almost no configuration, and is an excellent choice for small, straightforward Python libraries. It does not support C extensions or complex build customization.

[build-system]
requires = ["flit_core>=3.9"]
build-backend = "flit_core.buildapi"

uv_build

uv_build is the newest entrant, released as stable in mid-2025 by Astral, the team behind the uv package manager and the Ruff linter. It is written entirely in Rust, and benchmarks show it building packages 10 to 35 times faster than hatchling, flit, and setuptools. It is designed for zero-configuration use with sensible defaults and tight integration with the uv toolchain. It currently supports pure Python projects only.

[build-system]
requires = ["uv_build>=0.7.19,<0.8.0"]
build-backend = "uv_build"

Because uv_build is bundled inside the uv binary itself, it can build and publish packages without even having Python installed on the system.

poetry-core

poetry-core is the PEP 517 backend that powers Poetry. Since Poetry 2.0 (released January 2025), it supports the standard [project] table alongside the legacy [tool.poetry] format. If your team already uses Poetry for dependency management, locking, and publishing, staying on poetry-core is a natural choice.

[build-system]
requires = ["poetry-core>=2.2"]
build-backend = "poetry.core.masonry.api"

Specialized Backends

For projects that include compiled code, purpose-built backends are essential. scikit-build-core handles C and C++ extensions using CMake. maturin is the standard for Rust extensions built with PyO3. meson-python uses the Meson build system and is popular in the scientific Python ecosystem.

Note

The choice of backend matters more as your project grows in complexity. For a straightforward Python library, any compliant backend produces essentially the same output. The difference shows up when you need build hooks, dynamic versioning, native extensions, or custom file inclusion rules.

Distribution Formats: Wheels and Sdists

When you build a Python package, the output comes in two standard formats: a source distribution (sdist) and a wheel. Understanding the difference between them is essential for effective build control.

Source Distribution (sdist)

An sdist is a compressed .tar.gz archive containing the raw source code, the pyproject.toml file, and any additional files needed to build the package. When someone installs from an sdist, their machine must execute the build backend to produce installable artifacts. This means they need a working build environment with any necessary compilers and tools.

Sdists serve as an archival format and as a fallback for platforms where no pre-built wheel exists. They are also required by conda-forge for building conda packages.

Wheel

A wheel is a pre-built distribution in the .whl format. It is a ZIP archive containing the package's code and metadata, ready for direct installation. Because wheels are pre-built, the installer simply copies files into place without running a compiler or executing any build logic. This makes installation significantly faster, often by an order of magnitude for packages with C extensions.

Wheel filenames encode compatibility information using a structured naming scheme:

{name}-{version}-{python_tag}-{abi_tag}-{platform_tag}.whl

# Examples:
requests-2.31.0-py3-none-any.whl          # Pure Python, any platform
numpy-1.26.4-cp312-cp312-manylinux_2_17_x86_64.whl  # CPython 3.12, Linux x86_64

The py3-none-any tag means the wheel works with any Python 3 interpreter, has no ABI dependency, and runs on any platform. This is what you will see for pure Python packages. Platform-specific wheels include tags for the specific Python version, ABI, and operating system they target.
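Because the fields are hyphen-separated and never contain hyphens internally, the common five-part form can be pulled apart with a plain split. A minimal stdlib sketch (it ignores the optional build tag that some wheels carry, and the function name is my own, not a standard API):

```python
def parse_wheel_name(filename: str) -> dict:
    """Split a five-part wheel filename into its compatibility fields.

    Assumes the common form {name}-{version}-{python}-{abi}-{platform}.whl
    with no optional build tag.
    """
    stem = filename.removesuffix(".whl")
    name, version, python_tag, abi_tag, platform_tag = stem.split("-")
    return {
        "name": name,
        "version": version,
        "python_tag": python_tag,
        "abi_tag": abi_tag,
        "platform_tag": platform_tag,
    }
```

For production code, the third-party packaging library provides a full parser, but the sketch above shows how much information the filename alone carries.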

When pip or uv resolves a package, it prefers wheels over sdists whenever a compatible wheel is available.

Pro Tip

Always publish both an sdist and a wheel to PyPI. The wheel provides fast installation for compatible platforms, while the sdist serves as a universal fallback and is required for conda-forge integration.

Building Your First Package

Here is a complete walkthrough of building a Python package from scratch using modern tooling. This example uses hatchling as the backend and uv as the frontend, though the concepts apply to any combination.

Project Structure

Start with a standard src layout, which is the recommended structure for Python packages:

my-package/
    pyproject.toml
    README.md
    LICENSE
    src/
        my_package/
            __init__.py
            core.py
    tests/
        test_core.py

The src/ directory prevents accidental imports from the local project directory during development and testing. It ensures that tests always run against the installed version of the package.

Writing the pyproject.toml

[build-system]
requires = ["hatchling>=1.27"]
build-backend = "hatchling.build"

[project]
name = "my-package"
version = "0.1.0"
description = "A demonstration package"
readme = "README.md"
requires-python = ">=3.10"
license = "MIT"
dependencies = []

[project.scripts]
greet = "my_package.core:hello"

Writing the Package Code

# src/my_package/__init__.py
from .core import hello

# src/my_package/core.py
def hello():
    print("Hello from my-package!")

if __name__ == "__main__":
    hello()

Building the Distribution Files

With the project structure and configuration in place, building is a single command:

# Using uv
uv build

# Or using PyPA's build tool
python -m build

# Output:
# Successfully built my_package-0.1.0.tar.gz
# Successfully built my_package-0.1.0-py3-none-any.whl

The dist/ directory now contains both the sdist and the wheel. You can inspect the wheel's contents with any ZIP tool since .whl files are standard ZIP archives.
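To see that for yourself without building anything, the sketch below creates a tiny wheel-shaped archive with the stdlib zipfile module and lists its contents. The file paths inside are illustrative of real wheel layout (package code plus a .dist-info directory), but this is not an installable wheel:

```python
# Demonstrate that a .whl file is a plain ZIP archive.
import tempfile
import zipfile
from pathlib import Path

tmp = Path(tempfile.mkdtemp())
whl = tmp / "my_package-0.1.0-py3-none-any.whl"

# Write a wheel-shaped ZIP: package code plus dist-info metadata.
with zipfile.ZipFile(whl, "w") as zf:
    zf.writestr("my_package/__init__.py", "from .core import hello\n")
    zf.writestr(
        "my_package-0.1.0.dist-info/METADATA",
        "Metadata-Version: 2.3\nName: my-package\nVersion: 0.1.0\n",
    )
    zf.writestr("my_package-0.1.0.dist-info/WHEEL", "Wheel-Version: 1.0\n")

# Any ZIP tool can now list or extract it.
with zipfile.ZipFile(whl) as zf:
    names = zf.namelist()
print(names)
```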

Installing in Development Mode

During development, you want changes to your source code to take effect immediately without rebuilding and reinstalling. This is called an editable install:

# Using pip
pip install -e .

# Using uv
uv pip install -e .

An editable install creates a link from the virtual environment to your source directory. Any changes to Python files are reflected the next time the module is imported, without needing to reinstall.

Publishing to PyPI

Once your package is built, publishing it makes it available to anyone with an internet connection:

# Using uv (supports OIDC trusted publishing)
uv publish

# Using twine (the traditional approach)
twine upload dist/*

Warning

Always test your package on TestPyPI first before publishing to the real PyPI index. Packages published to PyPI are permanent and cannot be re-uploaded with the same version number even if you delete the release.

Key Takeaways

  1. pyproject.toml is the standard: Declare your build system, metadata, and tool configuration in a single pyproject.toml file. The older setup.py and setup.cfg approach is still valid but no longer the recommended starting point for new projects.
  2. Frontends and backends are separate: The tool you run in the terminal (the frontend) is independent from the engine that constructs your package (the backend). PEP 517 and PEP 518 define this interface, and you can mix and match any compliant frontend with any compliant backend.
  3. Pick a backend based on your needs: For pure Python projects, uv_build or hatchling are excellent modern choices. For legacy projects or those with C extensions, setuptools remains the most proven option. For Rust extensions, maturin is the standard.
  4. Always publish both formats: Build and upload both a wheel and an sdist. Wheels provide fast installation, while sdists provide universal compatibility and source code access.
  5. Use the src layout: Placing your package code under a src/ directory prevents subtle import bugs during development and testing, and is now the recommended project structure across the Python packaging ecosystem.

Python's build system has matured significantly. With pyproject.toml as the foundation, a clear frontend/backend architecture, and fast modern tools like uv, packaging a Python project is more straightforward and reliable than it has ever been. Whether you are sharing a utility script with your team or publishing a library to PyPI, understanding build control puts you in full command of the process.
