If you've ever typed "venv vs virtualenv" into a search bar, you're not alone. It's one of the more common points of confusion for Python developers at every level. Both tools create virtual environments. Both isolate your project dependencies. Both produce a directory with a Python binary and a site-packages folder. So what's actually different — and does the answer still matter in 2026?
The answer goes deeper than many tutorials suggest. It involves the history of Python packaging, formal language specifications (PEPs), real performance trade-offs, and design decisions that still ripple through the ecosystem today. And increasingly, it involves a third tool that is changing how the entire question is framed. This article covers all of it with real code, verified facts, concrete benchmarks, and the context you need to make an informed decision in your own projects.
The Problem Both Tools Solve
Before comparing the tools, it helps to understand the problem they exist to solve — and why the problem is harder than it first appears.
When you install a Python package with pip install requests, that package lands in a global site-packages directory shared by every Python script on your system. This creates a dependency conflict problem: Project A needs requests==2.28.0, but Project B needs requests==2.31.0. Install one, and you break the other.
Worse, on Linux systems, the operating system itself depends on specific versions of Python packages. Overwriting those with pip install can break system tools entirely — a problem so serious that it eventually led to PEP 668, which we'll cover in detail later.
But there's a subtler layer to this problem that many developers don't think about: reproducibility. Without isolation, the packages on your development machine may be entirely different from those on your coworker's machine, your staging server, or your production environment. The same Python file can behave differently in each location because the global environment is different in each location. Virtual environments don't just prevent conflicts — they make environments describable, transferable, and reproducible via a requirements.txt or pyproject.toml.
A virtual environment solves this by creating an isolated directory tree containing its own Python binary and its own site-packages. Packages installed inside that environment are invisible to everything outside it. Simple concept, but the implementation details matter enormously.
virtualenv: The Third-Party Pioneer
virtualenv came first. It was created by Ian Bicking, a prolific Python open-source developer also responsible for pip and WebOb. The earliest ancestor of virtualenv traces back to around 2005 when Bicking worked on tools like non_root_python.py and virtual-python.py, bundled with EasyInstall. These evolved through working-env.py and workingenv before solidifying into the virtualenv we recognize today.
virtualenv works by copying (or symlinking) the Python binary into a new directory, creating the necessary lib/pythonX.Y/site-packages structure, and then installing seed packages like pip, setuptools, and wheel into that environment. When you "activate" the environment, a shell script prepends the environment's bin/ directory to your PATH, so running python or pip points to the isolated copies.
# Install virtualenv (it's a third-party package)
pip install virtualenv
# Create an environment
virtualenv myproject_env
# Activate it (Linux/macOS)
source myproject_env/bin/activate
# Now pip installs go into the isolated environment
pip install flask
# Deactivate when done
deactivate
For years, virtualenv was the only practical way to isolate Python project dependencies. It supported both Python 2 and Python 3, worked across operating systems, and became a foundational tool in the Python packaging ecosystem. As of March 2026, virtualenv remains actively maintained. Version 21.0.0, a major release with breaking API changes (notably the removal of propose_interpreters from the discovery module, which temporarily broke tools like Hatch), shipped on February 25, 2026 alongside the backward-compatible 20.39.x series, and the project continues to track new Python releases, including free-threaded CPython 3.13t and early support for Python 3.14. The changelog shows uninterrupted maintenance through 2024 and 2025, including a security fix for CVE-2025-68146 affecting the filelock dependency.
Because virtualenv was a third-party tool operating outside the interpreter, it had to reverse-engineer Python's internal behavior. Every time CPython changed how site.py worked, or a Linux distribution modified the default site-packages layout (like Debian's dist-packages), virtualenv had to play catch-up. It was, by design, a useful hack — but a hack nonetheless. This maintenance burden was one of the primary arguments made in PEP 405 for standardizing virtual environment support at the interpreter level.
PEP 405: Bringing Virtual Environments Into Python Itself
In June 2011, Carl Meyer authored PEP 405, titled "Python Virtual Environments." The PEP explicitly acknowledges virtualenv's pioneering role. As the PEP text puts it, the standard library module draws on accumulated experience with existing third-party tools, with the goal of lower maintenance overhead, higher reliability, and broader availability to all Python users without requiring third-party installation. PEP 405 was accepted and shipped with Python 3.3 in 2012.
PEP 405 introduced two things:
- Interpreter-level support for virtual environments, via a pyvenv.cfg configuration file that the Python binary reads at startup.
- The venv module in the standard library, providing a built-in way to create virtual environments without any third-party installation.
The mechanism is elegant. When Python starts, it looks for a file called pyvenv.cfg either adjacent to the executable or one directory above it. If it finds one containing a home key, it knows the binary belongs to a virtual environment. Python then sets sys.prefix to the virtual environment directory while keeping sys.base_prefix pointed at the original installation.
This split between sys.prefix and sys.base_prefix is how you can programmatically detect whether you're inside a virtual environment:
import sys

def in_virtual_environment():
    return sys.prefix != sys.base_prefix

print(in_virtual_environment())  # True if inside a venv
Here's what a minimal pyvenv.cfg file looks like inside a virtual environment created by venv:
home = /usr/bin
include-system-site-packages = false
version = 3.12.3
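Because the file has no section headers, it isn't valid INI and configparser won't read it as-is. If you need to inspect it programmatically, a few lines of hand-rolled parsing suffice. This is a minimal sketch; the helper name is ours, not part of the standard library:

```python
from pathlib import Path

def read_pyvenv_cfg(env_dir):
    """Parse the key = value lines of an environment's pyvenv.cfg."""
    cfg = {}
    for line in Path(env_dir, "pyvenv.cfg").read_text().splitlines():
        key, sep, value = line.partition("=")
        if sep:  # skip blank or malformed lines
            cfg[key.strip()] = value.strip()
    return cfg
```

For example, read_pyvenv_cfg(sys.prefix)["home"] run inside an active environment tells you which base installation it was created from.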
And the basic venv workflow:
# No installation needed -- venv is part of the standard library
python3 -m venv myproject_env
# Activate (Linux/macOS)
source myproject_env/bin/activate
# Activate (Windows PowerShell)
myproject_env\Scripts\Activate.ps1
# Install packages
pip install flask
# Deactivate
deactivate
The Python 3.4 release further improved venv by integrating PEP 453, which introduced the ensurepip module. This meant that newly created virtual environments would automatically have pip available without requiring a separate download step.
As of Python 3.13, venv automatically generates a .gitignore file inside the created environment directory, preventing you from accidentally committing your entire dependency tree to source control. If you want to opt out of this behavior, use python3 -m venv --without-scm-ignore-files myenv.
The Actual Differences: A Technical Comparison
Now that we have the history, here's what actually differs between these two tools in practice.
1. Installation and Availability
venv requires no installation. It ships with Python 3.3+ as part of the standard library. You use it via python3 -m venv. One important caveat: on some Debian and Ubuntu systems, the venv module is split into a separate python3-venv package that you may need to install with apt:
# Debian/Ubuntu -- if python3 -m venv fails
sudo apt install python3-venv
This is not a bug in venv — it's a downstream packaging decision by the distribution maintainers, who split Python's standard library into separately installable components to reduce the default installation footprint. It's worth knowing about because it catches many developers off guard on fresh cloud instances or containers running Debian-based images.
virtualenv is a third-party package installed via pip install virtualenv. This means you need a working Python and pip to get started, which creates a mild chicken-and-egg situation for brand-new installations. In practice this is rarely a problem because pip is bundled with modern Python distributions.
2. Python Version Targeting
venv only creates environments using the Python interpreter that is running it. If you invoke python3.12 -m venv myenv, you get a 3.12 environment. Period. You cannot use it to create an environment targeting a different Python version than the one you're invoking.
virtualenv can discover and target different Python interpreters installed on your system. You specify which one to use with the -p or --python flag:
# Create an environment using a specific Python version
virtualenv -p /usr/bin/python3.10 myenv
# Or using a version specifier (virtualenv resolves it automatically)
virtualenv -p python3.11 myenv
# As of recent virtualenv releases, PEP 440 version specifiers are supported
virtualenv --python ">=3.11,<3.13" myenv
This is a meaningful distinction for developers who test across multiple Python versions. Recent virtualenv releases also added support for PEP 440 version specifiers in the --python flag, meaning you can specify version ranges rather than exact versions or paths.
3. Performance and Environment Creation Speed
This is where the gap is most concrete and measurable.
venv creates environments by symlinking (on Unix) or copying (on Windows) the Python binary and then optionally running ensurepip to install pip. With ensurepip enabled, expect somewhere in the range of 2–5 seconds depending on hardware. If you pass --without-pip, creation is nearly instant but you get an environment without pip installed.
virtualenv version 20 (a major rewrite released in early 2020) introduced an "app-data" seed mechanism that dramatically improved creation speed. Rather than invoking pip to install seed packages from scratch every time, virtualenv builds a cached install image that it can link into new environments. After the cache is warmed, virtualenv can create environments in as little as 350 milliseconds on typical Linux hardware. The cache is stored per Python version and updated automatically in the background every 14 days, ensuring seed packages stay reasonably current without blocking environment creation.
"virtualenv v20 was conceived specifically to address creation speed as a pain point for tox and CI workflows, where environment creation happens dozens of times per run." — virtualenv 20.0.0 announcement, Python Packaging Discourse, January 2020
If your workflow frequently spins up and tears down environments — such as in CI pipelines or when using tox — virtualenv's app-data caching can save meaningful time at scale. For large test matrices across four or five Python versions, the cumulative difference across hundreds of CI runs adds up quickly. If you're evaluating whether to switch, time both tools with time virtualenv test_ve and time python3 -m venv test_ve on your actual CI infrastructure after the virtualenv cache is warmed.
4. Seed Packages
When venv creates an environment in Python 3.12 and later, it includes only pip by default. In Python 3.11 and earlier, setuptools was also included. The Python 3.12 release notes confirm that this change was made deliberately: as of pip v22.1, pip no longer requires setuptools to be present in the environment, because pip's PEP 517 mode provides setuptools automatically in isolated build environments when building packages that need it. The official CPython issue tracker entry for this change (gh-95299) makes the reasoning explicit: removing setuptools by default pushes the ecosystem toward standards-based installation workflows and away from the legacy setup.py install path.
What does this mean practically? If you're on Python 3.12+ and you create a venv, then try to install a package that has a hand-crafted setup.py without a proper pyproject.toml, you may encounter unexpected behavior. The fix is simply pip install setuptools inside the activated environment. Packages with modern build configurations are unaffected.
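A quick programmatic check for whether the current environment can handle such legacy installs uses importlib.util.find_spec, which probes importability without actually importing anything. A small sketch; the helper name is illustrative:

```python
import importlib.util

def has_setuptools() -> bool:
    """True if setuptools is importable in the current environment."""
    return importlib.util.find_spec("setuptools") is not None

print(has_setuptools())
```

Run inside a fresh Python 3.12+ venv this typically prints False; inside a fresh virtualenv, which seeds setuptools by default, it prints True.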
virtualenv seeds environments with pip, setuptools, and wheel by default. It ships with embedded copies of these packages and updates them periodically in the background. The versions in a freshly created virtualenv may be more recent than what ensurepip bundles with your Python installation, since virtualenv's embedded wheels are updated independently of Python release cycles. You can disable any seed package with flags like --no-setuptools or --no-wheel.
5. The EnvBuilder API
venv exposes a Python-level API through venv.EnvBuilder, which allows programmatic creation and customization of environments:
import venv

builder = venv.EnvBuilder(
    system_site_packages=False,
    clear=True,
    symlinks=True,
    with_pip=True,
)
builder.create("/path/to/new/env")
You can subclass EnvBuilder and override methods like post_setup() to customize what happens after environment creation — for example, automatically installing a set of base packages or writing a custom .pth file into the environment. This is particularly useful for tooling authors who need to create environments programmatically as part of a larger workflow.
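As a concrete sketch of that pattern, here is a hypothetical EnvBuilder subclass whose post_setup writes a custom .pth file into the new environment's site-packages. The class name, file name, and example path are all illustrative:

```python
import os
import subprocess
import venv

class PthEnvBuilder(venv.EnvBuilder):
    """EnvBuilder that drops a custom .pth file into each new environment.

    The class name, file name, and example path are illustrative only.
    """

    def __init__(self, extra_path, **kwargs):
        self.extra_path = extra_path
        super().__init__(**kwargs)

    def post_setup(self, context):
        # Ask the environment's own interpreter where its site-packages
        # lives, so we don't hard-code platform-specific layouts.
        site_packages = subprocess.check_output(
            [context.env_exe, "-c",
             "import sysconfig; print(sysconfig.get_path('purelib'))"],
            text=True,
        ).strip()
        # Any path listed in a .pth file is appended to sys.path at startup.
        with open(os.path.join(site_packages, "extra_paths.pth"), "w") as f:
            f.write(self.extra_path + "\n")

# Example: a fast, pip-less environment with the extra path baked in.
# PthEnvBuilder("/opt/shared/libs", with_pip=False).create("demo_env")
```

Asking the environment's own interpreter for its purelib path avoids hard-coding the lib/pythonX.Y/site-packages layout, which differs between Unix and Windows.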
virtualenv also has a rich programmatic API, plus a plugin system. The plugin architecture means third parties can extend virtualenv's behavior — for example, virtualenv-pyenv is a plugin that integrates virtualenv's interpreter discovery with pyenv-managed Python installations by inspecting $PYENV_ROOT/versions directly, without requiring pyenv to be on your PATH.
6. Activation Script Coverage
Both tools produce activation scripts for common shells, but their coverage differs. As of Python 3.13, venv generates activation scripts for bash/zsh, csh/tcsh, fish, and PowerShell. virtualenv supports a broader set, including Nushell. If you're running a non-mainstream shell in a team environment, check virtualenv's activation script list before committing to venv.
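You can verify the coverage yourself by creating a throwaway environment and listing its scripts directory. A quick sketch; the exact file names vary by platform and Python version:

```python
import os
import sys
import tempfile
import venv

# Create a throwaway environment (pip-less, for speed) and list the
# activation scripts it generates.
with tempfile.TemporaryDirectory() as tmp:
    env_dir = os.path.join(tmp, "env")
    venv.EnvBuilder(with_pip=False).create(env_dir)
    scripts = "Scripts" if sys.platform == "win32" else "bin"
    for name in sorted(os.listdir(os.path.join(env_dir, scripts))):
        if name.lower().startswith("activate"):
            print(name)  # e.g. activate, activate.csh, activate.fish
```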
7. What Happens to the Environment When Python Is Upgraded
This is something very few tutorials discuss, and it creates real confusion in production. Virtual environments created with either tool are not portable across Python minor versions. If you created a venv with Python 3.11 and then upgrade your system Python to 3.12, the environment's binary symlinks may break. The venv module supports in-place environment upgrades via python3 -m venv --upgrade myenv, but this is explicitly intended for patch-level upgrades (3.12.0 to 3.12.3), not minor-version upgrades (3.11 to 3.12). For minor-version upgrades, the safe approach is to delete the environment and recreate it. This is by design — virtual environments are meant to be disposable and recreatable, not long-lived configurations to be migrated.
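A cheap way to detect a stale environment after an interpreter upgrade is to check whether the base installation recorded in pyvenv.cfg still exists. A hedged sketch, assuming a standard pyvenv.cfg with a home key; the function name is ours:

```python
import os

def env_is_stale(env_dir):
    """Return True if the base interpreter this environment was created
    from no longer exists. Assumes a standard pyvenv.cfg with a `home`
    key, as written by venv or virtualenv.
    """
    cfg_path = os.path.join(env_dir, "pyvenv.cfg")
    if not os.path.isfile(cfg_path):
        return True  # not a virtual environment at all
    with open(cfg_path) as f:
        for line in f:
            key, sep, value = line.partition("=")
            if sep and key.strip() == "home":
                return not os.path.isdir(value.strip())
    return True  # no `home` key: treat as unusable
```

A True result means the environment should be deleted and recreated rather than repaired in place.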
Never check your virtual environment directory into source control. Both venv and virtualenv produce directories that are tied to the absolute path of the Python interpreter on the machine where they were created. They contain binary files, symlinks into your Python installation, and compiled .pyc files. None of this is portable. The correct thing to commit is your requirements.txt or pyproject.toml, not the environment itself. As noted above, Python 3.13's venv generates a .gitignore inside the environment directory by default to help prevent this mistake.
Questions Nobody Asks But Should
Standard comparisons of these tools stop at the feature list. But there are several questions that almost never get asked — and they change how you think about all of this.
Does activating a virtual environment actually change Python's behavior, or is it just a PATH trick?
Partly a PATH trick, but also genuinely more than that. When you run the activation script (source myenv/bin/activate), the shell prepends the environment's bin/ to PATH and sets the VIRTUAL_ENV environment variable. That part is a PATH trick. But the more important mechanism is in the Python binary itself. The Python interpreter that lives inside the virtual environment reads pyvenv.cfg at startup and adjusts its internal module search path before any user code runs. This means that even if you invoke the environment's Python binary directly by full path without running any activation script, the isolation still works correctly. Activation is a convenience for interactive shell use, not the mechanism that provides the isolation.
# This works with full isolation, no activation script needed:
/path/to/myenv/bin/python myscript.py
# In a Makefile or CI script, you might write:
/path/to/myenv/bin/pip install -r requirements.txt
/path/to/myenv/bin/python -m pytest
What is VIRTUAL_ENV_PROMPT and when does it matter?
When you create an environment and activate it, the shell prompt shows the environment name in parentheses. By default this is the directory name of the environment. You can override it at creation time:
# Custom prompt label
python3 -m venv --prompt "my-project" myenv
# After activation, the prompt shows:
# (my-project) user@host:~$
This is a small but genuinely useful detail in teams where everyone names their virtual environments something generic like venv or .venv. Setting a meaningful prompt label reduces the chance of accidentally running commands against the wrong project's environment.
Can two projects share a virtual environment?
Technically yes, practically no. Nothing stops you from installing both Project A's and Project B's dependencies into a single virtual environment, but doing so defeats the purpose of isolation and reintroduces the version conflict problem you were trying to escape. The only legitimate shared-environment scenario is a base environment that provides shared development tools (like black, mypy, or pytest) that you want available across projects without installing in each one. In practice, tools like pipx solve this more cleanly by giving each tool its own isolated environment while making the tool's commands globally available.
What does --system-site-packages actually do, and when is it useful?
Both tools support a flag that allows the virtual environment to see packages installed in the system-wide site-packages. By default, this is false, meaning the virtual environment is fully isolated. When set to true, the environment inherits all globally installed packages but can still install additional packages that shadow (override) the global ones.
# Create an environment that can see system packages
python3 -m venv --system-site-packages myenv
This is useful in specific scenarios: when system-level packages include compiled C extensions tied to system libraries (like some database drivers on embedded systems), or when you want to leverage globally managed packages like numpy that the system package manager installed and linked against BLAS/LAPACK builds you don't want to replicate. It is not a best practice for typical development and it breaks reproducibility, but it is the right tool in situations where installing compiled packages from PyPI is impractical.
How do virtual environments interact with pip install --user?
When you are inside an activated virtual environment, pip install --user does not work the way it does outside one. In a standard virtual environment (where user site-packages are not visible, which is the default), pip will refuse the operation entirely with an error: Can not perform a '--user' install. User site-packages are not visible in this virtualenv. This is because PEP 405 defines the per-user site directory as part of the "system" environment, which is excluded from isolated virtual environments. The Python Packaging User Guide notes that the --user flag "has no effect when inside a virtual environment." The practical takeaway: inside a virtual environment, always use plain pip install without --user. The flag is unnecessary and will either be ignored or raise an error depending on your configuration.
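You can see the flag behind this behavior from inside Python: the site module exposes ENABLE_USER_SITE, which is False in an isolated virtual environment; this is essentially what pip consults before honoring --user. A small sketch, with a helper name of our own invention:

```python
import site

def user_installs_allowed() -> bool:
    """Mirror the check pip makes before honoring --user.

    In an isolated virtual environment site.ENABLE_USER_SITE is False,
    which is why pip refuses the install there.
    """
    return bool(site.ENABLE_USER_SITE)
```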
Related PEPs That Shape the Landscape
Understanding venv vs virtualenv also means understanding the broader PEP ecosystem that makes virtual environments increasingly central to Python development.
PEP 405 (2011) — Authored by Carl Meyer. The foundational PEP that introduced interpreter-level virtual environment support and the venv module. Shipped with Python 3.3. Defines the pyvenv.cfg mechanism, the sys.prefix / sys.base_prefix split, and the role of the home key. This PEP is the reason both tools produce structurally equivalent environments.
PEP 453 (2013) — Authored by Donald Stufft and Nick Coghlan. Proposed that pip be bootstrapped into Python installations by default via the ensurepip module. Shipped with Python 3.4. This PEP is why python3 -m venv myenv gives you an environment with pip already installed rather than requiring a separate download step.
PEP 517/518 (2015–2017) — Defined the modern build system interface via pyproject.toml, decoupling Python packaging from its historical dependence on setuptools. These PEPs are the direct cause of setuptools being removed from venv's default seed packages in Python 3.12. As pip v22.1 and later can invoke the PEP 517 build backend directly without requiring setuptools to be present in the target environment, the dependency became unnecessary for modern workflows. Legacy packages with hand-written setup.py files that lack a pyproject.toml may still require you to pip install setuptools manually.
PEP 668 (2021, accepted 2022) — Co-authored by Geoffrey Thomas, Matthias Klose, Donald Stufft, Pradyun Gedam, and others. This PEP allows Linux distributions to mark their system Python installation as "externally managed," which prevents pip install from modifying system-level packages. It is directly responsible for the error: externally-managed-environment message that many developers encounter on Debian 12+, Ubuntu 24.04+, Fedora 38+, and macOS with Homebrew-managed Python. The practical effect is that PEP 668 makes virtual environments not just a best practice but effectively mandatory for installing third-party packages on modern systems. You cannot opt out of this behavior with a simple flag; it is enforced by the distribution.
On Debian 12+, Ubuntu 24.04+, Fedora 38+, and macOS with Homebrew Python, running pip install outside a virtual environment will fail with an externally-managed-environment error. This is PEP 668 in action. The workaround of passing --break-system-packages exists but should be treated as a last resort, not standard practice. Always activate a virtual environment before installing third-party packages.
PEP 370 (2008) — Introduced per-user site-packages directories (pip install --user). PEP 405 explicitly considers these as part of the "system" packages, which means they are excluded from isolated virtual environments by default. This is the source of the --user flag behavior described above.
When to Use Which
Given everything above, here's the practical guidance — more specific than the usual "use venv for simple things, use virtualenv for complex things."
Use venv when:
- You are working on a single Python version project and have no need to target a different interpreter version than the one you're running.
- You want zero third-party dependencies in your toolchain — for example, when writing automation scripts or documentation that need to work on any standard Python installation without assuming virtualenv is installed.
- You're building CI/CD pipelines on cloud runners where minimizing installed tools reduces complexity and potential dependency drift.
- Your team is Python-only and you want the answer to "how do I create a virtual environment?" to always be the same one-liner regardless of what's installed.
- You're on Python 3.13+ and want the automatic .gitignore generation without any additional tooling.
Use virtualenv when:
- You need to create environments targeting different Python versions on the same machine. If you're maintaining a library that supports Python 3.9 through 3.13, virtualenv's interpreter discovery makes this substantially easier.
- You're running tox or any test matrix automation that creates and destroys environments repeatedly. The app-data seed cache makes a real difference in aggregate CI time.
- You need virtualenv's plugin system for custom interpreter discovery (e.g., integrating with pyenv via virtualenv-pyenv).
- You need Nushell activation support or other shell coverage not yet included in venv.
- Your team's scripts rely on virtualenv's richer CLI flags for non-standard environment configurations.
Neither tool may be the right answer when your project already uses a manager like poetry, hatch, or uv to handle the environment lifecycle. These tools create PEP 405-compliant environments under the hood but expose a higher-level interface that combines environment management with dependency resolution and project scaffolding. Reaching for virtualenv directly inside a poetry or uv project adds a lower-level tool to a workflow that already handles it, and creates confusion about which tool "owns" the environment.
A Practical Demonstration
Let's verify the structural similarity between environments created by both tools. This is the kind of hands-on investigation that builds real understanding.
# Create environments with both tools
python3 -m venv test_venv
virtualenv test_virtualenv
# Compare the pyvenv.cfg files
cat test_venv/pyvenv.cfg
cat test_virtualenv/pyvenv.cfg
# Both will contain a 'home' key pointing to the base Python
# Both set include-system-site-packages = false
# virtualenv adds additional keys like 'virtualenv' (version) and 'prompt'
# Compare directory structures
ls test_venv/bin/ # python, python3, pip, activate, etc.
ls test_virtualenv/bin/ # same set, possibly with additional scripts
# Check that sys.prefix differs from sys.base_prefix in both
test_venv/bin/python -c "import sys; print(sys.prefix != sys.base_prefix)"
# Output: True
test_virtualenv/bin/python -c "import sys; print(sys.prefix != sys.base_prefix)"
# Output: True
# Check what seed packages are installed in each
test_venv/bin/pip list
# Output: pip (version from ensurepip)
test_virtualenv/bin/pip list
# Output: pip, setuptools, wheel (versions from virtualenv's embedded cache)
Both environments are PEP 405-compliant. Both produce pyvenv.cfg files. Both result in sys.prefix diverging from sys.base_prefix. The environments are, from the interpreter's perspective, structurally identical. The differences are in how they get created, how fast they get created, what seed packages are included by default, and what tooling sits on top of that creation process.
One additional thing worth verifying: the VIRTUAL_ENV environment variable, which is set by the activation script and used by many tools (including pip, pytest, and IDE integrations like VS Code's Python extension) to detect and locate the active environment:
# After activating either environment:
echo $VIRTUAL_ENV
# Output: /absolute/path/to/your/environment
# Many tools, including pip, check this variable directly
# rather than relying on PATH alone. This is why
# deactivating in one shell doesn't affect a different shell
# that has the same environment activated.
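Combining the two signals gives a robust detection helper: the sys.prefix comparison is authoritative even without activation, while VIRTUAL_ENV is just a hint left by the activation script. A sketch; the function name is ours:

```python
import os
import sys

def active_env_path():
    """Best-effort path of the active virtual environment, or None.

    The sys.prefix comparison is authoritative: it works even when the
    environment's interpreter was invoked by full path with no
    activation. VIRTUAL_ENV is only set by activation scripts and can
    linger in subshells, so it serves as a fallback hint.
    """
    if sys.prefix != sys.base_prefix:
        return sys.prefix
    return os.environ.get("VIRTUAL_ENV")
```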
The uv Factor: A Third Answer
Any current treatment of venv vs virtualenv that doesn't address uv is incomplete. Released in early 2024 by Astral (the team behind the Ruff linter and the ty type checker), uv is a Python package manager and project tool written in Rust. It includes a virtual environment creator via uv venv and is explicitly positioned as a replacement for pip, pip-tools, pipx, poetry, pyenv, twine, virtualenv, and more.
The performance numbers are not incremental. Early benchmarks published by Astral showed uv creating a virtual environment with seed packages in approximately 4 milliseconds, compared to around 141 milliseconds for python -m venv and around 30 milliseconds for virtualenv after its app-data cache is warmed. Without seed packages, venv took roughly 1.5 seconds while uv completed the operation in under 20 milliseconds. Exact figures will vary across hardware and Python versions, but the order-of-magnitude advantage is consistent — achieved through uv's Rust implementation, aggressive global wheel caching, and copy-on-write filesystem operations on supported platforms.
# Install uv
curl -LsSf https://astral.sh/uv/install.sh | sh
# Create a virtual environment (PEP 405-compliant)
uv venv
# Create one targeting a specific Python version
# uv will download the interpreter if it's not installed
uv venv --python 3.11
# Activate and use normally -- uv envs are standard-compliant
source .venv/bin/activate
pip install flask # or: uv pip install flask
A key architectural difference: uv maintains a global wheel cache across all projects. When you create a new environment and install numpy, the wheel is downloaded once and hard-linked (or copy-on-write-linked) into subsequent environments. This means that on a development machine running ten projects that all use numpy, you're storing and reading one copy of the wheel rather than ten. For teams managing dozens of projects or running large CI fleets, this has measurable storage and bandwidth implications.
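The space savings come from ordinary hard links, which you can demonstrate with the standard library. This is not uv's code, just an illustration of why linking a cached wheel into a new environment costs no additional disk space:

```python
import os
import tempfile

# Two directory entries, one inode: the bytes exist once on disk.
with tempfile.TemporaryDirectory() as cache:
    cached = os.path.join(cache, "cached.whl")
    linked = os.path.join(cache, "linked.whl")
    with open(cached, "wb") as f:
        f.write(b"wheel bytes" * 1024)  # stand-in for a real wheel
    os.link(cached, linked)  # hard link, not a copy
    print(os.stat(cached).st_ino == os.stat(linked).st_ino)
    # prints True on filesystems that support hard links
```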
uv-created virtual environments are fully PEP 405-compliant. They produce standard pyvenv.cfg files and standard activation scripts. There is no lock-in: you can create an environment with uv, activate it, and use vanilla pip inside it. The environment is not tied to uv in any way. This is worth emphasizing because some developers assume that using a newer tool produces a proprietary artifact. It does not.
Where does uv fit relative to venv and virtualenv? It is not a replacement in the sense of a drop-in command substitution. It is a more opinionated tool: it bundles Python version management, dependency resolution, lockfile generation, and project scaffolding alongside environment creation. If you want a tool that does only the environment creation step and nothing else, venv and virtualenv remain cleaner choices. If you're starting a new project and want a single tool to handle the entire workflow from environment creation through dependency locking and publication, uv is worth serious consideration. As of early 2026, uv has reached version 0.10.x with production-ready features and rapidly growing adoption, particularly in ML and data science teams dealing with large dependency trees where installation speed matters most.
The Bottom Line
The confusion between venv and virtualenv is understandable because they produce the same result through different means. venv is the built-in standard library module that owes its existence to the third-party virtualenv project that proved the concept. virtualenv remains the more feature-rich and performant option for specific workflows, while venv offers the simplicity and reliability of requiring nothing beyond Python itself.
For developers working on modern Python 3 projects with a single interpreter version, venv is sufficient and recommended. The official Python documentation describes venv as the standard tool for creating virtual environments. If you need cross-version testing, faster environment creation in matrix CI pipelines, or virtualenv's plugin ecosystem, virtualenv remains an excellent and actively maintained choice — the v21.0.0 release in February 2026 demonstrates that the project is not going anywhere. (Note: v21.0.0 is a major version with breaking changes to the programmatic API; if your toolchain depends on virtualenv internals, pin to virtualenv<21 until your dependencies have updated.)
And if you're optimizing for raw speed, global caching, or a unified workflow that replaces multiple tools, uv represents where a significant portion of the Python ecosystem is moving. It doesn't make venv or virtualenv obsolete — both remain correct and well-supported tools — but it changes the frame of the comparison by offering a third option with meaningfully different performance characteristics and a broader feature scope.
The underlying specification tying all three together is the same: PEP 405. And with PEP 668 making virtual environments effectively required on modern Linux distributions and Homebrew-managed macOS installations, understanding how these tools work isn't just academic knowledge — it's a practical necessity for anyone writing Python code that needs to be built, tested, and deployed on someone else's machine.