Part IX · Build, Deploy, Operate
Chapter 105 ~18 min read

Python tooling: uv, ruff, black

"Python's packaging story was a graveyard. Then Astral showed up with a bulldozer."

Python is half of every ML stack, and until 2024 its tooling was a disaster. A new engineer joining a Python project faced a choice between pip + virtualenv, pipenv, poetry, conda, hatch, pdm, rye, and half a dozen others. Each had its own lockfile format, its own philosophy, its own bugs, its own performance problems. The “right” answer kept changing every two years. Then Astral shipped uv, and the conversation ended.

By the end of this chapter the reader understands why uv replaced everything that came before, what ruff does that deserves the phrase “all-in-one linter,” where black fits in (and whether it still matters given ruff format), and what the canonical modern Python project layout looks like. The chapter is narrow — just the tooling — but the tooling touches every chapter in Part IX and every Python service the reader will ever deploy.

Outline:

  1. Why Python packaging was broken.
  2. uv in one paragraph, then in detail.
  3. uv.lock and lockfile semantics.
  4. Virtual environments done right.
  5. pyproject.toml — the modern project layout.
  6. ruff — the all-in-one linter.
  7. ruff format vs black.
  8. Pinning Python versions — uv’s Python management.
  9. uv for Docker builds and the speedup.
  10. The CI pipeline for a modern Python project.
  11. The mental model.

105.1 Why Python packaging was broken

The short version of the history. Python’s standard tool, pip, was designed to install packages from PyPI, not to manage project environments. It had no lockfile, no dependency resolver worth using (until 2020’s “pip 20.3” rewrite), and no concept of project metadata beyond setup.py, which was an executable script that ran arbitrary code. Every team built their own wrapper.

Then the wrappers proliferated. virtualenv for isolated environments. pipenv (2017) added a Pipfile.lock and promised to be the “one tool,” but its resolver was slow and buggy, and development eventually stalled. poetry (2018) took a cleaner approach with pyproject.toml, fast-ish resolution, and a real lockfile, and became the de facto standard for modern Python projects. conda ruled the ML world because it handled non-Python dependencies (CUDA, MKL, OpenBLAS) that pip could not.

None of them were fast. Poetry’s install on a fresh project with a hundred dependencies took several minutes. The resolver frequently produced unusable graphs. Lockfiles were huge, slow to regenerate, and the diffs were unreadable. pip install -e . for local development was a black box that broke silently. The ML stack, where a typical requirements file has PyTorch, NumPy, pandas, scikit-learn, transformers, and fifty transitive CUDA-linked dependencies, was the worst-hit — installing a new virtualenv for an ML project could take ten minutes.

The reason the ecosystem was stuck is that Python packaging is hard. PyPI has ~500k packages; the metadata is inconsistent; wheels have to be matched to the right Python version, ABI, platform, and (for native code) the right libc; the legacy of setup.py means some packages execute arbitrary code during install. Writing a correct, fast resolver that handles all of this requires serious engineering work, and nobody wanted to do it. Until Astral.

105.2 uv in one paragraph, then in detail

uv is a package manager written in Rust by Astral (the same team that made ruff). It is 10-100× faster than pip, poetry, and pipenv across every benchmark. It implements the PEP 621 standard for project metadata, has a real lockfile (uv.lock), manages virtual environments, installs Python itself, and replaces the whole toolchain with one binary. Released in early 2024 and stable by mid-2024, it became the default for new Python projects within a year. It is the first tool in Python’s history that is simply better than everything it replaces, with no tradeoffs worth discussing.

Now the details. uv has four main subsystems that replace the previous toolchain:

  1. Dependency resolution and installation. uv pip install is a drop-in replacement for pip install that is 10-100× faster. uv sync installs the project and its dependencies from pyproject.toml + uv.lock into a virtualenv.
  2. Lockfile and project management. uv add numpy adds a dependency to pyproject.toml, resolves the graph, and updates uv.lock. uv lock regenerates the lockfile without installing.
  3. Virtual environment management. uv venv creates a virtualenv. uv run pytest runs a command in the project’s virtualenv without needing to activate it manually.
  4. Python installation. uv python install 3.12 downloads a pre-built Python 3.12 from the python-build-standalone project. No more pyenv install or conda install python=3.12. Python becomes a managed dependency, like any other.

The speed comes from several sources: a parallel downloader, aggressive caching of the package index, a SAT-like resolver that backtracks efficiently, content-addressable storage of installed packages (so the same wheel is stored once across all projects), and the Rust implementation that avoids Python’s startup tax. For a realistic ML project with ~80 dependencies, a clean install that takes poetry 3-5 minutes takes uv 10-20 seconds.
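
The content-addressable idea is worth a tiny sketch (a toy model, not uv's actual on-disk layout): key each wheel by the hash of its bytes, and identical wheels collapse to a single stored copy no matter how many projects depend on them.

```python
import hashlib

# Toy content-addressable store: blobs are keyed by the sha256 of their bytes.
store: dict[str, bytes] = {}

def put(blob: bytes) -> str:
    key = hashlib.sha256(blob).hexdigest()
    store.setdefault(key, blob)  # identical content is stored only once
    return key

wheel = b"numpy-2.1.3 wheel bytes (stand-in)"
k1 = put(wheel)  # project A installs numpy
k2 = put(wheel)  # project B installs the same wheel
assert k1 == k2 and len(store) == 1  # one stored copy serves both projects
```

Installation then becomes a link or copy out of the store, which is why repeat installs with a warm uv cache are near-instant.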

[Figure: clean-install times for an ~80-dependency ML project: pip ~150 s, poetry ~90 s, uv ~15 s. Times are representative.]
uv's Rust resolver, parallel downloader, and content-addressable wheel cache produce 10–100× faster installs than pip or poetry — the difference compounds across every CI run and Docker build in the team's pipeline.

105.3 uv.lock and lockfile semantics

A lockfile is a frozen resolution of the dependency graph — every direct and transitive dependency, pinned to an exact version, with a hash of the wheel. The point is reproducibility: anyone who installs from the lockfile gets byte-identical packages. This is the foundation of “it works on my machine” actually meaning “it works on every machine.”
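
The hash in the lockfile is what turns “same version” into “same bytes”: at install time, the downloaded artifact must match the recorded digest. A minimal sketch of that check, using the sha256:<hex> format uv.lock records (the wheel bytes here are stand-ins):

```python
import hashlib

def verify(blob: bytes, locked: str) -> bool:
    # locked looks like "sha256:ab12..." as recorded in uv.lock
    algo, _, digest = locked.partition(":")
    return hashlib.new(algo, blob).hexdigest() == digest

wheel = b"pretend wheel contents"
recorded = "sha256:" + hashlib.sha256(wheel).hexdigest()
assert verify(wheel, recorded)            # untampered artifact passes
assert not verify(b"tampered", recorded)  # any byte change fails the install
```

Any change to the artifact, whether a registry compromise or a silently re-uploaded wheel, makes the install fail loudly instead of succeeding differently.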

uv’s lockfile is uv.lock, a TOML file that lists every package, its version, its source (PyPI, git, local path), the hashes of the installable artifacts, and the resolved dependency markers (Python version, platform). A simplified excerpt:

version = 1
requires-python = ">=3.12"

[[package]]
name = "numpy"
version = "2.1.3"
source = { registry = "https://pypi.org/simple" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/.../numpy-2.1.3-cp312-cp312-manylinux_2_17_x86_64.whl", hash = "sha256:abc..." },
    { url = "https://files.pythonhosted.org/packages/.../numpy-2.1.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:def..." },
]

[[package]]
name = "torch"
version = "2.5.1"
source = { registry = "https://download.pytorch.org/whl/cu121" }
dependencies = [
    { name = "filelock" },
    { name = "numpy" },
    { name = "sympy" },
    # ...
]

The lockfile is multi-platform. It records wheels for Linux, macOS, Windows, and different architectures, so the same lockfile works on a macOS laptop and a Linux CI runner. This is a step up from Poetry, which had constant problems with platform-specific resolutions.

Key behaviors:

  • uv sync installs the environment from uv.lock, re-resolving first if pyproject.toml has changed. Deterministic. Used in CI and Docker builds.
  • uv sync --locked is the same but fails if the lockfile is out of date with respect to pyproject.toml; use this in CI to catch drift. uv sync --frozen also installs straight from the lockfile but skips the consistency check, which suits Docker builds.
  • uv lock regenerates the lockfile. Used when dependencies change.
  • uv add package adds a dependency and updates the lockfile in one step.
  • uv lock --upgrade-package <name> bumps one dependency within the version constraints in pyproject.toml; uv lock --upgrade re-resolves everything.

The discipline: pyproject.toml is the source of truth for what’s allowed ("numpy>=2.0"), uv.lock is the pinned resolution (numpy==2.1.3). Both are checked in. CI runs uv sync --locked to guarantee no drift. This is the same contract-first discipline as Wire (Chapter 103) or buf (Chapter 104): the spec and the resolved output are both versioned and both checked.
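
The allowed-versus-pinned split can be made concrete with a toy >= check (real tools parse the full PEP 440 grammar, e.g. via the packaging library; this sketch handles only plain dotted versions):

```python
def _as_tuple(version: str) -> tuple[int, ...]:
    return tuple(int(part) for part in version.split("."))

def satisfies_ge(pinned: str, floor: str) -> bool:
    # Does the pinned resolution fall within a ">=floor" constraint?
    return _as_tuple(pinned) >= _as_tuple(floor)

assert satisfies_ge("2.1.3", "2.0")       # numpy>=2.0 permits the pin 2.1.3
assert not satisfies_ge("1.26.4", "2.0")  # an old pin would violate the spec
```

uv lock picks one concrete version inside the allowed range and records it; installs then reuse that exact resolution rather than re-deciding.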

105.4 Virtual environments done right

Virtualenvs exist because system Python is shared and version-pinning packages globally is a disaster. A virtualenv is a directory containing a Python interpreter symlink and a site-packages/ directory that the interpreter uses instead of the system one. Activating a virtualenv means prepending its bin/ to $PATH so python and pip point to the venv’s copies.
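
This anatomy is easy to verify with the standard library’s venv module: the created directory holds a pyvenv.cfg that points back at the base interpreter, plus a bin/ (Scripts/ on Windows) containing the python entry point.

```python
import sys
import tempfile
import venv
from pathlib import Path

root = Path(tempfile.mkdtemp()) / ".venv"
venv.EnvBuilder(with_pip=False).create(root)  # skip pip to keep this fast

cfg = (root / "pyvenv.cfg").read_text()
assert "home = " in cfg  # records where the base interpreter lives

bindir = root / ("Scripts" if sys.platform == "win32" else "bin")
python_name = "python.exe" if sys.platform == "win32" else "python"
assert (bindir / python_name).exists()  # activation puts this dir on $PATH
```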

uv makes virtualenvs almost invisible. You rarely activate one manually; you just run uv run <command> and uv handles the rest. By convention, uv creates the virtualenv in .venv/ at the project root. Every uv run command ensures the venv is up to date with the lockfile (or creates it if it doesn’t exist) and executes the command inside it.

uv run pytest                    # runs pytest in the project venv
uv run python -m src.server      # runs the service
uv run --with rich python -c 'from rich import print; print("hi")'  # ad-hoc

This workflow is transformative. source .venv/bin/activate was always a footgun — forgetting to activate meant running commands against system Python and getting confusing errors. With uv run, the venv is implicit, always correct, and reproducible. For scripts and one-off commands, uv run with --with lets you add a dependency ephemerally without polluting the project.

For container builds (Chapter 102), the virtualenv lives inside the image. The multi-stage pattern is:

FROM python:3.12-slim AS builder
RUN pip install --no-cache-dir uv==0.5.14
WORKDIR /app
COPY pyproject.toml uv.lock ./
RUN uv sync --frozen --no-dev --no-install-project
COPY src/ ./src/
RUN uv sync --frozen --no-dev

FROM gcr.io/distroless/python3-debian12:nonroot
WORKDIR /app
COPY --from=builder /app/.venv /app/.venv
COPY --from=builder /app/src /app/src
ENV PATH="/app/.venv/bin:$PATH"
USER nonroot
ENTRYPOINT ["python", "-m", "src.server"]

The builder stage installs uv, syncs deps to a venv in /app/.venv, and copies the source. The runtime stage is a distroless Python image with just the venv and the source. The image is 100-300 MB depending on dependencies — painfully large by Go standards but tiny by traditional Python-in-Ubuntu standards.

105.5 pyproject.toml — the modern project layout

pyproject.toml is the standard Python project metadata file, defined by PEP 518 and PEP 621. It replaces setup.py, setup.cfg, requirements.txt, and Pipfile. A canonical modern Python project looks like:

my-service/
├── pyproject.toml
├── uv.lock
├── README.md
├── src/
│   └── my_service/
│       ├── __init__.py
│       ├── main.py
│       └── handlers.py
└── tests/
    ├── __init__.py
    └── test_handlers.py

The src/ layout (source inside a src/ directory, not at the top level) is the standard. It avoids the “you imported the local copy instead of the installed one” bug that plagued flat layouts — pytest can’t accidentally import my_service from ./my_service because there is no such directory.

A full pyproject.toml for a modern service:

[project]
name = "my-service"
version = "0.1.0"
description = "An HTTP service."
requires-python = ">=3.12"
dependencies = [
    "fastapi>=0.115",
    "uvicorn[standard]>=0.32",
    "pydantic>=2.9",
    "httpx>=0.27",
    "structlog>=24.4",
]

[project.optional-dependencies]
dev = [
    "pytest>=8.3",
    "pytest-asyncio>=0.24",
    "ruff>=0.8",
    "mypy>=1.13",
]

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[tool.ruff]
line-length = 100
target-version = "py312"

[tool.ruff.lint]
select = ["E", "F", "W", "I", "UP", "B", "SIM", "C4", "N", "ARG", "PL"]
ignore = ["E501"]

[tool.mypy]
strict = true
python_version = "3.12"

Every tool reads its config from pyproject.toml. No more scattered .flake8, setup.cfg, pytest.ini, mypy.ini files. One file, one source of truth.

105.6 ruff — the all-in-one linter

Ruff is the second piece of the Astral stack. Before ruff, Python linting meant running flake8, pylint, isort, pyflakes, pycodestyle, pydocstyle, bandit, and maybe half a dozen others. Each had its own config, its own pace of development, its own speed. Running them on a large codebase took minutes. Some were fast (flake8), some were painfully slow (pylint), and the rules overlapped inconsistently.

Ruff re-implemented every one of these linters in Rust, in one binary. As of 2025 it has 800+ rules covering everything flake8, pylint, isort, bandit, pydocstyle, pyupgrade, and more do. It runs the full suite on a ~100 kLOC project in under a second. For comparison, pylint on the same project is a 90-second wait.

graph LR
  Old[Old linter zoo] -->|replaced by| Ruff[ruff<br/>one binary]
  subgraph Old
    F[flake8] 
    P[pylint]
    I[isort]
    B[bandit]
    BL[black]
  end
  subgraph Ruff
    direction TB
    R1[800+ rules]
    R2[sub-second on 100 kLOC]
    R3[autofix mode]
    R4[ruff format<br/>black-compatible]
  end

Ruff collapses five independent linter processes into one Rust binary. The speed is what makes it a new kind of tool: fast enough to run on every keystroke in an editor, not just in CI, which shifts lint catches from pipeline time to typing time.

Selecting rule sets is done via short codes:

  • E, W — pycodestyle errors and warnings
  • F — pyflakes
  • I — isort (import sorting)
  • N — pep8-naming
  • UP — pyupgrade (modernize syntax)
  • B — flake8-bugbear (common bugs)
  • C4 — flake8-comprehensions
  • SIM — flake8-simplify
  • ARG — flake8-unused-arguments
  • PL — pylint
  • S — flake8-bandit (security)
  • ASYNC — flake8-async

A reasonable default for a new project: ["E", "F", "W", "I", "UP", "B", "SIM", "C4", "N"]. Add PL and ARG if you want stricter. Add S if security matters. Start tight, relax as needed.

Ruff also has an autofix mode (ruff check --fix) that rewrites code for any rule that has a mechanical fix. ruff check --fix . will reorder imports, remove unused ones, upgrade syntax from List[int] to list[int], convert %s formatting to f-strings, etc. For migrating an old codebase to modern Python, this is a miracle.
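
To make the autofix concrete, here is code in the legacy style next to roughly what the UP and C4 rule families rewrite it into (illustrative; the exact diffs depend on selected rules and target version):

```python
from typing import List  # UP006: unnecessary on Python 3.9+

def greet_old(names: List[str]) -> List[str]:
    # UP031 flags %-formatting; C417 flags the unnecessary map(lambda ...)
    return list(map(lambda n: "hi %s" % n, names))

# Roughly what `ruff check --fix` modernizes this into:
def greet_new(names: list[str]) -> list[str]:
    return [f"hi {n}" for n in names]

# Behavior is unchanged; only the style is modernized.
assert greet_old(["ada", "grace"]) == greet_new(["ada", "grace"])
```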

The upshot: ruff replaces the old zoo of linters with one binary, configured in pyproject.toml, with speed that makes it feasible to run on every save. It is the most impactful dev-tools change Python has had in the last decade.

105.7 ruff format vs black

Black was the first popular opinionated formatter for Python (2018). Its motto was “the uncompromising code formatter” — you configure nothing, black just formats your code the One True Way, and the bikeshedding ends. Black’s adoption curve was fast because the debate about tab width and quote style was exhausting and black simply ended it.

Ruff added a formatter in 2024 (ruff format) that is designed to be byte-compatible with black. The output is identical or nearly identical for the same code, but ruff format is ~30x faster. Same philosophy, same rules, different implementation.

The state of the art in late 2025: new projects use ruff format because it’s faster and is in the same binary as the linter. Existing black projects can migrate with zero changes — the output is compatible. Some teams still use black out of institutional inertia, and that’s fine; the result is identical.

The point is that the formatting debate is over. Every Python project should use either black or ruff format, in CI, as a hard gate. PRs that don’t pass the formatter don’t merge. Arguing about style is wasted energy when the tool has an opinion and you can just take it.

Run it:

uv run ruff format .
uv run ruff check --fix .

Two commands. The codebase is now formatted and lint-clean. CI runs the same commands with --check flags:

uv run ruff format --check .
uv run ruff check .

and fails if either has changes to make. This is the minimum CI surface for a Python project in 2025.

105.8 Pinning Python versions

uv manages Python itself, which is new. Previously you used pyenv or conda or just apt install python3.12. Each had problems: pyenv required compiling Python from source (slow, fragile); conda was its own ecosystem; system Python was tied to the OS release.

uv uses the python-build-standalone project, which publishes pre-built Python binaries for every platform. uv python install 3.12 downloads a ~30 MB tarball and extracts it. No compilation. It works on Linux, macOS, Windows. The Python version is written to pyproject.toml via requires-python, and uv ensures the right version is used.

uv python install 3.12
uv python list                 # see what's installed
uv python pin 3.12             # pin this project to 3.12
uv venv --python 3.12          # create a venv with that version

The .python-version file (pinned by uv python pin) tells uv which version to use in this project. CI runners just need uv installed; they don’t need Python pre-installed. The uv binary handles everything.

This is a subtle but transformative change. Before, “the Python version for this project” was a loose convention enforced by README notes and Docker base images. Now it’s a pinned artifact, managed by the tool, the same way Go modules pin their Go version in go.mod. One more source of “works on my machine” bugs eliminated.

105.9 uv for Docker builds and the speedup

Inside a Docker build, uv’s speed matters more than anywhere else. A typical Python image rebuild with poetry takes 3-10 minutes dominated by dependency installation. With uv, the same build takes 30-90 seconds. Multiply that across a CI pipeline that rebuilds images on every commit, and the time savings are hours per week for a busy team.

The Docker layering pattern (from §105.4, repeated with more detail):

# Stage 1: install dependencies
FROM python:3.12-slim AS builder
RUN pip install --no-cache-dir uv==0.5.14

WORKDIR /app

# Copy only the files needed for dep resolution first.
# This layer is cached unless pyproject.toml or uv.lock change.
COPY pyproject.toml uv.lock ./
RUN uv sync --frozen --no-dev --no-install-project

# Now copy the source and install the project itself.
# This layer invalidates on every source change but is fast.
COPY src/ ./src/
RUN uv sync --frozen --no-dev

# Stage 2: runtime
FROM gcr.io/distroless/python3-debian12:nonroot
WORKDIR /app
COPY --from=builder /app/.venv /app/.venv
COPY --from=builder /app/src /app/src
ENV PATH="/app/.venv/bin:$PATH"
ENV PYTHONPATH="/app/src"
USER nonroot
ENTRYPOINT ["python", "-m", "my_service.main"]

The key is the two-step uv sync: first --no-install-project installs only the dependencies (cacheable), then another uv sync after copying source installs the project itself (fast because dependencies are already there). On unchanged dependencies, the first layer hits the Docker cache and the second layer runs in under a second.

Additional speedups:

  • Cache mounts: RUN --mount=type=cache,target=/root/.cache/uv uv sync ... caches the uv download cache across builds.
  • Compiled bytecode: uv sync --compile-bytecode pre-compiles .pyc files during install, saving startup time at container runtime.

The final image size is determined by dependency size. An image with PyTorch, transformers, and CUDA libraries is going to be 5+ GB no matter what tool you use. An image with just FastAPI and some stdlib is under 100 MB. The tool does not change the dependency footprint; only the compile-and-install time.

105.10 The CI pipeline for a modern Python project

Putting it together, a complete CI pipeline for a Python service looks like:

# .github/workflows/ci.yml (GitHub Actions flavor, adapt to your CI)
name: ci
on: [push, pull_request]
jobs:
  check:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Install uv
        uses: astral-sh/setup-uv@v3
        with:
          version: "0.5.14"

      - name: Install Python
        run: uv python install 3.12

      - name: Sync dependencies (locked)
        run: uv sync --locked --all-extras

      - name: Format check
        run: uv run ruff format --check .

      - name: Lint
        run: uv run ruff check .

      - name: Type check
        run: uv run mypy src/

      - name: Test
        run: uv run pytest --cov=src --cov-report=xml

      - name: Build Docker image
        run: docker build -t my-service:${{ github.sha }} .

Every step is one command. uv sync --locked guarantees reproducibility and catches lockfile drift. ruff format --check and ruff check gate style and lint. mypy gates types. pytest gates tests. Docker build gates the image. Total runtime on a medium-sized project: 1-3 minutes, dominated by tests.

Compare to the pre-uv world: 5-10 minutes of pip install, cryptic errors about wheel hashes, flaky network downloads, a different flake8 configuration than the developer was using, and half the time a version of black in CI that disagreed with the developer’s local version. All of that is gone with uv + ruff. The Python CI pipeline is finally as fast and as reliable as the Go or Rust ones.

105.11 The mental model

Eight points to take into Chapter 106:

  1. uv replaced the entire Python packaging toolchain. One binary for deps, venvs, Python installation.
  2. 10-100× speedup over poetry/pip. The first tool that makes Python CI feel fast.
  3. uv.lock is the real lockfile. Multi-platform, reproducible, checked in.
  4. uv sync --locked in CI, --frozen in Docker builds. Deterministic installs, no drift.
  5. uv run <cmd> replaces virtualenv activation. Implicit, correct, reproducible.
  6. ruff is the all-in-one linter. 800+ rules, sub-second runtime, autofix mode.
  7. ruff format is black-compatible and faster. Formatting debate is over.
  8. Modern Python project = pyproject.toml + src/ layout + uv + ruff + mypy + pytest. Everything in one config, one tool, one CI pipeline.

Chapter 106 shifts from building the artifact to what happens to it after: the OCI image lifecycle, digests, and the registry pipeline that moves images from build to deploy.


Read it yourself

  • Astral’s uv documentation (docs.astral.sh/uv). Short and dense. Read “Projects” and “Tools” sections first.
  • Astral’s ruff documentation (docs.astral.sh/ruff). The “Rules” reference lists all 800+ rules with examples.
  • PEP 621, Storing project metadata in pyproject.toml. The canonical reference for the project metadata format.
  • The python-build-standalone project README on GitHub. Explains how the portable Python binaries are built.
  • Effective Python, 3rd edition (Brett Slatkin). Broader Python craft reference; the tooling chapter mentions uv and ruff.
  • Charlie Marsh’s RustConf 2023 talk Python tooling could be much, much faster. The origin-story pitch for ruff and uv.

Practice

  1. Create a new Python project with uv init my-service, add fastapi and httpx, and examine the generated pyproject.toml and uv.lock.
  2. Run uv run pytest on an empty project. Where does the virtualenv live? What files did uv create?
  3. Write a pyproject.toml with strict ruff rules enabled (["E", "F", "W", "I", "UP", "B", "SIM", "C4", "N", "PL", "ARG"]). Run ruff check on a sample file.
  4. Measure the time difference between poetry install and uv sync on the same set of dependencies (e.g., the deps of a typical FastAPI project). Report the ratio.
  5. Write a multi-stage Dockerfile using uv that produces a distroless Python image for a FastAPI app. Measure the final image size.
  6. Pin a project to Python 3.12 with uv python pin. What files were created? What happens if you pin 3.11 while requires-python says ">=3.12"?
  7. Stretch: Migrate an existing Poetry project to uv. Document what changed in the lockfile, the install time, and the CI pipeline.