Modern Python Packaging with pyproject.toml and uv

You've built a cool script. It works great on your machine. Now what? How do you share it with the world? More importantly, how do you manage your project's dependencies, build it for distribution, and upload it to PyPI so others can pip install your creation?
This sounds like it should be simple. You wrote the code. The code works. Just... send it to people, right? If you've tried this before, you already know where this is heading. Python packaging has historically been one of those areas where the community spent years arguing about the right approach while developers quietly suffered through setup.py files that ran arbitrary code, confusing MANIFEST.in rules that seemed to have their own hidden logic, and a landscape of competing tools that all claimed to be the modern solution but disagreed on everything. You had distutils, then setuptools, then distribute, then setuptools again, plus pip, pipenv, poetry, flit, wheel, twine, and about a dozen other tools all jostling for position. Just figuring out which combination to use was a full afternoon's work.
Here's the thing though, that era is largely over. The Python community converged on a set of PEPs (Python Enhancement Proposals) that defined a standard configuration format, and then a new generation of tooling showed up that actually made the whole experience pleasant. Today we have pyproject.toml as the unified configuration standard and uv as the modern build and package management tool that makes everything fast and sane. If you're starting a new Python project in 2024 or beyond, this is the stack you want. It handles everything from dependency management during development to building distributable packages to uploading to PyPI, and it does it without making you feel like you're navigating a poorly-documented minefield.
In this article we're going to build a real project from scratch and get it ready for PyPI. We'll cover not just the mechanics but the reasoning behind every decision, because understanding the "why" is what lets you adapt when things don't go exactly as expected.
Table of Contents
- Why pyproject.toml Won
- Understanding uv vs pip vs poetry
- The Anatomy of pyproject.toml
- The Build System
- Project Metadata
- Dependencies
- Optional Dependencies
- Entry Points (Console Scripts)
- Building with uv
- Installing Your Package Locally (Editable Mode)
- Building the Distribution
- Versioning and Version Bumping
- Packaging for Distribution
- Installing from PyPI
- Publishing to PyPI
- Creating an API Token
- Configuring uv
- Publishing
- A Complete Example
- Common Packaging Mistakes
- What's Next?
Why pyproject.toml Won
Let's talk history for a moment, because understanding why the old way was broken helps you appreciate why the new way is better.
Back in the early days, Python packaging was handled by distutils, which was built into the standard library. It was functional but bare-bones, so setuptools emerged as a community extension with more features. Packaging a project meant writing a setup.py file, an actual Python script, that called setup() with your project metadata as arguments. This sounds fine until you realize that because it was executable Python code, tools couldn't reliably parse your metadata without actually running it. That made it impossible to safely inspect a package's dependencies without executing potentially arbitrary code, which is a security nightmare. Then setup.cfg came along as a way to move some of that metadata into a declarative format, but you often still needed setup.py for anything beyond the basics. So now you had two files, neither of which was complete on its own, and the relationship between them was confusing.
The Python community finally got serious about fixing this with PEP 517, PEP 518, and eventually PEP 660. These standards defined a way to declare build requirements separately from the package metadata itself, and they specified that the configuration should live in a file called pyproject.toml, a declarative format using TOML (Tom's Obvious, Minimal Language) that any tool could read without executing anything. The key insight was separating the "what is this package" metadata from the "how do we build this package" mechanics, and making both machine-readable in a safe way.
The result is a world where you have one file that tells every tool everything it needs to know about your project. Your formatter reads its config from pyproject.toml. Your linter reads from pyproject.toml. Your test runner reads from pyproject.toml. Your build tool reads from pyproject.toml. You stop hunting through five different config files and start finding everything in one place. That's not a small quality-of-life improvement, over the course of a project, it genuinely reduces cognitive overhead and the "where does this go again?" friction that slows you down.
Understanding uv vs pip vs poetry
Before we dive into the configuration, let's get oriented on the tooling landscape, because "which tool should I use" is genuinely one of the first questions people ask and the answer has changed significantly in the last couple of years.
pip is Python's original package installer and it's still everywhere. It's reliable, it's what everyone knows, and it integrates with the ecosystem by default. The main complaints about pip are that it's slow compared to newer alternatives, it doesn't do dependency locking out of the box (you need pip freeze and a separate requirements.txt), and it doesn't manage virtual environments for you. It's a sharp tool that does one job well, but modern workflows often need more than just package installation.
poetry was the dominant "modern" solution for several years and it's still widely used. Poetry bundles dependency management, virtual environment creation, version bumping, and publishing into a single workflow. It introduced poetry.lock for reproducible installs and made the development experience significantly more pleasant than plain pip. The downsides: it's opinionated about project structure, it has its own resolver that sometimes makes different choices than pip, and it's written in Python which means it can itself have dependency conflicts. Poetry is a solid choice, especially if you're working on an existing project that already uses it.
uv is the new kid on the block and it's genuinely impressive. It's written in Rust, which means it's extremely fast, we're talking 10-100x faster than pip for many operations, and noticeably faster than poetry too. It implements the same standards as pip so it works with existing pyproject.toml files and requirements.txt without any migration. It supports editable installs, dependency locking, workspace management, and tool installation. The API is largely compatible with pip, so the learning curve is minimal if you already know pip. For new projects, uv is the clear choice in 2024. It's what we'll use throughout this article.
The short version: if you're starting fresh, use uv. If you're on an existing poetry project, poetry is still fine. If you're on pip, consider whether the speed and workflow improvements of uv are worth the switch for your team.
The Anatomy of pyproject.toml
Let's start by building a simple project. Imagine we're creating a CLI tool called weathercli that fetches weather data.
Here's what a modern pyproject.toml looks like:
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[project]
name = "weathercli"
dynamic = ["version"]  # the version is read from a file; see [tool.hatch.version] below
description = "A lightweight CLI tool for fetching weather data"
readme = "README.md"
requires-python = ">=3.9"
authors = [
{ name = "Your Name", email = "you@example.com" }
]
license = { text = "MIT" }
keywords = ["weather", "cli", "api"]
classifiers = [
"Development Status :: 3 - Alpha",
"Intended Audience :: Developers",
"License :: OSI Approved :: MIT License",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
]
dependencies = [
"requests>=2.28.0",
"click>=8.1.0",
]
[project.optional-dependencies]
dev = [
"pytest>=7.0",
"black>=23.0",
"ruff>=0.1.0",
]
docs = [
"sphinx>=5.0",
]
[project.urls]
Homepage = "https://github.com/yourname/weathercli"
Documentation = "https://weathercli.readthedocs.io"
Repository = "https://github.com/yourname/weathercli.git"
Issues = "https://github.com/yourname/weathercli/issues"
[project.scripts]
weathercli = "weathercli.cli:main"
[tool.hatch.version]
path = "src/weathercli/__init__.py"
That's a lot to unpack. Let's break it down section by section.
The Build System
The very first thing in your pyproject.toml is the [build-system] table, and it's arguably the most important piece. This is what PEP 517 and 518 were all about, separating "how do we build this" from "what is this package."
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
This tells Python which tool to use for building. We're using Hatchling, which is modern, fast, and actively maintained. Other options include setuptools and flit, but Hatchling is our pick here because it's simpler and faster than setuptools without sacrificing flexibility.
The requires array lists what needs to be installed to build your project. It runs in an isolated environment, so you don't pollute your system. This is a crucial design decision, your build tools are isolated from your runtime dependencies, which means you can upgrade one without accidentally breaking the other.
Project Metadata
The [project] section is where you describe your package. Every field here ends up on your PyPI page and helps users find and understand your project.
- name: How it appears on PyPI. Must be unique. Typically lowercase with hyphens.
- version: Follows semantic versioning (more on that below). Can be static or read from a file.
- description: A one-liner that appears in search results.
- readme: Path to your README file. Can be markdown or reStructuredText.
- requires-python: Minimum Python version. Users without this version can't install it.
- authors: Who made it. Can list multiple people.
- license: License type. Useful for legal compliance.
- keywords: For searchability on PyPI.
- classifiers: Structured metadata describing the package. PyPI uses these for filtering. The list above tells PyPI this is an alpha project for developers, uses MIT license, and supports Python 3.9-3.12.
Dependencies
Dependencies are where you tell Python what other packages your project needs to run. Get this section right and installation is seamless. Get it wrong and you'll get angry bug reports from users who can't figure out why things are broken.
dependencies = [
"requests>=2.28.0",
"click>=8.1.0",
]
These are runtime dependencies, packages required to use your project. Notice the version specifiers (>=2.28.0). This is important:
- >=2.28.0 means "at least 2.28.0, newer is fine"
- ==2.28.0 means "exactly this version"
- >=2.28.0,<3.0 means "2.28.0 or newer, but not 3.0+"
- ~=2.28.0 means "compatible with 2.28.0" (allows patch updates, same major.minor)
Be thoughtful here. If you're too strict, users can't upgrade other packages. If you're too loose, you might break when dependencies change. The generally-accepted wisdom is to specify a lower bound (the version you've tested against) and let newer versions through unless there's a known breaking change. For well-maintained packages that follow semantic versioning, >=X.Y.0 is usually the right call.
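If you ever want to sanity-check these specifier semantics, the packaging library (pip installable; it implements the standard version-specifier rules that the rest of the ecosystem follows) can evaluate them for you — a small sketch:

```python
# Checking which versions satisfy a specifier, using the `packaging` library
# (assumed installed: pip install packaging).
from packaging.specifiers import SpecifierSet

spec = SpecifierSet(">=2.28.0,<3.0")
print("2.28.0" in spec)  # True: exactly the lower bound
print("2.31.4" in spec)  # True: newer, still below 3.0
print("3.0.0" in spec)   # False: excluded by the upper bound

compatible = SpecifierSet("~=2.28.0")
print("2.28.9" in compatible)  # True: patch releases are allowed
print("2.29.0" in compatible)  # False: a minor bump is not
```

This is a handy way to double-check a constraint before you publish it and inflict it on your users.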
Optional Dependencies
Not every user of your package needs everything. A CLI tool might ship with optional tab-completion support. A data library might have optional visualization dependencies. Optional dependencies let you ship a lightweight core package and let users add what they actually need.
[project.optional-dependencies]
dev = [
"pytest>=7.0",
"black>=23.0",
"ruff>=0.1.0",
]
docs = [
"sphinx>=5.0",
]
These are dependencies that aren't required, but are useful for specific purposes. Users can install them with:
pip install weathercli[dev]
pip install weathercli[dev,docs]
This keeps your base install lightweight while providing everything people need for development. (One small gotcha: in shells like zsh, square brackets are glob characters, so quote the argument: pip install 'weathercli[dev]'.)
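Under the hood, a requirement string like weathercli[dev,docs] is structured data that installers parse. Here's a sketch of how, again using the packaging library (assumed installed):

```python
# How installers parse an extras-bearing requirement string.
from packaging.requirements import Requirement

req = Requirement("weathercli[dev,docs]>=0.1")
print(req.name)            # weathercli
print(sorted(req.extras))  # ['dev', 'docs']
print(str(req.specifier))  # >=0.1
```

The extras are just labels: the installer looks each one up in your [project.optional-dependencies] table and adds those packages to the install set.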
Entry Points (Console Scripts)
One of the genuinely delightful features of Python packaging is the entry points system. When you define a console script in your pyproject.toml, Python creates a real system command that users can run from their terminal, without them having to know anything about Python module paths.
[project.scripts]
weathercli = "weathercli.cli:main"
This is pure magic. After installation, users get a weathercli command that directly calls the main() function from your weathercli.cli module. No wrapper script needed. The format is command-name = "module.submodule:function". You can define multiple commands this way, a package can expose a weathercli command for end users and a weathercli-admin command for administrators, all from the same pyproject.toml.
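The module:function spec is simple enough to resolve by hand, which demystifies what the generated launcher actually does. The real launcher script that pip or uv writes is a bit more elaborate, but its core is just this import-and-call, sketched here with a standard-library function standing in for weathercli:

```python
# Resolving a "module.submodule:function" entry-point spec by hand.
import importlib

def load_entry_point(spec: str):
    """Turn 'module.submodule:function' into the actual callable."""
    module_path, _, func_name = spec.partition(":")
    module = importlib.import_module(module_path)
    return getattr(module, func_name)

# Demonstrated with the standard library rather than weathercli:
dumps = load_entry_point("json:dumps")
print(dumps({"city": "Oslo"}))  # {"city": "Oslo"}
```

When you type weathercli in your terminal, the installed launcher does essentially load_entry_point("weathercli.cli:main") and then calls the result.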
Building with uv
Now that our pyproject.toml is set up, let's use uv to build the project. First, verify your project structure:
weathercli/
├── pyproject.toml
├── README.md
├── src/
│   └── weathercli/
│       ├── __init__.py
│       ├── cli.py
│       └── api.py
└── tests/
    ├── test_cli.py
    └── test_api.py
The src/ layout is standard for modern Python projects. It prevents accidental imports of your package before it's installed. This matters more than it sounds, without src/, if you're running tests from the project root and your package directory is also in the root, Python might import your local source instead of the installed version. That can mask bugs that only appear after installation, which is exactly the kind of problem you want to catch before you publish. The src/ layout makes this impossible by design.
Installing Your Package Locally (Editable Mode)
During development, you want to be able to test your code without rebuilding and reinstalling after every change. That's what editable installs are for.
uv pip install -e .
The -e flag means "editable." Now when you modify your code, you don't need to reinstall. This is a game-changer for development speed. Under the hood, editable installs work by registering a pointer to your source directory rather than copying your files into the site-packages directory. So when Python imports your package, it reads directly from your source. Edit your code, run the command again, see the result, no reinstall step in the middle.
Building the Distribution
When you're ready to share your package, whether with your team or the whole internet, you need to build it. Building packages creates standardized, installable artifacts that pip and uv can consume.
uv build
This creates two files in a dist/ directory:
dist/
├── weathercli-0.1.0-py3-none-any.whl # The wheel (built distribution)
└── weathercli-0.1.0.tar.gz # The source distribution
A wheel (.whl) is a pre-built distribution: installing it is just an unpack, with no build step on the user's machine. A source distribution (.tar.gz) is the raw source code; users' pip will build it on install if no wheel is available.
You almost always want both. The wheel is faster for users, the source dist ensures maximum compatibility. Some platforms or configurations can't use a wheel built on a different system (especially if your package includes C extensions), so the source distribution serves as the universal fallback.
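The wheel filename itself is structured metadata: name, version, Python tag, ABI tag, and platform tag, separated by hyphens. A small sketch of how a resolver reads it (real parsers also handle an optional build tag; this assumes the common five-part form):

```python
# Decoding the standard five-part wheel filename.
def parse_wheel_filename(filename: str) -> dict:
    """Split a wheel filename into its tag components."""
    stem = filename.removesuffix(".whl")
    name, version, python_tag, abi_tag, platform_tag = stem.split("-")
    return {
        "name": name,
        "version": version,
        "python": python_tag,      # py3 = any Python 3
        "abi": abi_tag,            # none = no compiled ABI dependency
        "platform": platform_tag,  # any = every OS and architecture
    }

info = parse_wheel_filename("weathercli-0.1.0-py3-none-any.whl")
print(info["platform"])  # any
```

py3-none-any is the signature of a pure-Python wheel: one artifact works everywhere. Packages with C extensions instead publish one wheel per platform/Python combination.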
Versioning and Version Bumping
Semantic versioning follows MAJOR.MINOR.PATCH:
- MAJOR: Breaking API changes
- MINOR: New features (backward compatible)
- PATCH: Bug fixes
So 0.5.3 signals a pre-1.0 project (0.x means the API may still change), on its fifth minor release and third patch.
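The bump rules are mechanical enough to sketch in a few lines. This is a hypothetical helper, not part of uv or Hatchling, just to make the semantics concrete (note how bumping a part resets everything below it):

```python
# Semantic-version bumping: raise one part, zero out the parts below it.
def bump_version(version: str, part: str) -> str:
    """Bump a MAJOR.MINOR.PATCH version string."""
    major, minor, patch = (int(p) for p in version.split("."))
    if part == "major":
        return f"{major + 1}.0.0"
    if part == "minor":
        return f"{major}.{minor + 1}.0"
    if part == "patch":
        return f"{major}.{minor}.{patch + 1}"
    raise ValueError(f"unknown part: {part}")

print(bump_version("0.5.3", "patch"))  # 0.5.4
print(bump_version("0.5.3", "minor"))  # 0.6.0
print(bump_version("0.5.3", "major"))  # 1.0.0
```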
In our pyproject.toml, we're reading the version from src/weathercli/__init__.py:
# src/weathercli/__init__.py
__version__ = "0.1.0"
When you're ready to release, bump the version in that file. The build backend (Hatchling) picks it up automatically the next time you build.
You can also hardcode it in pyproject.toml:
version = "0.1.0"
The choice is yours. (If you hardcode it, remove the [tool.hatch.version] table, a version can't come from two places at once.) Reading from a file keeps things DRY (Don't Repeat Yourself), but some teams prefer the explicit approach. There's a third option worth knowing about: dynamic versioning from git tags. With the right Hatchling plugin, your version can be derived automatically from git tags, which means you never have to manually edit a version number, you just tag a commit and the build system figures out the version. This approach scales well for teams doing frequent releases.
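Reading the version out of __init__.py is less mysterious than it sounds. Here's roughly what a build backend does under the hood — a simplified sketch, not Hatchling's actual implementation — and the key point is that it uses a regex rather than importing the package, so it works even when the package's dependencies aren't installed:

```python
# Extract __version__ from a source file without importing it.
import re
from pathlib import Path

def read_version(init_file: str) -> str:
    """Pull the __version__ string out of a Python source file."""
    text = Path(init_file).read_text()
    match = re.search(r'^__version__\s*=\s*["\']([^"\']+)["\']', text, re.MULTILINE)
    if match is None:
        raise ValueError(f"no __version__ found in {init_file}")
    return match.group(1)

# In the project above: read_version("src/weathercli/__init__.py") -> "0.1.0"
```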
Packaging for Distribution
Getting a package ready for public distribution involves more than just writing good code. You're also making promises, about what your package does, how it's licensed, what Python versions it supports, and how it will behave when people install it alongside their other packages. Let's talk about the pieces that matter most.
First, your README is your package's storefront. When someone finds your package on PyPI, the README is the first thing they read. Make it count. Include a clear description of what the package does, a quick-start code example showing the most common use case, installation instructions, and a link to more detailed documentation. A package with a good README gets adopted; a package with a confusing or missing README gets skipped, even if the code underneath is excellent.
Second, your classifiers aren't just metadata for metadata's sake, PyPI uses them to power filtering. If you don't include the right classifiers, your package won't show up when someone searches for Python 3.11-compatible packages, or MIT-licensed packages, or packages in the "Development Tools" category. Spend a few minutes browsing the full classifier list at pypi.org/classifiers/ and pick everything that accurately describes your project.
Third, think hard about what you put in dependencies versus optional-dependencies. The general principle is that dependencies should contain only what's required for the package to do its core job. If you're building a data processing library, requests probably shouldn't be a core dependency unless the library literally can't function without making HTTP calls. Optional extras let users bring in what they need without inflicting unnecessary transitive dependencies on everyone who installs your package. Your users' dependency resolvers will thank you.
Finally, think about what files get included in your distribution. By default, Hatchling includes your source code and your pyproject.toml. But you might also want to include your LICENSE file, your README (Hatchling will include this automatically if you reference it in [project]), and any data files your package needs at runtime. What you probably don't want to include are your test files, your CI configuration, or your editor config files. Keeping your distribution lean makes it faster to install and easier for users to understand.
Installing from PyPI
Once you've published (we'll cover that next), anyone can install your package:
pip install weathercli
Or with optional dependencies:
pip install weathercli[dev]
Publishing to PyPI
This is where it gets real. You'll need:
- A PyPI account (free at https://pypi.org)
- An API token for authentication (PyPI has phased out password-based uploads, so a token is effectively required, not just recommended)
Creating an API Token
Log into PyPI, go to Account Settings → API tokens, and create a new token scoped to your project. Copy it somewhere safe.
One important note here: use a project-scoped token rather than an account-wide token whenever possible. A project-scoped token can only upload to one specific project. If it gets leaked, the damage is contained. An account-wide token could be used to push malicious releases to any package you own, which is a much bigger risk. This is the kind of security hygiene that separates professional package maintainers from people who end up in breach notifications.
Configuring uv
Now configure your credentials. The traditional home for upload credentials is ~/.pypirc, which tools like twine read:
[distutils]
index-servers =
    pypi
[pypi]
repository = https://upload.pypi.org/legacy/
username = __token__
password = pypi-YOUR_ACTUAL_TOKEN_HERE
(Note the __token__ username is literal, and real PyPI tokens start with the pypi- prefix.) uv itself doesn't need this file, though: pass the token directly with uv publish --token, or set an environment variable:
export UV_PUBLISH_TOKEN="pypi-YOUR_ACTUAL_TOKEN_HERE"
Publishing
uv publish
That's it. Your package is now on PyPI. Verify by visiting:
https://pypi.org/project/weathercli/
Someone in the world can now run pip install weathercli and get your code. That's a genuinely satisfying moment, there's something about seeing your package live on PyPI that makes the whole thing feel real. Before you publish for the first time, it's worth testing the upload process against Test PyPI (test.pypi.org), which is a separate instance of PyPI specifically for testing. It needs its own account and token, and even Test PyPI won't accept the same version number twice, but you can otherwise experiment there freely, which lets you verify the whole flow without worrying about polluting the real package index.
A Complete Example
Let's put it all together. Here's a minimal but complete CLI tool:
# src/weathercli/__init__.py
__version__ = "0.1.0"
# src/weathercli/api.py
import requests

def get_weather(city: str) -> dict:
    """Fetch weather for a city from a free API."""
    url = f"https://wttr.in/{city}?format=j1"
    response = requests.get(url, timeout=10)  # don't hang forever on a dead connection
    response.raise_for_status()
    return response.json()

# src/weathercli/cli.py
import click

from .api import get_weather

@click.command()
@click.argument("city")
def main(city: str):
    """Fetch and display weather for a city."""
    try:
        data = get_weather(city)
        current = data["current_condition"][0]
        temp = current["temp_C"]
        desc = current["weatherDesc"][0]["value"]
        click.echo(f"{city}: {temp}°C, {desc}")
    except Exception as e:
        click.echo(f"Error: {e}", err=True)

if __name__ == "__main__":
    main()
Notice how cleanly this separates concerns: api.py handles the HTTP interaction and returns raw data, while cli.py handles user interaction and presentation. This separation makes testing much easier, you can test get_weather() independently of Click, and you can test the CLI output independently of the actual API call. Good packaging goes hand in hand with good code structure.
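That testability claim is worth making concrete. If you pull the formatting logic out into its own small function — a hypothetical format_report helper, not shown in the code above, but a natural refactor — you can unit-test it against a hand-built payload with no network access at all:

```python
# A hypothetical helper extracted from cli.py so the formatting can be
# tested without Click or HTTP.
def format_report(city: str, payload: dict) -> str:
    """Turn a wttr.in-style JSON payload into the one-line report the CLI prints."""
    current = payload["current_condition"][0]
    temp = current["temp_C"]
    desc = current["weatherDesc"][0]["value"]
    return f"{city}: {temp}°C, {desc}"

# A test needs only a dict shaped like the API response:
sample = {"current_condition": [{"temp_C": "15", "weatherDesc": [{"value": "Partly cloudy"}]}]}
assert format_report("New York", sample) == "New York: 15°C, Partly cloudy"
```

Tests like this run in milliseconds and never flake because of a slow weather API.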
After installing in editable mode (uv pip install -e .), you can test:
weathercli "New York"
Output:
New York: 15°C, Partly cloudy
That single command, weathercli "New York", is doing something impressive under the hood. Python found the installed entry point, called your main() function, which called get_weather(), which made an HTTP request, parsed JSON, and printed a formatted string. The fact that it all feels seamless is the payoff for the setup work we did in pyproject.toml.
Common Packaging Mistakes
Python packaging is one of those areas where the mistakes are subtle enough that you can ship something broken and not realize it until a user files a bug report. Here are the patterns we see most often.
Forgetting src/ layout: Putting your code in the root directory can cause import issues. Use src/ to prevent accidental imports before installation. Without it, your tests might be passing against your local source while hiding bugs that only appear in the installed version. The src/ layout makes the distinction between "my source code" and "installed package" explicit and unavoidable, which is exactly what you want.
Over-constraining versions: requests==2.28.0 is too strict and will cause headaches for users who have other packages that need a different version of requests. Use ranges like >=2.28.0,<3.0 instead. The only time you should pin to an exact version is if there's a known bug in other versions and you can document why the constraint is necessary.
Missing requires-python: Always specify the minimum Python version. Without it, pip will happily try to install your package on Python 2.7 or Python 3.7 and then fail with cryptic syntax errors when it hits f-strings or walrus operators. A clear requires-python = ">=3.9" gives users an immediate, actionable error message instead.
Inconsistent version numbering: Store the version in one place. Reading from __init__.py or a dedicated _version.py keeps things in sync. The worst outcome is having pyproject.toml say 0.2.0 and __init__.py say 0.1.5, then pip show weathercli gives a different answer than import weathercli; print(weathercli.__version__), and users file bugs because they think they're running one version when they're running another.
Skipping tests during development: Use pytest to catch breaking changes before you publish. Once your package is on PyPI, someone might have it pinned in their production environment. Breaking them with a careless patch is a good way to lose users fast.
Including sensitive files in your distribution: Check what's in your dist/ archives before publishing. If your project has a .env file with API keys, make sure it's not getting bundled. Don't assume your .gitignore protects you, whether it's honored varies by build backend (Hatchling consults VCS ignore files by default; other backends may not), so configure your build tool's include/exclude patterns explicitly and inspect the built archives if your project contains files that shouldn't be distributed.
Not testing the install before publishing: Run pip install dist/weathercli-0.1.0-py3-none-any.whl in a clean virtual environment before you push to PyPI. This catches a surprisingly large category of bugs, missing data files, broken imports, entry points that don't work, that only manifest in the installed form of your package.
What's Next?
You've now got a packaged, distributable Python project. You understand the why behind pyproject.toml, you know how to choose between uv, pip, and poetry, you can build and publish to PyPI, and you know what mistakes to avoid along the way. That's a solid foundation.
Here's the thing about packaging though, it's not a one-and-done skill. As your project grows, you'll run into new challenges: managing multiple packages in a monorepo (uv's workspace feature handles this elegantly), automating releases with GitHub Actions so every tagged commit goes straight to PyPI, handling packages with C extensions that need compilation, dealing with platform-specific wheels. Each of these is its own adventure, but they all build on the foundation we've covered here.
The next step in this series, logging and configuration, is crucial for building professional tools. Your CLI needs to tell users what's happening under the hood, and it needs a way to accept configuration without requiring users to edit source code. We'll look at Python's built-in logging module, how to configure it properly, and how to use environment variables and configuration files to make your tools flexible without being complicated. See you there.