
Vanilla Python Packaging

Whenever I build a project I try to use as many standardized, vanilla tools as I can. I like a minimal workflow and don’t want to learn several third-party tools just to package and deploy my project. The Python community has many different packagers and generally agrees that there are too many.

I wanted to share my setup, which covers everything I need, including:

  • Easy local development & testing with VSCode
  • Using built-in Python tools whenever possible
  • Simple deployment process
  • Easy to set up CI with GitHub Actions workflows

First of all, I have nothing against Poetry, uv, pdm, and the others. I just prefer not to learn more tools and more configuration, and to stick to standardized, built-in tooling wherever possible. As a result, I use setuptools only.

Here’s my pyproject.toml, in a project where I use FastAPI to create a web server:

[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"

[project]
name = "MyProject"
version = "0.0.1"
description = "MyProject"
authors = [
  { name = "Lev Dubinets", email = "myemail@domain.com" },
]
license = { file = "LICENSE" }
readme = "README.md"
classifiers = [
    "Programming Language :: Python :: 3",
    "License :: OSI Approved :: MIT License",
    "Operating System :: OS Independent",
]
requires-python = ">=3.7"
dependencies = [
  "fastapi==0.100.1",
  "uvicorn==0.23.2",
  "pydantic>=1.2.0,<2.0.0",
  "build==0.10.0",
  # ... more stuff

  # workers
  "celery==5.3.4",
  "redis==5.0.1",
  "flower==2.0.1",
  # ... more stuff

  # migrations
  "alembic==1.12.0",

  # devtools
  "isort==5.12.0",
  "pre-commit==3.4.0",
  "autoflake==2.2.1",
  "black==23.7.0"
]

[project.scripts]
myproject-alembic = "myproject.migrations.apply:main"

[tool.isort]
profile = "black"
src_paths = ["myproject"]

[tool.black]
line-length = 120

# anything else
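One nice consequence for local development (the VSCode point above): the same pyproject.toml supports a plain editable install with nothing but the standard venv module and pip. A minimal sketch:

```shell
# One-time local setup; only venv and pip, no extra tools
python3 -m venv venv
source venv/bin/activate
pip install --upgrade pip
pip install -e .   # editable install: code changes take effect without reinstalling
```

Point VSCode's Python interpreter at venv/bin/python and testing and debugging run against the live source tree.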

The associated file system structure looks like

├── dist
│   └── stuff here ...
├── scripts
│   └── stuff here ...
├── services
│   └── systemd .service files here ...
├── src
│   └── myproject
├── MANIFEST.in
└── pyproject.toml

My build step is then just python3 -m build. This populates the dist/ directory with an sdist and a wheel: everything needed to install and run the project on a server.
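The CI workflow further down calls this through ./scripts/build.sh; a sketch of what that wrapper can look like:

```shell
#!/usr/bin/env bash
# Sketch of scripts/build.sh: a clean build into dist/
set -euo pipefail
rm -rf dist/
python3 -m build   # writes an sdist (.tar.gz) and a wheel (.whl) into dist/
```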

To deploy, I simply rsync the dist/ directory to the server and run pip install --force-reinstall *.whl there, using the pip inside the project's virtualenv. Then I restart the systemd service and everything is fully deployed.
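Sketched as a script (matching the deploy-binaries.sh called by the workflow below; DEPLOY_HOST is a placeholder for your server, and the /deploy/myproject paths are the ones from the unit file):

```shell
#!/usr/bin/env bash
# Sketch of scripts/deploy-binaries.sh; DEPLOY_HOST is a placeholder
set -euo pipefail
HOST="${DEPLOY_HOST:?set DEPLOY_HOST, e.g. ubuntu@myserver}"

# Ship the freshly built artifacts to the server
rsync -avz --delete dist/ "$HOST":/deploy/myproject/dist/

# Install the wheel with the venv's own pip, so the unit picks it up
ssh "$HOST" '/deploy/myproject/venv/bin/pip install --force-reinstall /deploy/myproject/dist/*.whl'
```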

My service file looks like this:

[Unit]
Description=MyProject by Dub
After=network.target
RequiresMountsFor=/deploy/myproject/venv/bin/

[Service]
Type=simple
User=ubuntu
Group=ubuntu
ExecStart=/deploy/myproject/venv/bin/uvicorn myproject.main:app --host 0.0.0.0 --port 8000

[Install]
WantedBy=multi-user.target
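Getting the unit file onto the server in the first place is equally plain. A sketch of what deploy-units.sh (referenced by the workflow below) can do; the unit name myproject.service and DEPLOY_HOST are assumptions:

```shell
#!/usr/bin/env bash
# Sketch of scripts/deploy-units.sh: install/refresh the .service file
set -euo pipefail
HOST="${DEPLOY_HOST:?set DEPLOY_HOST, e.g. ubuntu@myserver}"

rsync -avz services/myproject.service "$HOST":/tmp/
ssh "$HOST" 'sudo mv /tmp/myproject.service /etc/systemd/system/ \
  && sudo systemctl daemon-reload \
  && sudo systemctl enable myproject.service'
```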

I also have a simple GitHub Actions workflow for CI/CD:

name: Deploy To Prod

on:
  push:
    branches: [master]

jobs:
  build-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Set up Python 3.11
        uses: actions/setup-python@v4
        with:
          python-version: '3.11'
          cache: 'pip'
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip build
      - name: Build
        run: ./scripts/build.sh
      - name: Sync unit
        run: ./scripts/deploy-units.sh
      - name: Deploy binaries
        run: ./scripts/deploy-binaries.sh
      - name: Restart unit
        run: ./scripts/restart-units.sh
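The remaining wrapper is just as thin; a sketch of restart-units.sh (unit name assumed, as before):

```shell
#!/usr/bin/env bash
# Sketch of scripts/restart-units.sh
set -euo pipefail
ssh "${DEPLOY_HOST:?set DEPLOY_HOST, e.g. ubuntu@myserver}" \
  'sudo systemctl restart myproject.service && systemctl is-active myproject.service'
```

Note the runner needs SSH access to the server (e.g. a deploy key stored in the repository's secrets); the scripts are assumed to set that up.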

And that's it! No need to install many different tools and workflows. Python, pip, rsync, and systemd are enough to build and deploy a production-ready Python service.