I love decoupling: it makes a project easier to maintain. There are two main ways to do it:
- Git submodules. A good concept, but often confusing. Also, you must commit an update to the parent project for every submodule change.
- Packaging. I think this solution is better because you already use many packages in your project anyway. You can easily package your own project and explain the concept to any junior.
This article is about creating a Python package without pain.
Setuptools
Most Python packages you use contain a setup.py file in their root directory. This file describes the package name, version, requirements (required third-party packages), package contents, and some optional information. Just call setuptools.setup(...) with this info as kwargs. That's enough for distribution: if you have a setup.py, you can already distribute the package, for example by uploading it to pypi.org.
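As a sketch, a minimal setup.py looks roughly like this (the name, version, and dependency below are placeholders):

```python
from setuptools import find_packages, setup

setup(
    name='package_name',        # placeholder name
    version='0.1.0',
    description='An example package',
    packages=find_packages(),   # package content
    install_requires=[
        'Django>=1.11',         # required third-party packages
    ],
)
```

With this file in place, python setup.py sdist builds a distributable archive.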
Pip and virtualenv
Pip is the de facto standard for installing Python packages on your system. Simple and well known.
By default, pip installs packages system-wide for all users and requires root privileges to do so. Don't sudo pip. Use virtualenv to install packages into isolated environments. Besides the security problems, two projects may require incompatible versions of the same shared package.
Also, I recommend pipsi for global entry points like isort. Yes, pipsi uses virtualenv under the hood.
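For example (a sketch; the .venv directory name is just a convention):

```shell
# don't sudo pip: create an isolated environment per project
python3 -m venv .venv        # or: virtualenv .venv
. .venv/bin/activate         # now pip installs only into .venv
pip --version
```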
Editable packages
Sometimes you want to get the current package version directly from another repository. This is very useful for non-distributable projects. Setuptools doesn't support this, but pip does:
pip install -e git+git@bitbucket.org:...git@master#egg=package_name
And you can pin this and any other requirements in requirements.txt:
-e git+git@bitbucket.org:...git@master#egg=package_name
...
deal
Django>=1.11
...
Pip also supports constraints.txt, with the same syntax, for pinning versions of optional dependencies:
djangorestframework>=3.5
To install these dependencies just pass them into pip:
pip install -r requirements.txt -c constraints.txt
requirements.txt is very useful when you don't want to create a setup.py for your internal projects.
Pip-tools
In most commercial projects you have at least two environments:
- Development. Here you can get the latest available package versions, and develop and test the project with them.
- Production. Here you must be able to recreate the exact environment in which you tested the code. These requirements must not be updated until you have tested and adapted your code for the new versions. Other developers also get the same environment as you, which saves their time, because it's already tested.
Pip-tools provides tools for this workflow.
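The usual workflow (the file names here are the pip-tools convention, not a requirement): keep loose, human-edited constraints in requirements.in and let pip-compile pin them.

```
# requirements.in — loose constraints for development
deal
Django>=1.11

# pip-compile requirements.in   # writes a fully pinned requirements.txt
# pip-sync                      # makes the current environment match it exactly
```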
Pipfile and pipenv
Pip's developers decided to improve requirements.txt by grouping dependencies and adding native support for version locking. As a result, they created the Pipfile specification and pipenv, a tool for working with it. Pipenv can lock versions in Pipfile.lock, manage virtual environments, and install and resolve dependencies. Very cool, but for distributable packages you must duplicate the main dependencies in setup.py.
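For illustration, a minimal Pipfile might look like this (package names taken from the examples above):

```toml
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"

[packages]
deal = "*"
Django = ">=1.11"

[dev-packages]
pytest = "*"
```

pipenv lock then writes the resolved, hashed versions into Pipfile.lock.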
Poetry
Poetry is a beautiful alternative to setuptools and pip. You just place all the package info and all the requirements into pyproject.toml. That's all. Beautiful. But poetry has some problems:
- It's not compatible with setuptools. As a result, your users can't install your project without poetry. Everybody has setuptools, but many users don't know about poetry. You can use it for your internal projects, but poetry can't install dependencies from a file or repository without generating pyproject.toml. So if you fork and improve some project, you must make an sdist and bump the version for every change in all projects that depend on it, or manually convert the project's setup.py to pyproject.toml.
- Poetry doesn't create a virtual environment if you're already inside a virtualenv. So poetry doesn't create an environment if you installed poetry via pipsi. Pipenv, as opposed to poetry, always creates a virtual environment for a project and can choose the right Python version.
- Poetry uses version specifiers incompatible with PEP 440. This makes me sad.
For backward compatibility, you can generate setup.py and requirements.txt from pyproject.toml via poetry-setup.
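For comparison, a poetry-style pyproject.toml might look roughly like this (a sketch with placeholder metadata; check the poetry docs for the exact schema):

```toml
[tool.poetry]
name = "package_name"
version = "0.1.0"
description = "An example package"
authors = ["Your Name <you@example.com>"]

[tool.poetry.dependencies]
python = "^3.6"
Django = ">=1.11"

[tool.poetry.dev-dependencies]
pytest = "*"
```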
Flit
If pyproject.toml is cool, why does only poetry use it? In fact, it's not only poetry: flit supports pyproject.toml too. It's a very simple tool with only four commands:
- init – interactively create pyproject.toml.
- build – make sdist or wheel.
- publish – upload the package to PyPI (or another repository).
- install – install a local package into the current environment.
That's all, and it's enough in common cases. Flit uses pip to install packages, and it's listed among the alternatives by PyPA. As with poetry, you need to manage virtual environments with other tools. But flit has one significant disadvantage: it can't lock dependencies.
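A flit-style pyproject.toml is even smaller (a sketch; flit's metadata schema has changed between versions, so treat the keys as illustrative):

```toml
[build-system]
requires = ["flit"]
build-backend = "flit.buildapi"

[tool.flit.metadata]
module = "package_name"
author = "Your Name"
requires = ["Django >=1.11"]
```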
Let’s make the best packaging for your team
All the solutions above have some problems. Let's fix them!
Poetry based packaging
- Always create a virtual environment for each project. I recommend pew or virtualenvwrapper for a better experience.
- Use pyenv or pythonz to manage Python versions. Also, I recommend trying PyPy for some of your small and well-tested projects. It's really fast.
- Sometimes a setuptools-based project needs to depend on your package. Use poetry-setup for compatibility with it.
Pipfile or requirements.txt based packaging
As we remember, with pipenv we need to duplicate all requirements in the old format for setup.py. Let's improve that! I've created the install-requires project, which can help you convert requirements between formats. But which format should you choose?
- requirements.txt. This is the most popular requirements format for projects. Anyone can use it however they want.
- Pipfile.lock. Pipenv locks requirements better than pip-tools does, and you should use it for better security. But if you create a package from a project, you plan to use this package in other projects. So if a project depends on more than one package with locked requirements, pipenv can't resolve these dependencies. For example, one package locks Django==1.9 while another uses Django==1.11. Don't use it for distributable packages: PyPA recommends putting unlocked versions into install_requires.
- Pipfile. This is our choice. A modern format with some problems, but also with many features. And, most importantly, it's simple and comfortable. I recommend it for internal Python packages in your company.
The install-requires repository contains an example of how to convert requirements from Pipfile to a setup.py-compatible format on the fly.
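The idea behind such a conversion can be sketched in a few lines (a simplified illustration, not the actual install-requires code; it only handles plain version strings and inline tables with a version key):

```python
try:
    import tomllib  # stdlib TOML parser, Python 3.11+
except ImportError:
    import toml as tomllib  # third-party fallback for older Pythons

PIPFILE = """
[packages]
deal = "*"
Django = {version = ">=1.11"}

[dev-packages]
pytest = "*"
"""

def install_requires_from_pipfile(text):
    """Turn the [packages] section of a Pipfile into setup.py-style specifiers."""
    requires = []
    for name, spec in tomllib.loads(text).get('packages', {}).items():
        if isinstance(spec, dict):        # inline table, e.g. {version = ">=1.11"}
            spec = spec.get('version', '*')
        requires.append(name if spec == '*' else name + spec)
    return requires

print(install_requires_from_pipfile(PIPFILE))  # ['deal', 'Django>=1.11']
```

The resulting list can be passed straight to install_requires in setup().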
Is setuptools dead?
Many developers (me included) love poetry because it uses a beautiful format for describing project metadata as a setup.py alternative. But setuptools lets you use setup.cfg instead of setup.py, and it's also beautiful. Furthermore, isort and flake8 support setup.cfg too.
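With declarative setuptools, setup.py shrinks to a stub that only calls setuptools.setup(), and everything else moves into setup.cfg, for example (placeholder values):

```ini
[metadata]
name = package_name
version = 0.1.0
description = An example package

[options]
packages = find:
install_requires =
    Django>=1.11

[options.extras_require]
api =
    djangorestframework>=3.5
```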
Setuptools supports requirements from a VCS, file, or archive via the dependency_links parameter, and requirements grouping via extras_require.
So, what’s wrong with setuptools? I think this tool has some problems:
- No native virtualenv or Python version management. Poetry can't do it either. But we have pew, virtualenvwrapper, pyenv, pythonz, and many other useful tools. This is the UNIX way.
- No dependency locking. Poetry, pipenv, and pip (pip freeze) can lock dependencies from their own files. Setuptools can't. This is because setuptools is for packages, not projects.
- setup.cfg is good, but pyproject.toml is better. Setuptools will support pyproject.toml and deprecate setup.py and setup.cfg. Pip will support it too. And that's cool!
Further reading
- For pros and cons, see the issue trackers of the projects from this article.
- How to install another Python version (sometimes you don't need pyenv).
- Installing packages using pip and virtualenv.
- A beautiful example of configuring setuptools via setup.cfg.
- Setuptools documentation.
- install_requires vs requirements files.