pip: install packages from Python subprojects with pyproject.toml


I am trying to find some best practice for the following situation: I have a Python project which contains several Python subprojects. Each subproject contains a pyproject.toml specifying its dependencies.

Is there some way to automatically link the dependencies of these different pyproject.toml files to the parent repository? Or should I manually maintain a parent pyproject.toml into which I would always have to copy the subprojects' up-to-date dependencies (not ideal)?

I guess it would also be possible to write a bash script that visits the subprojects' directories one by one and runs pip install against each pyproject.toml, but, again, that does not seem like the most elegant solution to me.
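For what it's worth, that loop is simple to script in Python itself. Here is a minimal sketch (the layout and function name are my own assumptions: subprojects sit as immediate subdirectories of a common root, each with its own pyproject.toml) that finds each subproject and runs pip install on it; dry_run only collects the commands instead of executing them:

```python
import subprocess
import sys
from pathlib import Path

def install_subprojects(root, editable=True, dry_run=False):
    """Run `pip install` for every immediate subdirectory of `root`
    that contains a pyproject.toml. With dry_run=True, only return
    the commands that would be executed."""
    commands = []
    for toml in sorted(Path(root).glob("*/pyproject.toml")):
        cmd = [sys.executable, "-m", "pip", "install"]
        if editable:
            cmd.append("-e")  # editable install, handy during development
        cmd.append(str(toml.parent))
        commands.append(cmd)
        if not dry_run:
            subprocess.run(cmd, check=True)  # fail fast on a broken subproject
    return commands
```

Calling pip via `sys.executable -m pip` (rather than a bare `pip`) makes sure the packages land in the same environment the script runs in.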

Note that I am asking for a solution that does not use Poetry/Conda. I used to use Poetry for this, but I am curious whether it is manageable without that tool. Unfortunately, all the solutions I managed to google about pyproject.toml in such more complex projects assume the usage of Poetry.

UPD:

In an ideal situation, I would like to create local wheels or, potentially, upload the packages to a private repository. For now I will be using the solution I've described (with pip), but the goal is to make it more amenable to packaging.
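For the wheel-building goal, the same directory walk works with the `build` frontend (`pip install build`), which produces a wheel from each pyproject.toml via its declared build backend. A sketch under the same layout assumption, collecting all wheels into one shared output directory:

```python
import subprocess
import sys
from pathlib import Path

def build_wheels(root, out_dir="dist", dry_run=False):
    """Build a wheel for every subproject under `root` into a shared
    output directory, using the `build` frontend. With dry_run=True,
    only return the commands that would run."""
    commands = []
    for toml in sorted(Path(root).glob("*/pyproject.toml")):
        cmd = [sys.executable, "-m", "build", "--wheel",
               "--outdir", str(out_dir), str(toml.parent)]
        commands.append(cmd)
        if not dry_run:
            subprocess.run(cmd, check=True)  # abort if one subproject fails to build
    return commands
```

The collected wheels can then be pip-installed directly or uploaded to a private index with twine.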


What you are describing could be called a monorepo project or is at least similar enough to apply some lessons and tools.

I've played with this a fair bit in Python and never did find a great solution. I think this is actually an area where Python is just not very strong.

There are a number of small details that can trip you up and many standards will no longer apply. Some examples:

  • CI/CD tools, such as those integrated into GitHub, provide a lot of great capabilities for automatic testing, building, publishing, documenting, etc. They are all built with the expectation that you have one Python project per code repository. You will miss out on a lot of free capability if you leave that ecosystem behind.
  • Automatic versioning systems, such as setuptools-scm, will be difficult (but not impossible) to use. They are heavily biased toward the one-repo, one-project paradigm.
  • Tools such as tox will give you a hard time: they expect external dependencies to have valid, always-incrementing version numbers, and you will (or at least I did) wind up accidentally running a lot of tests against old versions of packages.
  • Managing virtual environments can become pretty messy
  • Dev tools that try to be helpful by checking out dev versions of code might accept a Git URL to a package, but get tripped up if the repo contains multiple projects

I could go on about other hurdles. In the end, I tried to manage everything using the Python package doit. It's a great Make-like tool that can help with some of that (at least the local dev issues).

Use separate repos - In the end, I wish that I had kept everything in separate repositories.

Other build considerations - Regardless of whether you use a single project per code repository, or put them all together, you still have the main challenge of orchestrating everything.

  • A key challenge, in my experience, will be ensuring that you are always running the latest code you think you are running. If you make a mistake with tox or install non-dev versions, you will get surprises.

  • Consider adopting a build tool such as Pants. Pants looked pretty good to me, but I was not able to use it due to external (organizational) restrictions. If you can, though, try a modern tool like that (or Elixir, or doit).

  • Do not try to create your own build system and conventions unless you are interested in many days' worth of learning and frustrating mistakes (and, in the end, being incompatible with community standards).

If you really don't want to use a build system and you prefer to keep it simple, you can get pretty far with this minimal setup:

  1. each project in its own repo
  2. a separate virtual environment for each Python 'sub' project
  3. maintain a requirements/local.txt containing a -e ../other_path/pkg line for each cross-dependency, and use it to maintain dev installs of dependencies across subprojects
  4. keep on top of version numbers, and carefully test your packaging by building release versions and doing test installs in something such as tox
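For step 3, such a requirements/local.txt could look like the following (package names and paths are invented for illustration); running pip install -r requirements/local.txt then puts editable builds of the sibling subprojects into the current environment:

```
# requirements/local.txt -- editable (dev) installs of sibling subprojects
-e ../pkg_core
-e ../pkg_utils
# non-local dependencies can stay in a separate requirements file
```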

Good luck