Over time, I've grown to appreciate "pip-tools." Since it's a dead-simple extension of pip, I wish it could be upstreamed into pip itself; that seems like the most straightforward way of fixing a number of Python's packaging issues.
I think a lot of people will wrinkle their noses at pip-tools because it's a more manual workflow, but I really enjoy that aspect of it. Other package managers I've used are convenient at first, but at some point I end up fighting with them and they won't give ground. With pip-tools, at the end of the day I'm in charge.
Plus I really like that the end result is a requirements.txt that can be used with any plain Python+pip environment.
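For anyone unfamiliar, the core of the pip-tools flow is: you hand-write only your direct dependencies in a requirements.in, and `pip-compile` pins the full transitive tree into requirements.txt. A rough illustration (package versions here are made up, not from the source):

```
# requirements.in -- only your direct, loosely-pinned dependencies
requests

# Running `pip-compile requirements.in` emits a fully pinned
# requirements.txt, with each transitive dependency annotated, e.g.:
#
#   certifi==2024.2.2    # via requests
#   idna==3.6            # via requests
#   requests==2.31.0     # via -r requirements.in
```

The generated requirements.txt is plain pip format, which is why it works in any Python+pip environment with no pip-tools installed.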
Yup. I like to set it up so that each executable is a symlink to a shell script.
The shell script checks whether a virtual environment exists with the packages from requirements.txt installed (it compares against a snapshot of the file, since virtualenv has no database to query cheaply). Once the environment is set up, it dispatches to the tool inside the virtualenv.
That way, when you update a requirements.in file, it recompiles it (if the txt is out of date), installs any new packages, removes packages that no longer belong, and updates any whose versions changed. It also lets you trivially run tools with disparate requirements.in files without conflicts, because each one is siloed behind its own virtualenv.
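A minimal sketch of a wrapper like this (the layout, file names like `.venv` and `requirements.snapshot`, and the plain `pip install -r` step are my assumptions, not the author's actual script; the author's version also removes stale packages, which `pip-sync` from pip-tools would handle):

```shell
#!/bin/sh
# Assumed layout: each tool is a symlink to this script, which lives
# next to the tool's requirements.txt.

needs_rebuild() {
    # The venv is stale if there is no snapshot, or the snapshot differs
    # from requirements.txt; virtualenv has no cheap DB to query, so a
    # copied snapshot of the file stands in for one.
    reqs=$1
    snapshot=$2
    [ ! -f "$snapshot" ] || ! cmp -s "$reqs" "$snapshot"
}

run_tool() {
    # $1 is the invoked symlink path ($0); remaining args go to the tool.
    tool_dir=$(cd "$(dirname "$1")" && pwd)
    tool_name=$(basename "$1")
    shift
    venv="$tool_dir/.venv"
    reqs="$tool_dir/requirements.txt"
    snapshot="$venv/requirements.snapshot"

    if needs_rebuild "$reqs" "$snapshot"; then
        python3 -m venv "$venv"
        # Note: plain `pip install -r` only adds/updates packages;
        # pip-tools' pip-sync would also remove ones no longer listed.
        "$venv/bin/pip" install -r "$reqs"
        cp "$reqs" "$snapshot"
    fi

    # Dispatch to the tool inside the now up-to-date virtualenv.
    exec "$venv/bin/$tool_name" "$@"
}
```

The entry point would just be `run_tool "$0" "$@"`; on a warm run the only cost is a `cmp` of two small files, so invoking the tool stays fast.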
This makes it trivial to use these tools in a shared repo, because there's no worrying about packages or remembering to run some command to get the right environment set up. You just modify your code and run it like a regular command-line tool, and packages get deployed automatically. It's also amenable to offline snapshotting and distribution. In fact, I used this to distribute support tooling for the Pixel Buds factory lines and it worked extremely well.
That's pretty rude and ungenerous, considering that I did this in 2015, before poetry even existed. Also, I could be wrong, but from a brief look it seems like poetry still has the problem that it doesn't automatically update your virtualenv when an executable's dependencies change (i.e. if someone merges in code that uses a new dependency, you still have to know to run some commands to update your own local virtualenv).
I work on multirepos, and since I really dislike git's subtree and subrepo, I use https://pypi.org/project/zc.buildout/. Yes, I know I can do [1], but editing across multiple repos at the same time is something I can only do with zc.buildout [2]. Still not perfect, but it does the job.
Ah, I haven't used buildout in years (I remember using it a lot when working with Plone).
I used it for a personal project but gave up a few years ago, as some piece of the puzzle seemed broken and abandoned (something hadn't been updated to support a newer version of TLS, or something like that).
I liked buildout though - it was a good system with its bin directory.