Hi, our project uses `pip-compile-multi` for Python dependency management. Basically, we have a `requirements.in`, which is used to generate a locked `requirements.txt` file. There are several related issues in the current pipeline. I'm looking for advice on how to solve them better.
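For context, a simplified sketch of the layout (`some-package` is just a placeholder; the real file has more entries):

```
# requirements.in -- hand-edited top-level dependencies
av==9.2.0
some-package

# the locked requirements.txt (with pins and hashes) is then regenerated by running:
#   pip-compile-multi
```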
- If we need to change a dependency, e.g. add a new one, running `pip-compile-multi` (and so the underlying `pip-compile`) leads to rebuilding all the dependencies in all the files. Would it be possible to just add a package without rebuilding everything (e.g. check the new package's deps against the existing `requirements.txt`, update what's necessary, update the file hash), or at least avoid rebuilding some packages (which may not be available on some platforms because of missing build deps, binaries, libs, etc.)? I tried using `--upgrade-package "<new package>"` and `--no-upgrade` (see the `--upgrade-package` sketch below), but it didn't seem to help.
- One of the dependencies in our project, namely `av`, removed binary wheels from PyPI at some point, so we're building the package ourselves. The package build depends on Cython being < 3.0.0, so listing it in the `requirements.in` doesn't help (see the `Cython<3.0.0` sketch below).
- There's a possible solution with the `pip-compile` command: supplying `--pip-args "--no-build-isolation 'av==9.2.0'"` seems to help with building the `av` package, provided the build environment is set up before running the command (see the `--no-build-isolation` sketch below). I could find a similar option in `pip-compile-multi`, but it seems to be global for all the packages.
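For the first point, this is roughly what I tried (assuming the flags are passed to `pip-compile-multi` directly; `<new package>` stands in for the actual package). Neither avoided the full re-resolve for me:

```
# try to add/refresh only the new package without touching the rest
pip-compile-multi --upgrade-package "<new package>"

# try to regenerate the lock files while keeping existing pins as-is
pip-compile-multi --no-upgrade
```

For the `av`/Cython point, a simplified sketch of why pinning in the inputs doesn't help: the pin lands in the locked output, but the isolated environment pip creates to build `av` installs its own Cython and never sees it:

```
# requirements.in (simplified)
av==9.2.0
Cython<3.0.0   # pinned in requirements.txt, but not visible to the
               # isolated build environment pip sets up for av
```

And the `pip-compile` workaround from the last point, spelled out as a sketch (it assumes the build dependencies are installed into the current environment beforehand):

```
# prepare the build environment by hand
pip install "Cython<3.0.0"

# then compile with build isolation disabled for av, as described above
pip-compile --pip-args "--no-build-isolation 'av==9.2.0'" requirements.in
```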