Updating a specific dependency #515

@zhiltsov-max

Description

Hi, our project uses pip-compile-multi for Python dependency management. Basically, we have a requirements.in, which is used to generate a locked requirements.txt file. There are several related issues in the current pipeline. I'm looking for advice on how to solve them better.

  1. If we need to change a dependency, e.g. add a new one, running pip-compile-multi (and the underlying pip-compile) rebuilds all the dependencies in all the files. Would it be possible to just add a package without rebuilding everything (e.g. check the new package's deps against the existing requirements.txt, update only what's necessary, and update the file hash), or at least avoid rebuilding some packages (which may not be buildable on some platforms because of missing build deps, binaries, libs, etc.)? I tried using --upgrade-package "<new package>" and --no-upgrade, but it didn't seem to help.
  2. One of the dependencies in our project, namely av, removed binary wheels from PyPI at some point, so we're building the package ourselves. The build requires Cython < 3.0.0, which is a build-time dependency, so listing it in requirements.in doesn't help.
  3. There's a possible solution with the pip-compile command: supplying --pip-args "--no-build-isolation 'av==9.2.0'" seems to help with building the av package, provided the build environment is set up before running the command. I could find a similar option in pip-compile-multi, but it seems to be global, applying to all packages rather than just one.
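For reference, a rough sketch of the invocations described above (file names are assumptions; whether these flags can avoid the full rebuild is exactly the open question of this issue):

```shell
# Item 1: attempted targeted update; in practice this still rebuilt everything.
pip-compile-multi --no-upgrade --upgrade-package "<new package>"

# Item 3: workaround with plain pip-compile — pre-install the build dependency,
# then disable pip's build isolation so the av build can see it.
pip install "cython<3.0.0"
pip-compile --pip-args "--no-build-isolation" requirements.in
```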
