With commit 8393ceb.
This issue is a follow-up on #1327 (comment). We've tried to update the `fast_neural_style` example, which required bumping the pytorch version, and we spotted a few issues. The findings come down to the fact that the CI scripts run a bulk install of the dependencies for all examples at once. See `examples/utils.sh` (lines 26 to 31 in 8393ceb):
```shell
cat $BASE_DIR/*/requirements.txt | \
  sort -u | \
  # testing the installed version of torch, so don't pip install it.
  grep -vE '^torch$' | \
  pip install -r /dev/stdin || \
  { error "failed to install dependencies"; exit 1; }
```
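To see what actually reaches pip, the merge step can be reproduced in isolation. The directory names and version pins below are hypothetical, but they mirror the dcgan case: a single example pinning `torchvision` leaks its pin into the shared install list.

```shell
#!/bin/sh
# Sketch: reproduce only the merge part of the CI pipeline (no pip install).
# Example names and version pins are made up for illustration.
BASE_DIR=$(mktemp -d)
mkdir -p "$BASE_DIR/dcgan" "$BASE_DIR/fast_neural_style"
printf 'torch\ntorchvision==0.20.0\n' > "$BASE_DIR/dcgan/requirements.txt"
printf 'torch\ntorchvision\n' > "$BASE_DIR/fast_neural_style/requirements.txt"

# Same merge as utils.sh: concatenate, dedupe, drop the bare torch line.
merged=$(cat "$BASE_DIR"/*/requirements.txt | sort -u | grep -vE '^torch$')
echo "$merged"
```

The merged list still contains `torchvision==0.20.0`, so the subsequent `pip install -r /dev/stdin` resolves torch down to a release compatible with that pin.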
This causes a downgrade of the nightly torch installed by the CI:
- torch nightly is installed in `examples/.github/workflows/main_python.yml` (line 28 in 8393ceb):

```shell
pip install --pre torch torchvision -f https://download.pytorch.org/whl/nightly/cpu/torch_nightly.html
```
- and then gets downgraded (to torch 2.5) because some examples pin the `torchvision` version, for example in `examples/dcgan/requirements.txt` (line 2 in 8393ceb).
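A cheap guard after the bulk install would have surfaced this: nightly version strings carry a `.dev` segment (e.g. `2.6.0.dev20241101`), while stable releases do not. The helper below is a sketch of such a check, not existing CI code; in CI its input would come from `python -c "import torch; print(torch.__version__)"`.

```shell
#!/bin/sh
# Sketch: detect whether a torch version string is a nightly build.
is_nightly() {
    case "$1" in
        *.dev*) return 0 ;;  # nightly builds carry a .dev segment
        *)      return 1 ;;  # plain releases such as 2.5.0 do not
    esac
}

is_nightly "2.6.0.dev20241101" && echo "nightly kept"
is_nightly "2.5.0" || echo "downgrade detected"
```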
I suggest considering the following improvements for the pytorch examples:
- Each example should have its own `requirements.txt` (some miss it, as `fast_neural_style` does); see Respect each example requirements and use uv #1330
- CI should install dependencies per example from that example's `requirements.txt`; see Respect each example requirements and use uv #1330
- Examples should not pin versions of core dependencies (`torch`, `torchvision`, etc.) unless as a workaround for specific issues (which must be explicitly noted) or due to example deprecation with a planned future drop

Alternatively, we can consider the following (UPDATE: we've dismissed this after discussion):
- All examples must comply with a single dependency list maintained at the top level of the pytorch examples repo
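For the per-example direction (the one Respect each example requirements and use uv #1330 pursues), the CI install loop could look roughly like the sketch below. The `uv pip install` invocation and the directory layout are assumptions; the command is only printed here so the sketch stays side-effect free.

```shell
#!/bin/sh
# Sketch: one install per example instead of a single merged install,
# so a pin in one example cannot downgrade packages for the others.
# Example names and pins are hypothetical.
BASE_DIR=$(mktemp -d)
mkdir -p "$BASE_DIR/dcgan" "$BASE_DIR/vae"
echo 'torchvision==0.20.0' > "$BASE_DIR/dcgan/requirements.txt"
echo 'matplotlib' > "$BASE_DIR/vae/requirements.txt"

for req in "$BASE_DIR"/*/requirements.txt; do
    # Real CI would run: uv pip install -r "$req" (still excluding torch)
    echo "would run: uv pip install -r $req"
done
```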
CC: @malfet, @atalman, @msaroufim