1 parent 8aa8b9f · commit a9366d6
cu128-megapak-pt28/Dockerfile
@@ -236,7 +236,7 @@ RUN --mount=type=cache,target=/root/.cache/pip \
     pip install -U uv \
     # Nunchaku (binary pair with PyTorch)
     && pip install \
-        https://github.com/nunchaku-tech/nunchaku/releases/download/v1.1.0/nunchaku-1.1.0+torch2.8-cp312-cp312-linux_x86_64.whl \
+        https://github.com/nunchaku-ai/nunchaku/releases/download/v1.2.0/nunchaku-1.2.0+torch2.8-cp312-cp312-linux_x86_64.whl \
     # FlashAttention (version pair with xFormers, binary pair with PyTorch & CUDA)
         https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.8.2+cu128torch2.8-cp312-cp312-linux_x86_64.whl \
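The "binary pair" comments refer to the local version segment of each wheel filename (the part after `+`), which these prebuilt wheels use to encode the PyTorch and CUDA builds they were compiled against. A minimal sketch of parsing that segment out of the two filenames pinned above (the regex and helper name here are illustrative, not part of the Dockerfile or any packaging library):

```python
import re

# Wheel filename convention (PEP 427): {dist}-{version}[+{local}]-{python}-{abi}-{platform}.whl
# The "+local" segment is where these prebuilt wheels encode their PyTorch/CUDA pairing.
WHEEL_RE = re.compile(
    r"(?P<dist>[^-]+)-(?P<version>[^+-]+)"
    r"(?:\+(?P<local>[^-]+))?"
    r"-(?P<python>[^-]+)-(?P<abi>[^-]+)-(?P<platform>.+)\.whl$"
)

def wheel_pairing(filename: str) -> dict:
    """Return the tags encoded in a wheel filename; the local tag names the binary pair."""
    match = WHEEL_RE.match(filename)
    if match is None:
        raise ValueError(f"not a wheel filename: {filename}")
    return match.groupdict()

# The two wheels pinned in the Dockerfile hunk above:
nunchaku = wheel_pairing("nunchaku-1.2.0+torch2.8-cp312-cp312-linux_x86_64.whl")
flash = wheel_pairing("flash_attn-2.8.2+cu128torch2.8-cp312-cp312-linux_x86_64.whl")

print(nunchaku["local"])   # torch2.8 -> must match the image's PyTorch 2.8
print(flash["local"])      # cu128torch2.8 -> pinned to both CUDA 12.8 and PyTorch 2.8
print(nunchaku["python"])  # cp312 -> CPython 3.12, matching the image's interpreter
```

This is why bumping one of these URLs (as this commit does for Nunchaku v1.1.0 → v1.2.0) is safe only while the `torch2.8`/`cu128`/`cp312` tags still match the rest of the image.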