2 files changed: +4 −7 lines

```diff
@@ -296,17 +296,12 @@ RUN --mount=type=cache,target=/root/.cache/pip \
 RUN --mount=type=cache,target=/root/.cache/pip \
     # Update UV
     pip install -U uv \
+    # Compile-install FlashAttention (version paired with xFormers)
+    && pip install flash-attn==2.8.3 \
     # Nunchaku version needs to sync with PyTorch version
     && pip install \
         https://github.com/nunchaku-tech/nunchaku/releases/download/v1.0.2/nunchaku-1.0.2+torch2.9-cp312-cp312-linux_x86_64.whl

-    # Notes on FlashAttention:
-    # <xformers 0.0.32.post2> requires <flash-attn [2.7.1, 2.8.2]>,
-    # but flash-attn 2.8.2 does not have binary wheel for PyTorch 2.8.
-    # Now using xformers embbeded flash-attn.
-    # May need to suppress some custom nodes that explicit depends on flash-attn.
-    # Wait for xFormers to update.
-
 # ###############################################################################
 # Bundle ComfyUI in the image
```
```diff
@@ -296,6 +296,8 @@ RUN --mount=type=cache,target=/root/.cache/pip \
 RUN --mount=type=cache,target=/root/.cache/pip \
     # Update UV
     pip install -U uv \
+    # Compile-install FlashAttention (version paired with xFormers)
+    && pip install flash-attn==2.8.3 \
     # Nunchaku version needs to sync with PyTorch version
     && pip install \
         https://github.com/nunchaku-tech/nunchaku/releases/download/v1.0.2/nunchaku-1.0.2+torch2.9-cp312-cp312-linux_x86_64.whl
```
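The diff's comment notes that the Nunchaku wheel must be kept in sync with the image's PyTorch version, which is encoded in the wheel's local version tag (`+torch2.9` in the filename). As a rough illustration only (a hypothetical helper, not part of this PR), that pairing could be checked by parsing the tag out of the wheel filename and comparing it against the installed PyTorch series:

```python
import re

# Hypothetical helper: check that a wheel's "+torchX.Y" local version tag
# matches the PyTorch major.minor series targeted by the image.
def wheel_matches_torch(wheel_name: str, torch_version: str) -> bool:
    m = re.search(r"\+torch(\d+\.\d+)", wheel_name)
    if m is None:
        return False
    # Compare only the major.minor series, e.g. "2.9" against "2.9.1".
    torch_series = ".".join(torch_version.split(".")[:2])
    return torch_series == m.group(1)

wheel = "nunchaku-1.0.2+torch2.9-cp312-cp312-linux_x86_64.whl"
print(wheel_matches_torch(wheel, "2.9.0"))  # True: series matches the tag
print(wheel_matches_torch(wheel, "2.8.0"))  # False: image would need a torch2.8 wheel
```

A check like this could run in a build step before `pip install`, failing fast when the base image's PyTorch is bumped without updating the pinned wheel URL.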