Commit 3d99550 (#144, parent 0d56ef4)

{megapaks} Add FlashAttention (again)

File tree

2 files changed (+4, -7 lines)

cu126-megapak/Dockerfile (2 additions, 7 deletions)

@@ -296,17 +296,12 @@ RUN --mount=type=cache,target=/root/.cache/pip \
 RUN --mount=type=cache,target=/root/.cache/pip \
     # Update UV
     pip install -U uv \
+    # Compile-install FlashAttention (version paired with xFormers)
+    && pip install flash-attn==2.8.3 \
     # Nunchaku version needs to sync with PyTorch version
     && pip install \
         https://github.com/nunchaku-tech/nunchaku/releases/download/v1.0.2/nunchaku-1.0.2+torch2.9-cp312-cp312-linux_x86_64.whl

-# Notes on FlashAttention:
-# <xformers 0.0.32.post2> requires <flash-attn [2.7.1, 2.8.2]>,
-# but flash-attn 2.8.2 does not have binary wheel for PyTorch 2.8.
-# Now using xformers embbeded flash-attn.
-# May need to suppress some custom nodes that explicit depends on flash-attn.
-# Wait for xFormers to update.
-
 ################################################################################
 # Bundle ComfyUI in the image
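The Nunchaku wheel is hard-coded with matching PyTorch and CPython tags, which is what the "needs to sync with PyTorch version" comment refers to. A small sketch (a hypothetical helper, not part of the commit) that rebuilds the hard-coded URL makes explicit which parts must stay in sync with the base image:

```python
# Hypothetical helper (not in the commit): reconstructs the hard-coded
# Nunchaku wheel URL so the version-sync constraint is explicit.
def nunchaku_wheel_url(nunchaku: str, torch: str, py_tag: str) -> str:
    base = "https://github.com/nunchaku-tech/nunchaku/releases/download"
    return (
        f"{base}/v{nunchaku}/nunchaku-{nunchaku}+torch{torch}"
        f"-{py_tag}-{py_tag}-linux_x86_64.whl"
    )

# The combination pinned by this commit: Nunchaku 1.0.2, PyTorch 2.9, CPython 3.12.
print(nunchaku_wheel_url("1.0.2", "2.9", "cp312"))
```

If the image moves to a different PyTorch or Python, the `torch2.9` and `cp312` segments of the wheel name must change together, or pip will reject the wheel as incompatible.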

cu128-megapak/Dockerfile (2 additions, 0 deletions)

@@ -296,6 +296,8 @@ RUN --mount=type=cache,target=/root/.cache/pip \
 RUN --mount=type=cache,target=/root/.cache/pip \
     # Update UV
     pip install -U uv \
+    # Compile-install FlashAttention (version paired with xFormers)
+    && pip install flash-attn==2.8.3 \
     # Nunchaku version needs to sync with PyTorch version
     && pip install \
         https://github.com/nunchaku-tech/nunchaku/releases/download/v1.0.2/nunchaku-1.0.2+torch2.9-cp312-cp312-linux_x86_64.whl
