[Bugfix] Fix Wan2.2 cross-attention with Ulysses Sequence Parallelism (USP) #1233
Open
lishunyang12 wants to merge 1 commit into vllm-project:main from
Conversation
Signed-off-by: lishunyang <lishunyang12@163.com>
lishunyang12 (Contributor, Author): @gcanlin PTAL

Collaborator: @wtomin PTAL

Contributor: It doesn't work. The same error.

Contributor: For SP failure in ti2v, I fixed it in #1221.

lishunyang12 (Contributor, Author): Please help close this PR, as the target issue has been solved.
Summary
- Fixes Wan2.2 inference with `ulysses_degree >= 2` by correctly handling cross-attention under Ulysses Sequence Parallelism.
- Adds a `skip_parallel` parameter to `Attention.forward()` to allow callers to bypass built-in parallel communication when managing it externally.

Fixes #1219
Root Cause
When Ulysses SP is enabled, every `Attention` object applies AllToAll communication to all of Q, K, and V. This is correct for self-attention (where Q, K, and V all come from the SP-split `hidden_states`), but incorrect for cross-attention:

- Q comes from `hidden_states`, which IS split across SP ranks, so AllToAll correctly reconstructs the full sequence.
- K/V come from `encoder_hidden_states`, which is replicated (NOT split) across SP ranks, so AllToAll incorrectly duplicates the encoder context P times (`[B, T, H, D]` → `[B, T*P, H/P, D]`), as the sketch below illustrates.

This produces incorrect attention results and causes failures.
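The Ulysses AllToAll turns a sequence-sharded tensor into a head-sharded one; feeding it a tensor that is identical on every rank concatenates P copies of the same tokens. A minimal single-process sketch (simulated ranks and assumed toy shapes, not the repo's actual collective) makes the shape mismatch concrete:

```python
import torch

B, S, T, H, D, P = 1, 8, 4, 4, 2, 2  # batch, seq, encoder seq, heads, head dim, SP degree

def ulysses_all2all_sim(per_rank, P):
    """Simulate the seq-shard -> head-shard AllToAll on one process.

    per_rank[r] has shape [B, s_local, H, D]; after the exchange, rank r
    holds head slice r of every rank's shard, concatenated along seq.
    """
    hp = per_rank[0].shape[2] // P
    return [
        torch.cat([shard[:, :, r * hp:(r + 1) * hp] for shard in per_rank], dim=1)
        for r in range(P)
    ]

# Self-attention Q: genuinely split along the sequence -> correct result.
q_full = torch.randn(B, S, H, D)
q_out = ulysses_all2all_sim(list(q_full.chunk(P, dim=1)), P)  # shards: [B, S/P, H, D]
assert q_out[0].shape == (B, S, H // P, D)                    # full seq, H/P heads
assert torch.equal(q_out[0], q_full[:, :, : H // P])

# Cross-attention K: replicated on every rank -> context duplicated P times.
k_rep = torch.randn(B, T, H, D)
k_out = ulysses_all2all_sim([k_rep.clone() for _ in range(P)], P)
assert k_out[0].shape == (B, T * P, H // P, D)                # T*P tokens, not T!
assert torch.equal(k_out[0][:, :T], k_out[0][:, T:])          # same context twice
```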
Fix
In `WanCrossAttention.forward()`, when Ulysses SP is active:

- Apply AllToAll to Q only: `[B, S/P, H, D]` → `[B, S, H/P, D]` (correct sequence reconstruction).
- Head-slice K/V locally instead: `[B, T, H, D]` → `[B, T, H/P, D]` (matches Q's head count without duplicating the context).
- Call the inner attention with `skip_parallel=True`.
- Apply the reverse AllToAll to the output: `[B, S, H/P, D]` → `[B, S/P, H, D]`.

This follows the same pattern used by other models (Qwen Image, LongCat), where the replicated encoder context is head-sliced rather than AllToAll'ed. A sketch of this pattern follows.
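Below is a hedged sketch of the fixed forward pattern, not the repo's actual code: the helper names (`seq_to_head_all2all`, `head_to_seq_all2all`) and the function signature are assumptions standing in for the SP group's real collectives, and the plain SDPA call models what `skip_parallel=True` is described as doing (no built-in communication).

```python
import torch
import torch.nn.functional as F

def wan_cross_attention_forward(
    q_local,              # [B, S/P, H, D] queries from the SP-split hidden_states
    k_rep, v_rep,         # [B, T, H, D]   K/V from replicated encoder_hidden_states
    rank, world,          # this rank's index and the Ulysses degree P
    seq_to_head_all2all,  # assumed collective: [B, S/P, H, D] -> [B, S, H/P, D]
    head_to_seq_all2all,  # assumed collective: [B, S, H/P, D] -> [B, S/P, H, D]
):
    hp = q_local.shape[2] // world
    # 1) AllToAll on Q only: gather the full sequence, scatter the heads.
    q = seq_to_head_all2all(q_local)                    # [B, S, H/P, D]
    # 2) Head-slice the replicated encoder context locally; no AllToAll,
    #    so the T encoder tokens are never duplicated.
    k = k_rep[:, :, rank * hp:(rank + 1) * hp]          # [B, T, H/P, D]
    v = v_rep[:, :, rank * hp:(rank + 1) * hp]
    # 3) Inner attention with skip_parallel=True semantics: plain SDPA,
    #    no built-in parallel communication.
    out = F.scaled_dot_product_attention(
        q.transpose(1, 2), k.transpose(1, 2), v.transpose(1, 2)
    ).transpose(1, 2)                                   # [B, S, H/P, D]
    # 4) Reverse AllToAll: scatter the sequence back, gather the heads.
    return head_to_seq_all2all(out)                     # [B, S/P, H, D]

# Degree-1 smoke check: with identity "collectives" the sketch reduces to
# ordinary cross-attention.
q = torch.randn(1, 8, 4, 16)
k = torch.randn(1, 4, 4, 16)
v = torch.randn(1, 4, 4, 16)
out = wan_cross_attention_forward(q, k, v, 0, 1, lambda t: t, lambda t: t)
assert out.shape == q.shape
```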
Test plan
- Run Wan2.2 TI2V with `ulysses_degree=2` (the failing case from [Bug]: Wan2.2 TI2V USP=2 failed #1219); see the parity sketch below.
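As a sanity check on the math (not the repo's test harness), the following self-contained snippet simulates `ulysses_degree=2` in a single process with toy shapes and asserts that the head-sliced cross-attention path reproduces the `ulysses_degree=1` baseline:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
B, S, T, H, D, P = 1, 8, 4, 4, 16, 2
Hp = H // P

q = torch.randn(B, S, H, D)   # full hidden_states queries
k = torch.randn(B, T, H, D)   # replicated encoder keys
v = torch.randn(B, T, H, D)   # replicated encoder values

def sdpa(q, k, v):            # attention in [B, L, h, D] layout
    return F.scaled_dot_product_attention(
        q.transpose(1, 2), k.transpose(1, 2), v.transpose(1, 2)
    ).transpose(1, 2)

# Baseline: ulysses_degree=1 cross-attention.
ref = sdpa(q, k, v)

# Simulated ulysses_degree=2: each rank owns a seq shard of Q and a head
# slice of the replicated K/V, computes attention, then the reverse
# AllToAll re-shards by sequence and restores all heads (here, the
# gathered view is a concat along the head dim).
q_shards = list(q.chunk(P, dim=1))                     # each [B, S/P, H, D]
head_outs = []
for r in range(P):
    # AllToAll on Q: full seq, head slice r.
    q_r = torch.cat([s[:, :, r * Hp:(r + 1) * Hp] for s in q_shards], dim=1)
    k_r = k[:, :, r * Hp:(r + 1) * Hp]                 # head-slice, no AllToAll
    v_r = v[:, :, r * Hp:(r + 1) * Hp]
    head_outs.append(sdpa(q_r, k_r, v_r))              # [B, S, H/P, D]
out = torch.cat(head_outs, dim=2)
torch.testing.assert_close(out, ref)
print("USP=2 cross-attention matches the ulysses_degree=1 baseline")
```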