Describe the bug
Version compatibility problem:
[rank1]: File "/opt/conda/envs/megatron/lib/python3.10/site-packages/swift/trainers/sequence_parallel/ulysses.py", line 185, in init_sequence_parallel
[rank1]: from transformers.modeling_flash_attention_utils import is_flash_attn_available
[rank1]: ImportError: cannot import name 'is_flash_attn_available' from 'transformers.modeling_flash_attention_utils'
Your hardware and system info
H800. The error above is raised when sequence parallelism is used with flash_attention enabled.
transformers version: 4.50.0
flash_attn: 2.7.4.post1
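The traceback points to a helper that was renamed or added between transformers releases, so `swift` imports a name that transformers 4.50.0 does not export. Until the versions are aligned, one generic workaround is to resolve whichever helper name the installed release provides. This is a minimal sketch of that pattern; the transformers attribute names mentioned in the docstring are assumptions about specific releases, and the demo at the bottom uses a stdlib module so the sketch runs anywhere:

```python
import importlib


def resolve_attr(module_name: str, *candidates: str):
    """Return the first attribute found among `candidates` on `module_name`.

    Intended to smooth over helpers renamed between library releases,
    e.g. (assumed) `is_flash_attn_available` in newer transformers vs.
    `is_flash_attn_2_available` in `transformers.utils` on 4.50.x.
    """
    mod = importlib.import_module(module_name)
    for name in candidates:
        attr = getattr(mod, name, None)
        if attr is not None:
            return attr
    raise ImportError(f"none of {candidates} found in {module_name}")


# Demonstrated on a stdlib module so it is self-contained:
# 'no_such_fn' is missing, so the lookup falls through to 'gcd'.
gcd = resolve_attr("math", "no_such_fn", "gcd")
print(gcd(12, 18))  # -> 6
```

The real fix is pinning compatible versions (upgrading transformers, or downgrading swift to a release that targets 4.50.0); the guard above only keeps an import from failing hard while versions are mismatched.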