Description
Are you planning to wrap the `_flash_attn_forward` and `_flash_attn_backward` functions in an interface similar to the one in https://github.com/Dao-AILab/flash-attention/blob/main/flash_attn/flash_attn_triton.py?
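
For context, the linked file exposes its Triton kernels through a `torch.autograd.Function` subclass whose `apply` is re-exported as a plain function. A minimal sketch of that pattern applied to the two functions above might look like the following; the exact signatures and return values of `_flash_attn_forward` and `_flash_attn_backward` are assumptions here for illustration and differ across versions of the library.

```python
import torch
from flash_attn.flash_attn_interface import _flash_attn_forward, _flash_attn_backward


class FlashAttnFunc(torch.autograd.Function):
    """Sketch of an autograd wrapper in the style of flash_attn_triton.py.

    The kernel signatures below are assumptions, not the library's
    actual API; check flash_attn_interface.py for your installed version.
    """

    @staticmethod
    def forward(ctx, q, k, v, causal=False, softmax_scale=None):
        # Assumed: the forward kernel returns the attention output and the
        # log-sum-exp statistics that the backward pass needs to recompute
        # the softmax without materializing the attention matrix.
        out, lse = _flash_attn_forward(
            q, k, v, causal=causal, softmax_scale=softmax_scale
        )
        ctx.save_for_backward(q, k, v, out, lse)
        ctx.causal = causal
        ctx.softmax_scale = softmax_scale
        return out

    @staticmethod
    def backward(ctx, dout):
        q, k, v, out, lse = ctx.saved_tensors
        dq, dk, dv = (
            torch.empty_like(q),
            torch.empty_like(k),
            torch.empty_like(v),
        )
        # Assumed: the backward kernel writes gradients into dq/dk/dv in place.
        _flash_attn_backward(
            dout, q, k, v, out, lse, dq, dk, dv,
            causal=ctx.causal, softmax_scale=ctx.softmax_scale,
        )
        # One gradient per forward argument; None for the non-tensor args.
        return dq, dk, dv, None, None


flash_attn_func = FlashAttnFunc.apply
```

Exposing `FlashAttnFunc.apply` as a plain function is the convention flash_attn_triton.py follows, so the kernels compose with autograd without callers having to manage the `ctx` bookkeeping themselves.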