
Fix a bug in flash attention where kv_seq_len should divide block_k_m… #10997
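The PR title points at a divisibility constraint in blocked (flash) attention: the KV sequence length must be a multiple of the KV block size, otherwise the last tile of the kernel would read past the end of the K/V tensors. A minimal sketch of such a guard, assuming the kernel takes a KV block size parameter (called `block_k` here, a hypothetical name; the actual parameter in the PR is truncated in the title):

```python
def check_kv_block_divisibility(kv_seq_len: int, block_k: int) -> None:
    # Blocked attention tiles the KV dimension into chunks of size
    # block_k; an uneven final chunk would index out of bounds, so
    # kernels typically require exact divisibility (hypothetical guard,
    # not the actual PyTorch/XLA code).
    if kv_seq_len % block_k != 0:
        raise ValueError(
            f"kv_seq_len ({kv_seq_len}) must be divisible by the "
            f"KV block size ({block_k})"
        )


check_kv_block_divisibility(1024, 128)  # 1024 = 8 * 128, passes
```

A fix like the one described in the title would typically either enforce this check up front or pad/mask the KV tensors so the constraint holds.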


Build PyTorch/XLA / build: succeeded Feb 10, 2025 in 34m 20s