Fix a bug in flash attention where kv_seq_len should divide block_k_major. #10981

Build PyTorch/XLA / build: succeeded Feb 7, 2025 in 1h 10m 44s
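The PR title concerns a divisibility constraint between `kv_seq_len` and `block_k_major` in a block-wise flash-attention kernel. The patch itself is not shown on this page; as a hedged illustration only (the helper name and fallback strategy below are hypothetical, not from the PR), a guard that shrinks the major KV block size until it evenly tiles the sequence length might look like:

```python
def choose_block_k_major(kv_seq_len: int, block_k_major: int) -> int:
    """Hypothetical helper: pick a block size that evenly tiles kv_seq_len.

    Block-wise attention kernels typically require the KV sequence length
    to be an exact multiple of the block size; otherwise the last block is
    ragged and the kernel reads out of bounds or mis-masks.
    """
    # Never use a block larger than the sequence itself.
    block_k_major = min(block_k_major, kv_seq_len)
    # Fall back to the largest divisor of kv_seq_len not exceeding the
    # requested block size, so the tiling is always exact.
    while kv_seq_len % block_k_major != 0:
        block_k_major -= 1
    return block_k_major
```

With an exact tiling the requested size is kept unchanged (e.g. 512 and 128 stay at 128); with a ragged tiling the size is reduced to the nearest divisor. The real fix may instead raise an error or pad the sequence; this sketch only illustrates the invariant being enforced.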