
Fix a bug in flash attention where kv_seq_len should divide block_k_m… #10997
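The title describes a shape constraint in a block-tiled flash attention kernel: the kernel walks the key/value sequence in fixed-size chunks, so it can only cover the sequence exactly when `kv_seq_len` is a multiple of the block size. A minimal sketch of that validation, with hypothetical names (`block_k_major` is inferred from the truncated title; the helper name is illustrative, not the PR's actual code):

```python
def validate_kv_blocks(kv_seq_len: int, block_k_major: int) -> int:
    """Hypothetical helper illustrating the constraint in the PR title.

    A tiled kernel iterates over kv_seq_len in chunks of block_k_major,
    so the sequence length must be an exact multiple of the block size;
    otherwise the last block would read past the end of the sequence.
    Returns the number of key/value blocks when the shapes are valid.
    """
    if kv_seq_len % block_k_major != 0:
        raise ValueError(
            f"kv_seq_len={kv_seq_len} must be divisible by "
            f"block_k_major={block_k_major}"
        )
    return kv_seq_len // block_k_major
```

For example, a 1024-token key/value sequence with 128-wide blocks yields 8 blocks, while a 1000-token sequence is rejected up front instead of producing an out-of-bounds read inside the kernel.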

CPU tests / test (python_tests, torch_mp_op): succeeded Feb 10, 2025 in 13m 34s