[Fmha] Add head_dim=512 support for trtllm attention kernels #2065
This CI job was skipped.