[Fmha] Add head_dim=512 support for trtllm attention kernels #2757
Triggered via pull request: April 21, 2026, 17:29
Status: Success
Total duration: 8s
Artifacts: –

claude-code-review.yml

on: pull_request
Job: claude-review (3s)