[CUDA] Support FP8 (E4M3) KV Cache for Group Query Attention #45221
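Below is a minimal, framework-agnostic sketch of the idea named in the PR title: storing the key/value cache in FP8 (E4M3) with a per-tensor scale and dequantizing to half precision before the attention matmuls, with query heads grouped over a smaller number of KV heads. It assumes PyTorch >= 2.1 for the `float8_e4m3fn` dtype; the helper names (`quantize_kv`, `gqa_with_fp8_kv`) and the per-tensor scaling scheme are illustrative assumptions, not the PR's actual implementation or API.

```python
import torch
import torch.nn.functional as F

def quantize_kv(x: torch.Tensor):
    """Per-tensor symmetric quantization to FP8 E4M3 (max normal value ~448)."""
    scale = x.abs().amax().clamp(min=1e-12) / 448.0
    return (x / scale).to(torch.float8_e4m3fn), scale

def dequantize_kv(x_fp8: torch.Tensor, scale: torch.Tensor, dtype=torch.float16):
    """Cast FP8 values back to half precision and undo the scale."""
    return x_fp8.to(dtype) * scale

def gqa_with_fp8_kv(q, k_fp8, k_scale, v_fp8, v_scale):
    """Group query attention over an FP8 (E4M3) KV cache.

    q:            (batch, num_q_heads, seq, head_dim), fp16
    k_fp8, v_fp8: (batch, num_kv_heads, seq, head_dim), float8_e4m3fn
    """
    k = dequantize_kv(k_fp8, k_scale)
    v = dequantize_kv(v_fp8, v_scale)
    # Each group of query heads shares one KV head, so expand KV heads.
    groups = q.shape[1] // k.shape[1]
    k = k.repeat_interleave(groups, dim=1)
    v = v.repeat_interleave(groups, dim=1)
    return F.scaled_dot_product_attention(q, k, v)

if __name__ == "__main__":
    b, hq, hkv, s, d = 1, 8, 2, 16, 64
    q = torch.randn(b, hq, s, d, dtype=torch.float16)
    k_fp8, k_scale = quantize_kv(torch.randn(b, hkv, s, d, dtype=torch.float16))
    v_fp8, v_scale = quantize_kv(torch.randn(b, hkv, s, d, dtype=torch.float16))
    print(gqa_with_fp8_kv(q, k_fp8, k_scale, v_fp8, v_scale).shape)  # (1, 8, 16, 64)
```

The sketch trades a small dequantization cost for roughly halving KV-cache memory relative to fp16; a fused CUDA kernel would typically dequantize inline rather than materializing full-precision K/V tensors.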

Annotations: 2 warnings

Analyze (javascript) succeeded Feb 14, 2026 in 1m 48s