
Add 5D support for flash_attention #12117

Triggered via pull request February 9, 2025 17:38
Status: Failure
Total duration: 1m 22s

lintercheck.yml

on: pull_request

Annotations

1 error
linter_check: Process completed with exit code 1.
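
For context on the PR title, here is a minimal sketch of one common way "5D support" is added to an attention path: collapse the extra leading dimension into the batch dimension, run the standard 4D attention, then restore the original shape. This is an illustration only; the layout, the helper name flash_attention_5d, and the use of torch.nn.functional.scaled_dot_product_attention are assumptions, not the actual change made in this PR.

```python
# Hypothetical illustration -- not the code from PR #12117.
import torch
import torch.nn.functional as F

def flash_attention_5d(q, k, v):
    # q, k, v: (batch, extra, heads, seq_len, head_dim) -- assumed layout.
    b, e, h, s, d = q.shape
    # Merge the leading (batch, extra) dims so the tensors are 4D.
    q4, k4, v4 = (t.reshape(b * e, h, s, d) for t in (q, k, v))
    # Run the standard 4D attention kernel.
    out = F.scaled_dot_product_attention(q4, k4, v4)
    # Restore the 5D layout expected by the caller.
    return out.reshape(b, e, h, s, d)

# Example usage with small random tensors.
q = torch.randn(2, 3, 4, 16, 8)
k = torch.randn(2, 3, 4, 16, 8)
v = torch.randn(2, 3, 4, 16, 8)
print(flash_attention_5d(q, k, v).shape)  # torch.Size([2, 3, 4, 16, 8])
```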