
Add 5D support for flash_attention #10994

Triggered via pull request: February 9, 2025 17:38
Status: Failure
Total duration: 53m 19s
Artifacts: 3
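
For context on what this run exercises: the PR title suggests extending PyTorch/XLA's flash_attention wrapper, which conventionally takes 4D [batch, num_heads, seq_len, head_dim] inputs, to also accept a 5D layout with one extra leading dimension. Below is a minimal sketch of one plausible approach, folding the extra dimension into the batch before calling a 4D attention kernel. It uses torch.nn.functional.scaled_dot_product_attention as a stand-in for the TPU kernel; the function name, shapes, and strategy here are illustrative assumptions, not the actual implementation in #10994.

```python
import torch
import torch.nn.functional as F

def flash_attention_5d(q, k, v, causal=False):
    # Hypothetical 5D wrapper: inputs are [B, G, H, S, D], where G is an
    # extra leading dimension. A 4D kernel expects [batch, heads, seq,
    # head_dim], so fold G into the batch, run the kernel, and unfold
    # afterwards. This is a sketch, not the PR's actual code.
    B, G, H, S, D = q.shape
    q4, k4, v4 = (t.reshape(B * G, H, S, D) for t in (q, k, v))
    # scaled_dot_product_attention stands in for the TPU flash-attention
    # kernel that PyTorch/XLA would dispatch to on real hardware.
    out = F.scaled_dot_product_attention(q4, k4, v4, is_causal=causal)
    return out.reshape(B, G, H, S, D)

if __name__ == "__main__":
    q = torch.randn(2, 3, 4, 16, 8)  # [B, G, H, S, D]
    k = torch.randn(2, 3, 4, 16, 8)
    v = torch.randn(2, 3, 4, 16, 8)
    out = flash_attention_5d(q, k, v, causal=True)
    print(out.shape)  # torch.Size([2, 3, 4, 16, 8])
```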

Workflow: build_and_test.yml (on: pull_request)
get-torch-commit (2s)
Build PyTorch/XLA / build (34m 24s)
Matrix: CPU tests / test

Annotations

1 error and 1 warning

Error (TPU tests / tpu-test): Process completed with exit code 1.
Warning (TPU tests / tpu-test): This job failure may be caused by an out-of-date self-hosted runner. You are currently using runner version 2.321.0; please update to the latest version, 2.322.0.

Artifacts

Produced during runtime
Name               Size
cpp-test-bin       672 MB
github-pages       5.69 MB
torch-xla-wheels   225 MB