Actions: pytorch/xla

Showing runs from all workflows
19,664 workflow runs

Fix CUDA plugin CI.
Build and test #11002: Pull request #8593 synchronize by ysiraichi
February 11, 2025 15:58 · 2h 21m 6s · fix-cuda-plugin-compilation

Fix CUDA plugin CI.
Linter check #12124: Pull request #8593 synchronize by ysiraichi
February 11, 2025 15:25 · 1m 25s · fix-cuda-plugin-compilation

Fix CUDA plugin CI.
Build and test #11001: Pull request #8593 synchronize by ysiraichi
February 11, 2025 15:25 · 34m 19s · fix-cuda-plugin-compilation

Fix CUDA plugin CI.
Build and test #11000: Pull request #8593 synchronize by ysiraichi
February 11, 2025 14:27 · 58m 54s · fix-cuda-plugin-compilation

Fix CUDA plugin CI.
Linter check #12123: Pull request #8593 synchronize by ysiraichi
February 11, 2025 14:27 · 1m 23s · fix-cuda-plugin-compilation

Support splitting physical axis in SPMD mesh
Build and test #10999: Pull request #8698 opened by tengyifei
February 11, 2025 02:13 · 1h 44m 22s · yifeit/split-physical-axis

Support splitting physical axis in SPMD mesh
Linter check #12122: Pull request #8698 opened by tengyifei
February 11, 2025 02:13 · 1m 24s · yifeit/split-physical-axis

pages build and deployment
pages-build-deployment #1374: by torchxlabot2
February 11, 2025 00:40 · 39s · gh-pages

Add missing NeuronPlugin configure_single_process function (#8694)
Build and test #10998: Commit 065cb5b pushed by jeffhataws
February 11, 2025 00:04 · 1h 45m 5s · master

Add missing NeuronPlugin configure_single_process function (#8694)
Build upstream image #630: Commit 065cb5b pushed by jeffhataws
February 11, 2025 00:04 · 10m 17s · master

Add missing NeuronPlugin configure_single_process function (#8694)
Linter check #12121: Commit 065cb5b pushed by jeffhataws
February 11, 2025 00:04 · 1m 17s · master

pages build and deployment
pages-build-deployment #1373: by torchxlabot2
February 10, 2025 22:18 · 39s · gh-pages

Fix a bug in flash attention where kv_seq_len should divide block_k_m…
Build and test #10997: Commit cff9f4e pushed by zpcore
February 10, 2025 21:42 · 1h 47m 23s · master

Fix a bug in flash attention where kv_seq_len should divide block_k_m…
Build upstream image #629: Commit cff9f4e pushed by zpcore
February 10, 2025 21:42 · 10m 27s · master

Fix a bug in flash attention where kv_seq_len should divide block_k_m…
Linter check #12120: Commit cff9f4e pushed by zpcore
February 10, 2025 21:42 · 1m 20s · master

Split page indices in the ragged paged attention.
Linter check #12119: Pull request #8688 synchronize by vanbasten23
February 10, 2025 18:54 · 1m 20s · xiowei/split_page_indices

Split page indices in the ragged paged attention.
Build and test #10996: Pull request #8688 synchronize by vanbasten23
February 10, 2025 18:54 · 1h 44m 33s · xiowei/split_page_indices

Add 5D support for flash_attention
Build and test #10994: Pull request #8693 opened by qihqi
February 9, 2025 17:38 · 53m 19s · hanq_5d_flash_attention

Add 5D support for flash_attention
Linter check #12117: Pull request #8693 opened by qihqi
February 9, 2025 17:38 · 1m 22s · hanq_5d_flash_attention

Integrate ragged paged attn to pytorch/xla
Build and test #10993: Pull request #8692 synchronize by bythew3i
February 9, 2025 03:32 · 2h 21m 29s · bythew3i:integrate-ragged-paged-attn

Integrate ragged paged attn to pytorch/xla
Linter check #12116: Pull request #8692 synchronize by bythew3i
February 9, 2025 03:32 · 1m 26s · bythew3i:integrate-ragged-paged-attn

Fix a bug in flash attention where kv_seq_len should divide block_k_major.
Linter check #12112: Pull request #8671 synchronize by zhangp365
February 8, 2025 12:55 · 1m 24s · zhangp365:master

Fix a bug in flash attention where kv_seq_len should divide block_k_major.
Build and test #10989: Pull request #8671 synchronize by zhangp365
February 8, 2025 12:55 · 2h 25m 34s · zhangp365:master