
Actions: pytorch/ao

Build Docs

3,560 workflow runs


float8 training axiswise scaling support with per-gemm-argument configuration
Build Docs #3774: Pull request #940 synchronize by vkuzo
October 5, 2024 20:28 3m 32s gh/vkuzo/12/head
[wip] make scaling configurable by gemm-argument
Build Docs #3773: Commit 371775a pushed by vkuzo
October 5, 2024 20:28 2m 59s gh/vkuzo/12/orig
add axiswise scaling to Float8Linear
Build Docs #3772: Commit 3f24d79 pushed by vkuzo
October 5, 2024 20:28 2m 52s gh/vkuzo/11/orig
add axiswise granularity to Float8Tensor
Build Docs #3771: Commit afa36c8 pushed by vkuzo
October 5, 2024 20:28 3m 4s gh/vkuzo/10/orig
Update
Build Docs #3770: Commit 712fd5d pushed by vkuzo
October 5, 2024 20:28 2m 47s gh/vkuzo/12/next
Update
Build Docs #3769: Commit 712fd5d pushed by vkuzo
October 5, 2024 20:28 2m 34s gh/vkuzo/12/head
[WIP] Activation Aware Weight Quantization (AWQ)
Build Docs #3768: Pull request #743 synchronize by vayuda
October 5, 2024 16:19 3m 39s vayuda:awq
[WIP] Activation Aware Weight Quantization (AWQ)
Build Docs #3766: Pull request #743 synchronize by vayuda
October 5, 2024 05:26 10m 57s vayuda:awq
[WIP] Activation Aware Weight Quantization (AWQ)
Build Docs #3765: Pull request #743 synchronize by vayuda
October 5, 2024 05:10 3m 37s vayuda:awq
[WIP] Activation Aware Weight Quantization (AWQ)
Build Docs #3764: Pull request #743 synchronize by vayuda
October 5, 2024 05:10 50s vayuda:awq
[WIP] Activation Aware Weight Quantization (AWQ)
Build Docs #3763: Pull request #743 synchronize by vayuda
October 5, 2024 05:09 25s vayuda:awq
[WIP] Activation Aware Weight Quantization (AWQ)
Build Docs #3762: Pull request #743 synchronize by vayuda
October 5, 2024 05:06 3m 41s vayuda:awq
[WIP] Activation Aware Weight Quantization (AWQ)
Build Docs #3761: Pull request #743 synchronize by vayuda
October 5, 2024 05:03 2m 26s vayuda:awq
[WIP] Activation Aware Weight Quantization (AWQ)
Build Docs #3760: Pull request #743 synchronize by vayuda
October 5, 2024 05:00 3m 30s vayuda:awq
[WIP] Activation Aware Weight Quantization (AWQ)
Build Docs #3759: Pull request #743 synchronize by vayuda
October 5, 2024 04:54 3m 33s vayuda:awq
Make module swap the main QAT flow again
Build Docs #3757: Commit 3deb592 pushed by andrewor14
October 4, 2024 22:52 2m 47s gh/andrewor14/3/orig
Add generic fake quantized linear for QAT
Build Docs #3756: Commit 77b7657 pushed by andrewor14
October 4, 2024 22:52 2m 54s gh/andrewor14/4/orig
Add generic fake quantized linear for QAT
Build Docs #3755: Pull request #1020 opened by andrewor14
October 4, 2024 22:52 3m 26s gh/andrewor14/4/head
Add generic fake quantized linear for QAT
Build Docs #3754: Commit 9e9fdef pushed by andrewor14
October 4, 2024 22:52 2m 47s gh/andrewor14/4/head
Make module swap the main QAT flow again
Build Docs #3753: Commit 0756f39 pushed by andrewor14
October 4, 2024 22:52 2m 57s gh/andrewor14/4/base
Make module swap the main QAT flow again
Build Docs #3752: Pull request #1019 opened by andrewor14
October 4, 2024 22:52 3m 34s gh/andrewor14/3/head
Make module swap the main QAT flow again
Build Docs #3751: Commit 0756f39 pushed by andrewor14
October 4, 2024 22:52 2m 50s gh/andrewor14/3/head