
add int8 quantization support for llm models #2170
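
For readers landing on this check run: below is a minimal sketch of what int8 post-training quantization of an LLM-style block can look like in PyTorch. The module (TinyFFN) and the choice of dynamic quantization are illustrative assumptions; this page does not show the PR's actual implementation or quantization scheme.

import torch
import torch.nn as nn

class TinyFFN(nn.Module):
    """Hypothetical stand-in for a transformer feed-forward block."""
    def __init__(self, d_model: int = 64, d_ff: int = 256):
        super().__init__()
        self.up = nn.Linear(d_model, d_ff)
        self.act = nn.GELU()
        self.down = nn.Linear(d_ff, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.down(self.act(self.up(x)))

model = TinyFFN().eval()

# Post-training dynamic quantization: nn.Linear weights are stored as int8;
# activations are quantized on the fly at inference time.
quantized = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(2, 16, 64)
print(quantized(x).shape)  # torch.Size([2, 16, 64])

How (or whether) the quantized module is then exported to TorchScript for the test job below is not visible on this page.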

Annotations: 2 warnings

L0 torchscript tests / L0-torchscript-tests--3.10-cu130: succeeded on Feb 20, 2026 in 20m 5s