When converting an int8-quantized model to ONNX, an error occurs at runtime #6473


Workflow file for this run

name: "Issue Labeler"
on:
  issues:
    types: [opened, edited]
permissions:
  issues: write
jobs:
  triage:
    runs-on: ["self-hosted", "1ES.Pool=onnxruntime-github-Ubuntu2204-AMD-CPU"]
    steps:
      - uses: github/[email protected]
        with:
          repo-token: "${{ secrets.GITHUB_TOKEN }}"
          configuration-path: .github/labeler.yml
          not-before: 2020-01-15T02:54:32Z
          enable-versioned-regex: 0
          include-title: 1
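
The with: block points the action at a separate rules file, .github/labeler.yml, and disables the versioned-regex format. In that mode, github/issue-labeler reads the file as a plain mapping from label names to lists of regular expressions, which are matched against the issue body and, because include-title is set to 1, against the issue title as well; the not-before timestamp keeps the action from relabeling issues opened before that date. The sketch below only illustrates the shape of such a file; the label names and patterns are hypothetical, not the repository's actual configuration.

# Hypothetical .github/labeler.yml (unversioned-regex format):
# each top-level key is a label, each list entry is a regex tried against the issue text.
quantization:
  - '\bint8\b'
  - '\bquantiz'
'ep:CUDA':
  - '\bcuda\b'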