
Error inferring YOLOv8 (node_arg.cpp failure) with Vitis AI provider  #17534

Open
@farah-rostom98

Description

Describe the issue

Hello,

I have a quantized YOLOv8 model that I want to run on a Kria KV260 target with a DPUCZDX8G-B409 DPU, using the Vitis AI execution provider. During quantization I excluded the nodes that are not supported by the DPU architecture: reshape, resize, slice, split, divide, and subtract. The model runs successfully with ONNX Runtime on my local laptop on the CPU. However, when I try to run it on the target, it aborts with: node_arg.cpp:329, unknown type: 2, check failure stack trace, Aborted. I switched the logging to verbose to get more information and am attaching the log output. I also tried to look inside node_arg.cpp to debug this myself, but could not find any clues.
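For context, the exclusion step was roughly as follows. This is a simplified sketch using ONNX Runtime's own static quantizer and its `nodes_to_exclude` parameter; I actually used the Vitis AI quantizer, which exposes a similar exclusion option, and the model paths, node names, and calibration reader below are placeholders rather than my real configuration.

```python
# Simplified sketch: keep DPU-unsupported ops out of quantization via
# onnxruntime.quantization.quantize_static. Paths, node names and the
# calibration reader are placeholders, not the real configuration.
import numpy as np
from onnxruntime.quantization import (
    CalibrationDataReader,
    QuantFormat,
    QuantType,
    quantize_static,
)


class DummyReader(CalibrationDataReader):
    """Placeholder calibration reader feeding random 1x3x640x640 frames."""

    def __init__(self, input_name="images", count=8):
        self._data = iter(
            [{input_name: np.random.rand(1, 3, 640, 640).astype(np.float32)}
             for _ in range(count)]
        )

    def get_next(self):
        return next(self._data, None)


quantize_static(
    model_input="yolov8.onnx",         # float model (placeholder path)
    model_output="yolov8_quant.onnx",  # quantized model (placeholder path)
    calibration_data_reader=DummyReader(),
    quant_format=QuantFormat.QDQ,
    activation_type=QuantType.QInt8,
    weight_type=QuantType.QInt8,
    # Keep the ops the DPU cannot execute out of quantization, as described above.
    nodes_to_exclude=["/model.22/Reshape", "/model.22/Slice"],  # placeholder names
)
```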

I would appreciate any help debugging this. I have attached the quantized model and the inference script for reference.

[Attached screenshots of the verbose log: MicrosoftTeams-image (1), MicrosoftTeams-image (2)]

To reproduce

1. Move the quantized model to the kit.
2. Use the attached script (below) to create a session; a trimmed sketch of the session setup follows.

Yolov8.zip
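The attached script boils down to roughly the following. This is a trimmed sketch: the model path, input shape, and logging level are placeholders/assumptions rather than my exact code.

```python
# Trimmed sketch of the session setup on the KV260. The model path and input
# shape are placeholders; log_severity_level=0 gives the verbose log attached above.
import numpy as np
import onnxruntime as ort

model_path = "yolov8_quant.onnx"  # placeholder path to the quantized model

so = ort.SessionOptions()
so.log_severity_level = 0  # 0 = verbose

sess = ort.InferenceSession(
    model_path,
    sess_options=so,
    # VitisAIExecutionProvider handles the DPU-compatible subgraphs; the
    # excluded ops are expected to fall back to the CPU execution provider.
    providers=["VitisAIExecutionProvider", "CPUExecutionProvider"],
)
print("Providers in use:", sess.get_providers())

# Dummy 640x640 frame, just to trigger graph partitioning and execution.
inp = sess.get_inputs()[0]
dummy = np.random.rand(1, 3, 640, 640).astype(np.float32)
outputs = sess.run(None, {inp.name: dummy})
print([o.shape for o in outputs])
```

The abort reported above happens on the target when this session is created/run.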

Urgency

I am currently working on a project assessing the performance of FPGAs versus GPUs, and this issue is blocking me: the aim of this task is to estimate how the FPGA scales compared to the GPU.

Platform

Linux

OS Version

PetaLinux, Vitis 3.0

ONNX Runtime Installation

Built from Source

ONNX Runtime Version or Commit ID

1.14.0

ONNX Runtime API

C++

Architecture

ARM64

Execution Provider

Vitis AI

Execution Provider Library Version

No response

Metadata

Assignees

No one assigned

    Labels

    ep:VitisAI (issues related to the Vitis AI execution provider)
    quantization (issues related to quantization)
    stale (issues that have not been addressed in a while; categorized by a bot)
