
[Training] Support for RKNPU Execution Provider on RK3562 Platform and On-Device Training Capabilities #21060

Open
@Leo5050xvjf

Description

Describe the issue

Hi,
I have been reading the ONNX Runtime documentation and came across the RKNPU Execution Provider (EP). I would like to confirm my understanding:

1. The documentation suggests that the RKNPU EP currently supports only the RK1808 Linux platform. Does this mean I cannot use the RKNPU EP on my RK3562 platform?
2. If so, can I still run ONNX model inference on the RK3562 using the CPU?

To further clarify my requirements:

- I intend to use ONNX Runtime with the RKNPU EP for model inference on the RK3562 platform.
- I also plan to use ONNX Runtime's On-Device Training feature to train models on the RK3562, and would like to use the RKNPU EP for inference with the trained models.

Does the current version of ONNX Runtime support these requirements? Thank you for the clarification.

To reproduce

No

Urgency

No response

ONNX Runtime Installation

Built from Source

ONNX Runtime Version or Commit ID

1.18

PyTorch Version

1.7

Execution Provider

Other / Unknown

Execution Provider Library Version

RKNN EP

    Labels

    ep:RockchipNPU: issues related to Rockchip execution provider
    stale: issues that have not been addressed in a while; categorized by a bot
    training: issues related to ONNX Runtime training; typically submitted using template
