
[Feature Request] Support optrace profiling with QNNExecutionProvider in the published onnxruntime-qnn wheels #27060


Description

@martinResearch

Describe the feature request

It appears that the latest onnxruntime-qnn package on pypi.org (version 1.23.2, released in October 2025) does not support optrace-level profiling when using the QNNExecutionProvider. Is this expected?

The QNNExecutionProvider documentation (optrace-level-profiling section) states: "This feature is only available with the QAIRT 2.39 SDK and later." However, it does not specify which QAIRT SDK version the published onnxruntime-qnn Python wheels were built against, whether optrace-level profiling is supported in those wheels, or, if so, starting from which onnxruntime-qnn version. Stating this explicitly in the documentation would be very helpful.
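For reference, this is roughly what I am trying to run, following the provider options listed in the QNN EP documentation (profiling_level, profiling_file_path). The model path and input handling are placeholders, and whether the published wheel accepts "optrace" here is exactly what this issue is asking:

```python
import numpy as np
import onnxruntime as ort

# QNNExecutionProvider options as described in the documentation; whether the
# published onnxruntime-qnn wheel accepts "optrace" is the open question.
qnn_options = {
    "backend_path": "QnnHtp.dll",             # HTP (NPU) backend bundled with the wheel
    "profiling_level": "optrace",             # documented levels also include "basic" and "detailed"
    "profiling_file_path": "qnn_profile.csv",
}

session = ort.InferenceSession(
    "model.onnx",                             # placeholder model path
    providers=[("QNNExecutionProvider", qnn_options)],
)

# Run one inference with zero-valued inputs (float32 assumed) so that profiling
# data is actually emitted.
feeds = {
    inp.name: np.zeros(
        [d if isinstance(d, int) else 1 for d in inp.shape], dtype=np.float32
    )
    for inp in session.get_inputs()
}
session.run(None, feeds)
```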

Thank you!

Describe scenario use case

I want to profile my models on the NPU and get detailed traces that I can visualize in Perfetto.
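For context, ONNX Runtime's built-in session profiling already produces a chrome-trace JSON that Perfetto can open; the workflow I am hoping for is the same, but with optrace-level detail coming from the QNN/HTP backend. A minimal sketch of that baseline (the model path is a placeholder):

```python
import onnxruntime as ort

so = ort.SessionOptions()
so.enable_profiling = True                    # write a chrome-trace JSON profile
so.profile_file_prefix = "ort_qnn_profile"    # output file is this prefix plus a timestamp

session = ort.InferenceSession(
    "model.onnx",                             # placeholder model path
    sess_options=so,
    providers=[
        ("QNNExecutionProvider", {"backend_path": "QnnHtp.dll"}),
        "CPUExecutionProvider",
    ],
)

# ... run inference here ...

trace_path = session.end_profiling()          # returns the JSON path; open it at ui.perfetto.dev
print(trace_path)
```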

Metadata

    Labels

    ep:QNN (issues related to QNN execution provider)
    feature request (request for unsupported feature or enhancement)
