
Why is the console messed up when using onnxruntime.InferenceSession? #23270

@Septemberlemon

Description


Describe the issue

I've installed onnxruntime-gpu with pip, and then I run code like this:

import onnxruntime

sess = onnxruntime.InferenceSession("checkpoint1.onnx", providers=["CUDAExecutionProvider"])

Then the console fills up with a bunch of gibberish. What causes this, and how can I fix it?
[Screenshot: console output filled with garbled characters]
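
To check whether the extra characters come from ONNX Runtime's own logger rather than from Python itself, the logger can be limited to errors only. This is a minimal sketch, assuming checkpoint1.onnx is a valid model in the working directory:

import onnxruntime

# Severity levels: 0 = verbose, 1 = info, 2 = warning, 3 = error, 4 = fatal
onnxruntime.set_default_logger_severity(3)

so = onnxruntime.SessionOptions()
so.log_severity_level = 3  # also silence per-session info/warning output

sess = onnxruntime.InferenceSession(
    "checkpoint1.onnx",
    sess_options=so,
    providers=["CUDAExecutionProvider"],
)
print(sess.get_providers())  # providers actually assigned to the session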

To reproduce

I am using Python 3.11.3. Just run the code above and the issue appears.
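
For completeness, here is a slightly fuller reproduction sketch that also prints the environment details (Python version, onnxruntime version, and the providers the build reports as available); it assumes checkpoint1.onnx sits in the working directory:

import sys
import onnxruntime

# Report the environment the issue was reproduced in
print(sys.version)                            # Python 3.11.3
print(onnxruntime.__version__)                # 1.20.1 (onnxruntime-gpu build)
print(onnxruntime.get_available_providers())  # should list CUDAExecutionProvider

# Creating the session is what triggers the garbled console output
sess = onnxruntime.InferenceSession("checkpoint1.onnx", providers=["CUDAExecutionProvider"])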

Urgency

No response

Platform

Windows

OS Version

Windows 10 Professional, 22H2

ONNX Runtime Installation

Other / Unknown

ONNX Runtime Version or Commit ID

onnxruntime-gpu 1.20.1

ONNX Runtime API

Python

Architecture

X64

Execution Provider

CUDA

Execution Provider Library Version

CUDA 12.6
