Labels
stale: issues that have not been addressed in a while; categorized by a bot
Description
Describe the issue
Hi,
I am using onnxruntime to run ONNX model inference in my project. The project can be installed as a library, and I aim to support environments with both CPU-only and GPU capabilities. When both the onnxruntime and onnxruntime-gpu packages are installed, I cannot use the CUDAExecutionProvider:
Specified provider 'CUDAExecutionProvider' is not in available provider names. Available providers: 'AzureExecutionProvider, CPUExecutionProvider'
It would be great if these packages could coexist without falling back to CPUExecutionProvider.
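In the meantime, an application-side mitigation is to select providers from what the current installation actually exposes rather than hard-coding CUDAExecutionProvider. A minimal sketch; the pick_providers helper is hypothetical (not part of the onnxruntime API), and it expects the list returned by onnxruntime.get_available_providers():

```python
def pick_providers(available):
    """Return the preferred providers present in `available`.

    `available` is expected to be the list returned by
    onnxruntime.get_available_providers(). CUDA is preferred when
    exposed; otherwise the session falls back to CPU.
    """
    preferred = ["CUDAExecutionProvider", "CPUExecutionProvider"]
    chosen = [p for p in preferred if p in available]
    # CPUExecutionProvider ships with the released wheels, but guard anyway.
    return chosen or ["CPUExecutionProvider"]
```

At session creation this would look like onnxruntime.InferenceSession(model_path, providers=pick_providers(onnxruntime.get_available_providers())), which avoids the "not in available provider names" error at the cost of silently running on CPU when CUDA is missing.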
To reproduce
Install both onnxruntime and onnxruntime-gpu packages on a GPU device. I am using a Tesla T4 GPU.
import onnxruntime

model_path = "your_onnx_model_name.onnx"
providers = ["CUDAExecutionProvider"]
session = onnxruntime.InferenceSession(
    model_path,
    providers=providers,
)
This should raise the error I described above.
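The two wheels install into the same onnxruntime package directory, so the files of whichever was installed last win; with the CPU wheel on top, the CUDA provider is simply not importable. A small diagnostic to confirm both distributions are present in an environment (detect_ort_conflict is a hypothetical helper that only inspects installed distribution names):

```python
from importlib import metadata

def detect_ort_conflict(dist_names=None):
    """Return the onnxruntime distributions found among installed packages.

    More than one entry indicates the conflicting dual install described
    in this issue.
    """
    if dist_names is None:
        # Collect the names of all installed distributions.
        dist_names = [d.metadata["Name"] for d in metadata.distributions()]
    ort = {"onnxruntime", "onnxruntime-gpu"}
    return sorted({n for n in dist_names if n and n.lower() in ort})
```

Passing an explicit list makes the helper testable; calling it with no arguments inspects the running environment.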
Urgency
Not urgent.
Platform
Linux
OS Version
Linux-6.6.105+-x86_64-with-glibc2.39
ONNX Runtime Installation
Released Package
ONNX Runtime Version or Commit ID
1.23.2
ONNX Runtime API
Python
Architecture
X64
Execution Provider
CUDA
Execution Provider Library Version
CUDA 13.0