Conversation

@alexsikomas (Contributor)
Summary

Adds a new documentation file, BUILDING-CUDA-LINUX.md, detailing how to compile and install ONNXRuntime for newer versions of CUDA, and how to use the resulting build with obs-backgroundremoval.

Motivation

I couldn't find any direct guide on compiling ONNXRuntime and getting it to work with obs-backgroundremoval, so it took a lot of trial and error to figure out. I wanted to share what I've learned; feel free to expand on it.

Verification

I successfully compiled ONNXRuntime, and subsequently obs-backgroundremoval, using these steps on:

  • OS: Arch Linux 6.17.7-arch1-2
  • CUDA: 13.0
  • GPU: RTX 3060

Issues

Loading a model onto the GPU can take a while. This is only surprising because the same delay does not occur with CPU inference; I'm not sure whether it is an issue or expected behaviour.
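For what it's worth, a long first load is typical of the TensorRT execution provider, which builds an optimized engine when the session is initialized. A minimal sketch of how that cost can be amortized via the TensorRT EP's engine cache, using ONNX Runtime's Python provider-options format (the plugin itself uses the C++ API, and `model.onnx` / `./trt_cache` here are placeholder paths, not anything from this PR):

```python
# Sketch: provider options enabling the TensorRT EP's engine cache, so the
# engine is built once and reused on subsequent loads. Passing this list to
# onnxruntime.InferenceSession(...) requires a TensorRT-enabled ONNX Runtime
# build, so that call is shown commented out.
providers = [
    ("TensorrtExecutionProvider", {
        "trt_engine_cache_enable": True,       # persist built engines
        "trt_engine_cache_path": "./trt_cache",  # placeholder cache directory
    }),
    "CUDAExecutionProvider",  # fallback for nodes TensorRT cannot handle
]
# import onnxruntime as ort
# sess = ort.InferenceSession("model.onnx", providers=providers)
print(providers[0][0])
```

The same `trt_engine_cache_enable` / `trt_engine_cache_path` options exist on the C++ side as TensorRT EP provider options, if someone wants to wire this into the plugin.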

RMBG and RobustVideoMatting both give errors; the other models work as expected.

This is the error RobustVideoMatting gives:

2025-11-17 14:18:11.562711140 [E:onnxruntime:background-removal-inference, tensorrt_execution_provider.h:90 log] [2025-11-17 14:18:11   ERROR] [graphShapeAnalyzer.cpp::checkCalculationStatusSanity::2126] Error Code 2: Internal Error (Assertion !isInFlight(p.second.symbolicRep) failed.  In checkCalculationStatusSanity at /_src/optimizer/shapeof/graphShapeAnalyzer.cpp:2126)
2025-11-17 14:18:11.562769910 [E:onnxruntime:background-removal-inference, tensorrt_execution_provider.h:90 log] [2025-11-17 14:18:11   ERROR] ModelImporter.cpp:135: While parsing node number 2 [Resize -> "389"]:
2025-11-17 14:18:11.564561504 [E:onnxruntime:background-removal-inference, tensorrt_execution_provider.h:90 log] [2025-11-17 14:18:11   ERROR] ModelImporter.cpp:138: --- Begin node ---
input: "src"
input: "386"
input: "388"
output: "389"
name: "Resize_3"
op_type: "Resize"
attribute {
  name: "extrapolation_value"
  f: 0
  type: FLOAT
}
attribute {
  name: "cubic_coeff_a"
  f: -0.75
  type: FLOAT
}
attribute {
  name: "nearest_mode"
  s: "floor"
  type: STRING
}
attribute {
  name: "mode"
  s: "linear"
  type: STRING
}
attribute {
  name: "coordinate_transformation_mode"
  s: "pytorch_half_pixel"
  type: STRING
}
attribute {
  name: "exclude_outside"
  i: 0
  type: INT
}

2025-11-17 14:18:11.564591405 [E:onnxruntime:background-removal-inference, tensorrt_execution_provider.h:90 log] [2025-11-17 14:18:11   ERROR] ModelImporter.cpp:139: --- End node ---
2025-11-17 14:18:11.564602365 [E:onnxruntime:background-removal-inference, tensorrt_execution_provider.h:90 log] [2025-11-17 14:18:11   ERROR] ModelImporter.cpp:141: ERROR: ModelImporter.cpp:368 In function parseNode:
[6] Invalid Node - Resize_3
[graphShapeAnalyzer.cpp::checkCalculationStatusSanity::2126] Error Code 2: Internal Error (Assertion !isInFlight(p.second.symbolicRep) failed.  In checkCalculationStatusSanity at /_src/optimizer/shapeof/graphShapeAnalyzer.cpp:2126)
2025-11-17 14:18:11.582141284 [E:onnxruntime:, inference_session.cc:2521 operator()] Exception during initialization: /home/alex/Documents/Applications/libs/onnxruntime/onnxruntime/core/providers/tensorrt/tensorrt_execution_provider.cc:2318 SubGraphCollection_t onnxruntime::TensorrtExecutionProvider::GetSupportedList(SubGraphCollection_t, int, int, const onnxruntime::GraphViewer&, bool*) const [ONNXRuntimeError] : 1 : FAIL : TensorRT input: 389 has no shape specified. Please run shape inference on the onnx model first. Details can be found in https://onnxruntime.ai/docs/execution-providers/TensorRT-ExecutionProvider.html#shape-inference-for-tensorrt-subgraphs

error: [obs-backgroundremoval] Exception during initialization: /home/alex/Documents/Applications/libs/onnxruntime/onnxruntime/core/providers/tensorrt/tensorrt_execution_provider.cc:2318 SubGraphCollection_t onnxruntime::TensorrtExecutionProvider::GetSupportedList(SubGraphCollection_t, int, int, const onnxruntime::GraphViewer&, bool*) const [ONNXRuntimeError] : 1 : FAIL : TensorRT input: 389 has no shape specified. Please run shape inference on the onnx model first. Details can be found in https://onnxruntime.ai/docs/execution-providers/TensorRT-ExecutionProvider.html#shape-inference-for-tensorrt-subgraphs

alexsikomas and others added 3 commits November 17, 2025 14:04
Without vcpkg, linking ONNXRuntime to the plugin can cause errors where its dependencies also need to be linked explicitly.

The vcpkg flag fixes this issue.
@umireon umireon requested a review from royshil November 17, 2025 21:57
@umireon (Collaborator) commented Nov 17, 2025

Thanks for the wonderful documentation!

@royshil royshil merged commit 2694ffd into royshil:main Nov 17, 2025
9 checks passed
@umireon umireon mentioned this pull request Nov 21, 2025
sobalap pushed a commit to sobalap/obs-backgroundremoval that referenced this pull request Jan 7, 2026