Extend ONNX FE with Col2Im operator #33386

Open
daehyun99 wants to merge 3 commits into openvinotoolkit:master from daehyun99:ENH/30144-1

Conversation

@daehyun99
Contributor

@daehyun99 daehyun99 commented Dec 29, 2025

Details:

  • Extend ONNX FE with Col2Im operator
  • Unblocked Python tests in src/frontends/onnx/tests/tests_python/test_backend.py
  • Added test code and prototxt models for testing Col2Im
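For context, Col2Im scatter-adds sliding-block columns back into an image (the inverse of Im2Col). Below is a minimal pure-Python sketch of the 2D semantics the new tests exercise; the function name and the nested-list layout are illustrative only and are not taken from the frontend code:

```python
def col2im(cols, image_shape, block_shape,
           dilations=(1, 1), pads=(0, 0, 0, 0), strides=(1, 1)):
    """Scatter-add column blocks back into an image (2D case).

    cols: nested list of shape [N][C*kh*kw][L]
    image_shape: (H, W); block_shape: (kh, kw)
    pads: (pad_top, pad_left, pad_bottom, pad_right)
    """
    H, W = image_shape
    kh, kw = block_shape
    dh, dw = dilations
    pt, pl, pb, pr = pads
    sh, sw = strides
    # number of sliding-block positions along each spatial axis
    out_h = (H + pt + pb - (dh * (kh - 1) + 1)) // sh + 1
    out_w = (W + pl + pr - (dw * (kw - 1) + 1)) // sw + 1
    N = len(cols)
    C = len(cols[0]) // (kh * kw)
    img = [[[[0.0] * W for _ in range(H)] for _ in range(C)] for _ in range(N)]
    for n in range(N):
        for c in range(C):
            for ki in range(kh):
                for kj in range(kw):
                    row = (c * kh + ki) * kw + kj
                    for bi in range(out_h):
                        for bj in range(out_w):
                            # map block position + in-block offset to pixel
                            y = bi * sh + ki * dh - pt
                            x = bj * sw + kj * dw - pl
                            if 0 <= y < H and 0 <= x < W:
                                img[n][c][y][x] += cols[n][row][bi * out_w + bj]
    return img
```

With overlapping blocks (e.g. a 2x2 kernel over a 3x3 image at stride 1), interior pixels accumulate multiple contributions, which is why the tests cover the strides/pads/dilations variations separately.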

Issue number(s) that this pull request fixes

Tickets:

@github-actions github-actions bot added the category: ONNX FE OpenVINO ONNX FrontEnd label Dec 29, 2025
@sys-openvino-ci sys-openvino-ci added the ExternalPR External contributor label Dec 29, 2025
@daehyun99
Contributor Author

Hi all,
I’m planning to request a formal review after adding the test code and prototxt files.

In the meantime, feel free to take a quick look and let me know if there’s anything I might have missed or if you have any suggestions for improvement.

Thank you.

@daehyun99 daehyun99 marked this pull request as ready for review January 4, 2026 17:17
@daehyun99 daehyun99 requested a review from a team as a code owner January 4, 2026 17:17
@daehyun99
Contributor Author

daehyun99 commented Jan 4, 2026

Hi maintainers, I have completed the task and included the test code.

As the INTERPRETER tests now pass, I believe the frontend implementation is working correctly.

Regarding the IE_CPU tests, I confirmed an error in the Col2Im logic.
Since this issue appears to be independent of the frontend, I resolved it by modifying Col2Im::executeDynamicImpl; see issue #33472 and PR #33473.

Please note that the IE_GPU tests show as failed because I ran them in a Colab environment without an Intel GPU.

Please review the code when you have a moment.

Thanks.

  • Test Result Log

!/content/openvino/bin/intel64/Debug/ov_onnx_frontend_tests --gtest_filter=*col2im*

Running main() from /content/openvino/src/frontends/tests/frontend/shared/gtest_main_manifest/main.cpp:20
Note: Google Test filter = *col2im*:-:ONNXLoadTest/FrontEndLoadFromTest.testLoadFromTwoFiles/onnx:ONNXLoadTest/FrontEndLoadFromTest.testLoadFromTwoStreams/onnx
[==========] Running 18 tests from 3 test suites.
[----------] Global test environment set-up.
[----------] 6 tests from INTERPRETER
[ RUN      ] INTERPRETER.onnx_col2im_default
[   INFO   ] Verifying match of <= 22 mantissa bits (24 bits precision - 2 tolerance). 0 value(s) below min_signal: 0 Loosest match found is 24 mantissa bits.
[       OK ] INTERPRETER.onnx_col2im_default (68 ms)
[ RUN      ] INTERPRETER.onnx_col2im_dilations
[   INFO   ] Verifying match of <= 22 mantissa bits (24 bits precision - 2 tolerance). 0 value(s) below min_signal: 0 Loosest match found is 24 mantissa bits.
[       OK ] INTERPRETER.onnx_col2im_dilations (53 ms)
[ RUN      ] INTERPRETER.onnx_col2im_pads
[   INFO   ] Verifying match of <= 22 mantissa bits (24 bits precision - 2 tolerance). 0 value(s) below min_signal: 0 Loosest match found is 24 mantissa bits.
[       OK ] INTERPRETER.onnx_col2im_pads (52 ms)
[ RUN      ] INTERPRETER.onnx_col2im_strides
[   INFO   ] Verifying match of <= 22 mantissa bits (24 bits precision - 2 tolerance). 0 value(s) below min_signal: 0 Loosest match found is 24 mantissa bits.
[       OK ] INTERPRETER.onnx_col2im_strides (51 ms)
[ RUN      ] INTERPRETER.onnx_col2im_batch
[   INFO   ] Verifying match of <= 22 mantissa bits (24 bits precision - 2 tolerance). 0 value(s) below min_signal: 0 Loosest match found is 24 mantissa bits.
[       OK ] INTERPRETER.onnx_col2im_batch (55 ms)
[ RUN      ] INTERPRETER.onnx_col2im_channel
[   INFO   ] Verifying match of <= 22 mantissa bits (24 bits precision - 2 tolerance). 0 value(s) below min_signal: 0 Loosest match found is 24 mantissa bits.
[       OK ] INTERPRETER.onnx_col2im_channel (55 ms)
[----------] 6 tests from INTERPRETER (334 ms total)

@daehyun99
Contributor Author

Hmm, should I apply clang-format to the test code?
I think it reduces code readability.

daehyun99 added a commit to daehyun99/openvino that referenced this pull request Jan 15, 2026
…nvinotoolkit#33473)

### Details:
 - Add dynamic logic of `Col2Im`
 - Please note that `IE_GPU` tests were not conducted as I am working in
a Colab environment.
 - This PR is related to openvinotoolkit#33386

### Tickets:
 - Fixes openvinotoolkit#33472

<details>
<summary> Test Result Logs </summary>

> !/content/openvino/bin/intel64/Debug/ov_onnx_frontend_tests
--gtest_filter=*col2im*

```sh
Running main() from /content/openvino/src/frontends/tests/frontend/shared/gtest_main_manifest/main.cpp:20
Note: Google Test filter = *col2im*:-:ONNXLoadTest/FrontEndLoadFromTest.testLoadFromTwoFiles/onnx:ONNXLoadTest/FrontEndLoadFromTest.testLoadFromTwoStreams/onnx
[==========] Running 18 tests from 3 test suites.
[----------] Global test environment set-up.
[----------] 6 tests from INTERPRETER
[ RUN      ] INTERPRETER.onnx_col2im_default
[   INFO   ] Verifying match of <= 22 mantissa bits (24 bits precision - 2 tolerance). 0 value(s) below min_signal: 0 Loosest match found is 24 mantissa bits.
[       OK ] INTERPRETER.onnx_col2im_default (68 ms)
[ RUN      ] INTERPRETER.onnx_col2im_dilations
[   INFO   ] Verifying match of <= 22 mantissa bits (24 bits precision - 2 tolerance). 0 value(s) below min_signal: 0 Loosest match found is 24 mantissa bits.
[       OK ] INTERPRETER.onnx_col2im_dilations (53 ms)
[ RUN      ] INTERPRETER.onnx_col2im_pads
[   INFO   ] Verifying match of <= 22 mantissa bits (24 bits precision - 2 tolerance). 0 value(s) below min_signal: 0 Loosest match found is 24 mantissa bits.
[       OK ] INTERPRETER.onnx_col2im_pads (52 ms)
[ RUN      ] INTERPRETER.onnx_col2im_strides
[   INFO   ] Verifying match of <= 22 mantissa bits (24 bits precision - 2 tolerance). 0 value(s) below min_signal: 0 Loosest match found is 24 mantissa bits.
[       OK ] INTERPRETER.onnx_col2im_strides (51 ms)
[ RUN      ] INTERPRETER.onnx_col2im_batch
[   INFO   ] Verifying match of <= 22 mantissa bits (24 bits precision - 2 tolerance). 0 value(s) below min_signal: 0 Loosest match found is 24 mantissa bits.
[       OK ] INTERPRETER.onnx_col2im_batch (55 ms)
[ RUN      ] INTERPRETER.onnx_col2im_channel
[   INFO   ] Verifying match of <= 22 mantissa bits (24 bits precision - 2 tolerance). 0 value(s) below min_signal: 0 Loosest match found is 24 mantissa bits.
[       OK ] INTERPRETER.onnx_col2im_channel (55 ms)
[----------] 6 tests from INTERPRETER (334 ms total)

[----------] 6 tests from IE_GPU
[ RUN      ] IE_GPU.onnx_col2im_default
unknown file: Failure
C++ exception with description "Exception from src/inference/src/cpp/core.cpp:116:
Exception from src/inference/src/dev/plugin.cpp:53:
Check 'contexts.count(device_id)' failed at src/plugins/intel_gpu/src/plugin/plugin.cpp:287:
[GPU] Context was not initialized for 0 device


" thrown in the test body.
[  FAILED  ] IE_GPU.onnx_col2im_default (33 ms)
[ RUN      ] IE_GPU.onnx_col2im_dilations
unknown file: Failure
C++ exception with description "Exception from src/inference/src/cpp/core.cpp:116:
Exception from src/inference/src/dev/plugin.cpp:53:
Check 'contexts.count(device_id)' failed at src/plugins/intel_gpu/src/plugin/plugin.cpp:287:
[GPU] Context was not initialized for 0 device


" thrown in the test body.
[  FAILED  ] IE_GPU.onnx_col2im_dilations (8 ms)
[ RUN      ] IE_GPU.onnx_col2im_pads
unknown file: Failure
C++ exception with description "Exception from src/inference/src/cpp/core.cpp:116:
Exception from src/inference/src/dev/plugin.cpp:53:
Check 'contexts.count(device_id)' failed at src/plugins/intel_gpu/src/plugin/plugin.cpp:287:
[GPU] Context was not initialized for 0 device


" thrown in the test body.
[  FAILED  ] IE_GPU.onnx_col2im_pads (8 ms)
[ RUN      ] IE_GPU.onnx_col2im_strides
unknown file: Failure
C++ exception with description "Exception from src/inference/src/cpp/core.cpp:116:
Exception from src/inference/src/dev/plugin.cpp:53:
Check 'contexts.count(device_id)' failed at src/plugins/intel_gpu/src/plugin/plugin.cpp:287:
[GPU] Context was not initialized for 0 device


" thrown in the test body.
[  FAILED  ] IE_GPU.onnx_col2im_strides (10 ms)
[ RUN      ] IE_GPU.onnx_col2im_batch
unknown file: Failure
C++ exception with description "Exception from src/inference/src/cpp/core.cpp:116:
Exception from src/inference/src/dev/plugin.cpp:53:
Check 'contexts.count(device_id)' failed at src/plugins/intel_gpu/src/plugin/plugin.cpp:287:
[GPU] Context was not initialized for 0 device


" thrown in the test body.
[  FAILED  ] IE_GPU.onnx_col2im_batch (7 ms)
[ RUN      ] IE_GPU.onnx_col2im_channel
unknown file: Failure
C++ exception with description "Exception from src/inference/src/cpp/core.cpp:116:
Exception from src/inference/src/dev/plugin.cpp:53:
Check 'contexts.count(device_id)' failed at src/plugins/intel_gpu/src/plugin/plugin.cpp:287:
[GPU] Context was not initialized for 0 device


" thrown in the test body.
[  FAILED  ] IE_GPU.onnx_col2im_channel (8 ms)
[----------] 6 tests from IE_GPU (74 ms total)

[----------] 6 tests from IE_CPU
[ RUN      ] IE_CPU.onnx_col2im_default
[   INFO   ] Verifying match of <= 22 mantissa bits (24 bits precision - 2 tolerance). 0 value(s) below min_signal: 0 Loosest match found is 24 mantissa bits.
[       OK ] IE_CPU.onnx_col2im_default (109 ms)
[ RUN      ] IE_CPU.onnx_col2im_dilations
[   INFO   ] Verifying match of <= 22 mantissa bits (24 bits precision - 2 tolerance). 0 value(s) below min_signal: 0 Loosest match found is 24 mantissa bits.
[       OK ] IE_CPU.onnx_col2im_dilations (106 ms)
[ RUN      ] IE_CPU.onnx_col2im_pads
[   INFO   ] Verifying match of <= 22 mantissa bits (24 bits precision - 2 tolerance). 0 value(s) below min_signal: 0 Loosest match found is 24 mantissa bits.
[       OK ] IE_CPU.onnx_col2im_pads (105 ms)
[ RUN      ] IE_CPU.onnx_col2im_strides
[   INFO   ] Verifying match of <= 22 mantissa bits (24 bits precision - 2 tolerance). 0 value(s) below min_signal: 0 Loosest match found is 24 mantissa bits.
[       OK ] IE_CPU.onnx_col2im_strides (105 ms)
[ RUN      ] IE_CPU.onnx_col2im_batch
[   INFO   ] Verifying match of <= 22 mantissa bits (24 bits precision - 2 tolerance). 0 value(s) below min_signal: 0 Loosest match found is 24 mantissa bits.
[       OK ] IE_CPU.onnx_col2im_batch (110 ms)
[ RUN      ] IE_CPU.onnx_col2im_channel
[   INFO   ] Verifying match of <= 22 mantissa bits (24 bits precision - 2 tolerance). 0 value(s) below min_signal: 0 Loosest match found is 24 mantissa bits.
[       OK ] IE_CPU.onnx_col2im_channel (107 ms)
[----------] 6 tests from IE_CPU (643 ms total)

[----------] Global test environment tear-down
[==========] 18 tests from 3 test suites ran. (1051 ms total)
[  PASSED  ] 12 tests.
[  FAILED  ] 6 tests, listed below:
[  FAILED  ] IE_GPU.onnx_col2im_default
[  FAILED  ] IE_GPU.onnx_col2im_dilations
[  FAILED  ] IE_GPU.onnx_col2im_pads
[  FAILED  ] IE_GPU.onnx_col2im_strides
[  FAILED  ] IE_GPU.onnx_col2im_batch
[  FAILED  ] IE_GPU.onnx_col2im_channel

 6 FAILED TESTS
```

</details>

---------

Co-authored-by: Maksim Kutakov <maksim.kutakov@intel.com>
@mlukasze mlukasze added this to the 2026.0 milestone Jan 16, 2026
@daehyun99
Contributor Author

Hi devs,
The Col2Im ONNX frontend unit tests passed in a Colab environment. (Please note that the IE_GPU tests were not run, as Colab provides no Intel GPU.)

I will fix the CI unit test and clang-format errors.

Thanks.

Details

!/content/openvino/bin/intel64/Debug/ov_onnx_frontend_tests --gtest_filter=*col2im*

Running main() from /content/openvino/src/frontends/tests/frontend/shared/gtest_main_manifest/main.cpp:20
Note: Google Test filter = *col2im*:-:ONNXLoadTest/FrontEndLoadFromTest.testLoadFromTwoFiles/onnx:ONNXLoadTest/FrontEndLoadFromTest.testLoadFromTwoStreams/onnx
[==========] Running 18 tests from 3 test suites.
[----------] Global test environment set-up.
[----------] 6 tests from INTERPRETER
[ RUN      ] INTERPRETER.onnx_col2im_default
[   INFO   ] Verifying match of <= 22 mantissa bits (24 bits precision - 2 tolerance). 0 value(s) below min_signal: 0 Loosest match found is 24 mantissa bits.
[       OK ] INTERPRETER.onnx_col2im_default (174 ms)
[ RUN      ] INTERPRETER.onnx_col2im_dilations
[   INFO   ] Verifying match of <= 22 mantissa bits (24 bits precision - 2 tolerance). 0 value(s) below min_signal: 0 Loosest match found is 24 mantissa bits.
[       OK ] INTERPRETER.onnx_col2im_dilations (48 ms)
[ RUN      ] INTERPRETER.onnx_col2im_pads
[   INFO   ] Verifying match of <= 22 mantissa bits (24 bits precision - 2 tolerance). 0 value(s) below min_signal: 0 Loosest match found is 24 mantissa bits.
[       OK ] INTERPRETER.onnx_col2im_pads (44 ms)
[ RUN      ] INTERPRETER.onnx_col2im_strides
[   INFO   ] Verifying match of <= 22 mantissa bits (24 bits precision - 2 tolerance). 0 value(s) below min_signal: 0 Loosest match found is 24 mantissa bits.
[       OK ] INTERPRETER.onnx_col2im_strides (42 ms)
[ RUN      ] INTERPRETER.onnx_col2im_batch
[   INFO   ] Verifying match of <= 22 mantissa bits (24 bits precision - 2 tolerance). 0 value(s) below min_signal: 0 Loosest match found is 24 mantissa bits.
[       OK ] INTERPRETER.onnx_col2im_batch (43 ms)
[ RUN      ] INTERPRETER.onnx_col2im_channel
[   INFO   ] Verifying match of <= 22 mantissa bits (24 bits precision - 2 tolerance). 0 value(s) below min_signal: 0 Loosest match found is 24 mantissa bits.
[       OK ] INTERPRETER.onnx_col2im_channel (43 ms)
[----------] 6 tests from INTERPRETER (395 ms total)

[----------] 6 tests from IE_GPU
[ RUN      ] IE_GPU.onnx_col2im_default
unknown file: Failure
C++ exception with description "Exception from src/inference/src/cpp/core.cpp:110:
Exception from src/inference/src/dev/plugin.cpp:53:
Check 'contexts.count(device_id)' failed at src/plugins/intel_gpu/src/plugin/plugin.cpp:287:
[GPU] Context was not initialized for 0 device


" thrown in the test body.
[  FAILED  ] IE_GPU.onnx_col2im_default (21 ms)
[ RUN      ] IE_GPU.onnx_col2im_dilations
unknown file: Failure
C++ exception with description "Exception from src/inference/src/cpp/core.cpp:110:
Exception from src/inference/src/dev/plugin.cpp:53:
Check 'contexts.count(device_id)' failed at src/plugins/intel_gpu/src/plugin/plugin.cpp:287:
[GPU] Context was not initialized for 0 device


" thrown in the test body.
[  FAILED  ] IE_GPU.onnx_col2im_dilations (7 ms)
[ RUN      ] IE_GPU.onnx_col2im_pads
unknown file: Failure
C++ exception with description "Exception from src/inference/src/cpp/core.cpp:110:
Exception from src/inference/src/dev/plugin.cpp:53:
Check 'contexts.count(device_id)' failed at src/plugins/intel_gpu/src/plugin/plugin.cpp:287:
[GPU] Context was not initialized for 0 device


" thrown in the test body.
[  FAILED  ] IE_GPU.onnx_col2im_pads (6 ms)
[ RUN      ] IE_GPU.onnx_col2im_strides
unknown file: Failure
C++ exception with description "Exception from src/inference/src/cpp/core.cpp:110:
Exception from src/inference/src/dev/plugin.cpp:53:
Check 'contexts.count(device_id)' failed at src/plugins/intel_gpu/src/plugin/plugin.cpp:287:
[GPU] Context was not initialized for 0 device


" thrown in the test body.
[  FAILED  ] IE_GPU.onnx_col2im_strides (6 ms)
[ RUN      ] IE_GPU.onnx_col2im_batch
unknown file: Failure
C++ exception with description "Exception from src/inference/src/cpp/core.cpp:110:
Exception from src/inference/src/dev/plugin.cpp:53:
Check 'contexts.count(device_id)' failed at src/plugins/intel_gpu/src/plugin/plugin.cpp:287:
[GPU] Context was not initialized for 0 device


" thrown in the test body.
[  FAILED  ] IE_GPU.onnx_col2im_batch (7 ms)
[ RUN      ] IE_GPU.onnx_col2im_channel
unknown file: Failure
C++ exception with description "Exception from src/inference/src/cpp/core.cpp:110:
Exception from src/inference/src/dev/plugin.cpp:53:
Check 'contexts.count(device_id)' failed at src/plugins/intel_gpu/src/plugin/plugin.cpp:287:
[GPU] Context was not initialized for 0 device


" thrown in the test body.
[  FAILED  ] IE_GPU.onnx_col2im_channel (6 ms)
[----------] 6 tests from IE_GPU (53 ms total)

[----------] 6 tests from IE_CPU
[ RUN      ] IE_CPU.onnx_col2im_default
[   INFO   ] Verifying match of <= 22 mantissa bits (24 bits precision - 2 tolerance). 0 value(s) below min_signal: 0 Loosest match found is 24 mantissa bits.
[       OK ] IE_CPU.onnx_col2im_default (249 ms)
[ RUN      ] IE_CPU.onnx_col2im_dilations
[   INFO   ] Verifying match of <= 22 mantissa bits (24 bits precision - 2 tolerance). 0 value(s) below min_signal: 0 Loosest match found is 24 mantissa bits.
[       OK ] IE_CPU.onnx_col2im_dilations (92 ms)
[ RUN      ] IE_CPU.onnx_col2im_pads
[   INFO   ] Verifying match of <= 22 mantissa bits (24 bits precision - 2 tolerance). 0 value(s) below min_signal: 0 Loosest match found is 24 mantissa bits.
[       OK ] IE_CPU.onnx_col2im_pads (94 ms)
[ RUN      ] IE_CPU.onnx_col2im_strides
[   INFO   ] Verifying match of <= 22 mantissa bits (24 bits precision - 2 tolerance). 0 value(s) below min_signal: 0 Loosest match found is 24 mantissa bits.
[       OK ] IE_CPU.onnx_col2im_strides (88 ms)
[ RUN      ] IE_CPU.onnx_col2im_batch
[   INFO   ] Verifying match of <= 22 mantissa bits (24 bits precision - 2 tolerance). 0 value(s) below min_signal: 0 Loosest match found is 24 mantissa bits.
[       OK ] IE_CPU.onnx_col2im_batch (85 ms)
[ RUN      ] IE_CPU.onnx_col2im_channel
[   INFO   ] Verifying match of <= 22 mantissa bits (24 bits precision - 2 tolerance). 0 value(s) below min_signal: 0 Loosest match found is 24 mantissa bits.
[       OK ] IE_CPU.onnx_col2im_channel (89 ms)
[----------] 6 tests from IE_CPU (698 ms total)

[----------] Global test environment tear-down
[==========] 18 tests from 3 test suites ran. (1146 ms total)
[  PASSED  ] 12 tests.
[  FAILED  ] 6 tests, listed below:
[  FAILED  ] IE_GPU.onnx_col2im_default
[  FAILED  ] IE_GPU.onnx_col2im_dilations
[  FAILED  ] IE_GPU.onnx_col2im_pads
[  FAILED  ] IE_GPU.onnx_col2im_strides
[  FAILED  ] IE_GPU.onnx_col2im_batch
[  FAILED  ] IE_GPU.onnx_col2im_channel

 6 FAILED TESTS

@daehyun99 daehyun99 requested a review from a team as a code owner January 18, 2026 09:12
@daehyun99 daehyun99 requested review from tsavina and removed request for a team January 18, 2026 09:12
@github-actions github-actions bot added the category: docs OpenVINO documentation label Jan 18, 2026
@daehyun99
Contributor Author

Hi, devs.

I have applied clang-format and fixed the unit test errors.

Please review it at your convenience.

Thanks.

  • Full test log summary (the complete log is too long to include)

!/content/openvino/bin/intel64/Debug/ov_onnx_frontend_tests --gtest_filter=-*IE_GPU*

[----------] Global test environment tear-down
[==========] 1754 tests from 27 test suites ran. (130678 ms total)
[  PASSED  ] 1749 tests.
[  SKIPPED ] 5 tests, listed below:
[  SKIPPED ] INTERPRETER.onnx_model_reduce_max_18
[  SKIPPED ] INTERPRETER.onnx_model_reduce_prod_18
[  SKIPPED ] INTERPRETER.onnx_model_reduce_min_18
[  SKIPPED ] INTERPRETER.onnx_model_reduce_min_20_boolean
[  SKIPPED ] ONNX/FrontendLibCloseTest.testUnloadLibBeforeDeletingDependentObject/onnx

@mlukasze mlukasze modified the milestones: 2026.0, 2026.1 Jan 28, 2026
@mlukasze mlukasze requested a review from bumbosiepsak January 28, 2026 07:58
Naseer-010 pushed a commit to Naseer-010/openvino that referenced this pull request Feb 18, 2026
…nvinotoolkit#33473)

(Commit message and test logs are identical to the commit quoted above.)
@mlukasze mlukasze requested a review from Copilot February 25, 2026 08:58
Contributor

Copilot AI left a comment

Pull request overview

Extends the OpenVINO ONNX Frontend to support the ONNX Col2Im operator and updates the ONNX FE test suite to validate the new operator support.

Changes:

  • Added ONNX FE translator implementation for Col2Im (opset 18).
  • Updated Python backend expected-fail lists to stop xfail’ing Col2Im tests and removed the obsolete xfail marker.
  • Added multiple C++ importer tests and several prototxt models covering Col2Im attribute/config variations.

Reviewed changes

Copilot reviewed 11 out of 11 changed files in this pull request and generated 5 comments.

Summary per file:

  • src/frontends/onnx/tests/tests_python/test_backend.py — Removes Col2Im from the expected-fail list and updates a different xfail mapping.
  • src/frontends/onnx/tests/onnx_import.in.cpp — Adds importer runtime tests for Col2Im across default/dilations/pads/strides/batch/channel cases.
  • src/frontends/onnx/tests/models/col2im_2D_*_opset_18.prototxt — Adds new Col2Im prototxt models used by importer tests.
  • src/frontends/onnx/tests/__init__.py — Removes the now-obsolete xfail_issue_99952 marker for unsupported Col2Im.
  • src/frontends/onnx/frontend/src/op/col2im.cpp — Introduces the ONNX FE Col2Im operator mapping.
  • src/frontends/onnx/docs/supported_ops.md — Marks Col2Im as supported in opset 18.

@daehyun99
Contributor Author

Unlike ONNX, whose Col2Im supports multi-dimensional image data, OpenVINO's Col2Im is restricted to 2D images (specifically, output_size and kernel_size must each contain exactly two elements).

Consequently, the following test imported from ONNX fails because of this difference in operator specifications: it is expected to pass against ONNX, but it correctly fails in OpenVINO.
What would be the best way to handle this discrepancy?
Should I disable the test again?

Thanks.
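The rejection can be illustrated with a small hypothetical helper that mirrors the shape validation which rejects the 5D model; check_col2im_inputs is my own name and not taken from the OpenVINO source:

```python
def check_col2im_inputs(output_size_shape, kernel_size_shape):
    """Mimic the Col2Im rank/size validation for the 2D-only case."""
    for name, shape in (("output_size", output_size_shape),
                        ("kernel_size", kernel_size_shape)):
        # Both shape inputs must be 1D tensors with exactly two
        # elements, e.g. [H, W] for output_size.
        if len(shape) != 1 or shape[0] != 2:
            raise ValueError(
                f"{name} must be a 1D input of shape [2]. Got: {list(shape)}")
```

The test_col2im_5d model feeds image_shape/block_shape tensors of shape [3], which trips exactly this kind of check, so the test cannot pass without extending the core operator beyond 2D.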

Test Failed Log
=================================== FAILURES ===================================
_________________ OnnxBackendNodeModelTest.test_col2im_5d_cpu __________________

test_self = <tests.tests_python.test_backend.OnnxBackendNodeModelTest testMethod=test_col2im_5d_cpu>
device = 'CPU', kwargs = {}
model_pb_path = '/venv/lib/python3.11/site-packages/onnx/backend/test/data/node/test_col2im_5d/model.onnx'
model_dir = '/venv/lib/python3.11/site-packages/onnx/backend/test/data/node/test_col2im_5d'
use_dummy = False
model = ir_version: 8
producer_name: "backend-test"
graph {
  node {
    input: "input"
    input: "image_shape"
    input: "b...  dim {
            dim_value: 5
          }
        }
      }
    }
  }
}
opset_import {
  domain: ""
  version: 18
}


    def run(test_self: Any, device: str, **kwargs) -> None:  # noqa: ARG001
        if model_test.url is not None and model_test.url.startswith(
            "onnx/backend/test/data/light/"
        ):
            # testing local files
            model_pb_path = os.path.normpath(
                os.path.join(
                    os.path.dirname(__file__),
                    "..",
                    "..",
                    "..",
                    "..",
                    model_test.url,
                )
            )
            if not os.path.exists(model_pb_path):
                raise FileNotFoundError(f"Unable to find model {model_pb_path!r}.")
            onnx_home = os.path.expanduser(
                os.getenv("ONNX_HOME", os.path.join("~", ".onnx"))
            )
            models_dir = os.getenv(
                "ONNX_MODELS", os.path.join(onnx_home, "models", "light")
            )
            model_dir: str = os.path.join(models_dir, model_test.model_name)
            if not os.path.exists(model_dir):
                os.makedirs(model_dir)
            use_dummy = True
        else:
            if model_test.model_dir is None:
                model_dir = self.prepare_model_data(model_test)
            else:
                model_dir = model_test.model_dir
            model_pb_path = os.path.join(model_dir, "model.onnx")
            use_dummy = False
    
        if not ONNX_ML and "ai_onnx_ml" in model_dir:
            return
    
        model = onnx.load(model_pb_path)
        model_marker[0] = model
        if (
            hasattr(self.backend, "is_compatible")
            and callable(self.backend.is_compatible)
>           and not self.backend.is_compatible(model)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        ):

/venv/lib/python3.11/site-packages/onnx/backend/test/runner/__init__.py:391: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
install/tests/onnx/tests/tests_python/utils/onnx_backend.py:129: in is_compatible
    import_onnx_model(model)
install/tests/onnx/tests/tests_python/utils/onnx_helpers.py:15: in import_onnx_model
    model = core.read_model(bytes(model_byte_string), Tensor(type=np.uint8, shape=[]))
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <Core: available plugins[CPU]>
model = b'\x08\x08\x12\x0cbackend-test:\xbc\x01\n1\n\x05input\n\x0bimage_shape\n\x0bblock_shape\x12\x06output"\x06Col2Im\x12\x...x12\x1a\n\x18\x08\x01\x12\x14\n\x02\x08\x01\n\x02\x08\x02\n\x02\x08\x03\n\x02\x08\x04\n\x02\x08\x05B\x04\n\x00\x10\x12'
weights = <Tensor: shape[] type: u8>, config = {}

    def read_model(
        self,
        model: Union[str, bytes, object, io.BytesIO],
        weights: Union[object, str, bytes, Tensor, io.BytesIO] = None,
        config: Optional[dict[str, Any]] = None
    ) -> Model:
        config = {} if config is None else config
        if isinstance(model, Model):
            model = model._Model__model
    
        if isinstance(weights, Tensor):
>           return Model(super().read_model(model, weights))
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E           RuntimeError: Exception from src/inference/src/cpp/core.cpp:98:
E           Check 'false' failed at src/frontends/common_translators/src/unconverted_ops_report.cpp:141:
E           FrontEnd API failed with OpConversionFailure:
E           Model wasn't fully converted. Failed operations detailed log:
E           -- Col2Im-18 with a message:
E           While validating ONNX node '<Node(Col2Im): output>': Check 'is_two_elem_1d(output_size_shape)' failed at src/core/shape_inference/include/col2im_shape_inference.hpp:36:
E           While validating node 'opset15::Col2Im Col2Im_1595788 (opset1::Parameter input[0]:f32[1,10,12], opset1::Parameter image_shape[0]:i64[3], opset1::Parameter block_shape[0]:i64[3]) -> (dynamic[...])' with friendly_name 'Col2Im_1595788':
E           Shape inference input shapes {[1,10,12],[3],[3]}
E           output_size must be a 1D input of shape [2]. Got: [3]
E           
E           Summary:
E           -- Conversion is failed for: Col2Im-18

/venv/lib/python3.11/site-packages/openvino/_ov_api.py:597: RuntimeError
_____________ OnnxBackendNodeModelTest.test_constant_pad_axes_cpu ______________
[XPASS(strict)] Constant Pad - RuntimeError: Shape inference of Reference node with name y failed
=============================== warnings summary ===============================
install/tests/onnx/tests/tests_python/test_backend.py::OnnxBackendNodeModelTest::test_cast_FLOAT16_to_FLOAT8E4M3FN_cpu
install/tests/onnx/tests/tests_python/test_backend.py::OnnxBackendNodeModelTest::test_cast_FLOAT_to_FLOAT8E4M3FN_cpu
install/tests/onnx/tests/tests_python/test_backend.py::OnnxBackendNodeModelTest::test_castlike_FLOAT_to_FLOAT8E4M3FN_cpu
install/tests/onnx/tests/tests_python/test_backend.py::OnnxBackendNodeModelTest::test_castlike_FLOAT_to_FLOAT8E4M3FN_expanded_cpu
  /__w/openvino/openvino/install/tests/onnx/tests/runtime.py:101: DeprecationWarning: Deprecated since 1.18. Scheduled to remove in 1.20. Consider using libraries like ml_dtypes for dtype conversion
    converted_buffers.append(float8e4m3_to_float32(np.frombuffer(data_f8, dtype=np.uint8), fn=True, uz=False).reshape(source_buffers[key].shape).view(target_dtype))

install/tests/onnx/tests/tests_python/test_backend.py::OnnxBackendNodeModelTest::test_cast_FLOAT16_to_FLOAT8E5M2_cpu
install/tests/onnx/tests/tests_python/test_backend.py::OnnxBackendNodeModelTest::test_cast_FLOAT_to_FLOAT8E5M2_cpu
install/tests/onnx/tests/tests_python/test_backend.py::OnnxBackendNodeModelTest::test_castlike_FLOAT_to_FLOAT8E5M2_cpu
install/tests/onnx/tests/tests_python/test_backend.py::OnnxBackendNodeModelTest::test_castlike_FLOAT_to_FLOAT8E5M2_expanded_cpu
  /__w/openvino/openvino/install/tests/onnx/tests/runtime.py:98: DeprecationWarning: Deprecated since 1.18. Scheduled to remove in 1.20. Consider using libraries like ml_dtypes for dtype conversion
    converted_buffers.append(float8e5m2_to_float32(np.frombuffer(data_f8, dtype=np.uint8), fn=False, uz=False).reshape(source_buffers[key].shape).view(target_dtype))

install/tests/onnx/tests/tests_python/test_backend.py: 94 warnings
  /venv/lib/python3.11/site-packages/numpy/lib/_function_base_impl.py:2458: DeprecationWarning: Deprecated since 1.18. Scheduled to remove in 1.20. Consider using libraries like ml_dtypes for dtype conversion
    return self.pyfunc(*the_args, **kwargs)

install/tests/onnx/tests/tests_python/test_frontend_onnx.py::test_convert
install/tests/onnx/tests/tests_python/test_frontend_onnx.py::test_convert
install/tests/onnx/tests/tests_python/test_frontend_onnx.py::test_convert
  /venv/lib/python3.11/site-packages/onnx/helper.py:171: DeprecationWarning: Field onnx.AttributeProto.ints: Expected an int, got a boolean. This will be rejected in 7.34.0, please fix it before that
    node.attribute.extend(

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: /__w/openvino/openvino/install/tests/TEST-onnx_frontend.graph_iterator_enabled.xml -
=========================== short test summary info ============================
FAILED install/tests/onnx/tests/tests_python/test_backend.py::OnnxBackendNodeModelTest::test_col2im_5d_cpu - RuntimeError: Exception from src/inference/src/cpp/core.cpp:98:
Check 'false' failed at src/frontends/common_translators/src/unconverted_ops_report.cpp:141:
FrontEnd API failed with OpConversionFailure:
Model wasn't fully converted. Failed operations detailed log:
-- Col2Im-18 with a message:
While validating ONNX node '<Node(Col2Im): output>': Check 'is_two_elem_1d(output_size_shape)' failed at src/core/shape_inference/include/col2im_shape_inference.hpp:36:
While validating node 'opset15::Col2Im Col2Im_1595788 (opset1::Parameter input[0]:f32[1,10,12], opset1::Parameter image_shape[0]:i64[3], opset1::Parameter block_shape[0]:i64[3]) -> (dynamic[...])' with friendly_name 'Col2Im_1595788':
Shape inference input shapes {[1,10,12],[3],[3]}
output_size must be a 1D input of shape [2]. Got: [3]

Summary:
-- Conversion is failed for: Col2Im-18
FAILED install/tests/onnx/tests/tests_python/test_backend.py::OnnxBackendNodeModelTest::test_constant_pad_axes_cpu - [XPASS(strict)] Constant Pad - RuntimeError: Shape inference of Reference node with name y failed
= 2 failed, 1465 passed, 240 skipped, 1599 deselected, 315 xfailed, 105 warnings in 51.78s =
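For context on the failing check: ONNX `Col2Im` is defined for N-dimensional blocks, while the OpenVINO `opset15::Col2Im` shape inference above only accepts a 2D case (`output_size` of shape `[2]`), which is why the 5D test model (with `image_shape` of shape `[3]`) is rejected. The sketch below is a minimal NumPy reference of the 2D case only, assuming default attributes (stride 1, dilation 1, no padding); the function name `col2im_2d` is hypothetical and not part of either codebase.

```python
import numpy as np

def col2im_2d(cols, image_shape, block_shape):
    # Hypothetical reference for ONNX Col2Im, 2D blocks only,
    # with default stride=1, dilation=1, pads=0.
    # cols: (N, C * kh * kw, L) column buffer, as produced by Im2Col.
    # Overlapping patch contributions are summed, per ONNX semantics.
    n, ckk, l = cols.shape
    kh, kw = block_shape
    h, w = image_shape
    c = ckk // (kh * kw)
    out_h, out_w = h - kh + 1, w - kw + 1
    assert l == out_h * out_w, "L must equal the number of sliding positions"
    image = np.zeros((n, c, h, w), dtype=cols.dtype)
    # Split the column axis back into (C, kh, kw) and scatter-add each
    # in-block offset (i, j) onto its shifted window of the output image.
    patches = cols.reshape(n, c, kh, kw, out_h, out_w)
    for i in range(kh):
        for j in range(kw):
            image[:, :, i:i + out_h, j:j + out_w] += patches[:, :, i, j]
    return image

# An all-ones column buffer recovers the patch-coverage count per pixel:
cols = np.ones((1, 1 * 2 * 2, 4), dtype=np.float32)  # 3x3 image, 2x2 block
img = col2im_2d(cols, (3, 3), (2, 2))
```

With this layout, the `[2]`-shaped `image_shape`/`block_shape` inputs map directly onto the `(h, w)` and `(kh, kw)` unpacking above; supporting the 5D test would require generalizing that unpacking to N spatial dimensions.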

@daehyun99
Copy link
Contributor Author

daehyun99 commented Feb 28, 2026

I have also applied the copyright rule.

Please review this when you have a moment.

Thanks.


Successfully merging this pull request may close these issues.

[Good First Issue]: Extend ONNX FE with Col2Im operator
