Describe the bug
I am trying to convert a TF-trained 3DUNet. Below is the head and the tail of the model's summary:
```
Layer (type)                 Output Shape                 Param #    Connected to
====================================================================================================
input_1 (InputLayer)         [(None, 100, 100, 100, 1)]  0          []
...
activation (Activation)      (None, 100, 100, 100, 1)    0          ['conv3d_18[0][0]']
```
Conversion command:

```
python -m tf2onnx.convert --saved-model /data/model.saved_model --output /data/model.onnx --tag serve --signature_def serving_default --opset 18 --inputs-as-nchw input_1 --outputs-as-nchw activation
```
`model.onnx` is written, but I'm getting the following two warnings in the output:

```
WARNING - transpose_input for input_1: shape must be rank 4, ignored
WARNING - transpose_output for activation: shape must be rank 4, ignored
```
The NCHW arguments indeed don't seem to have any effect. When I run the converted model in PyTorch on a `(1, 1, 100, 100, 100)` input, the output shape is `(1, 100, 100, 100, 1)`. It's unclear to me why the input is accepted in NCHW format despite the warnings above. What I expect is for the converted model to take `(batch_size, 1, 100, 100, 100)` inputs and produce `(batch_size, 1, 100, 100, 100)` outputs.
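For now I can work around it by permuting the axes outside the model. A minimal numpy sketch of the two layout changes I expected the converter to insert (NCDHW to NDHWC on the way in, NDHWC back to NCDHW on the way out):

```python
import numpy as np

# NCDHW input, the layout I'd like the converted model to accept
x_ncdhw = np.zeros((1, 1, 100, 100, 100), dtype=np.float32)

# NCDHW -> NDHWC: the layout the TF-trained model actually consumes
x_ndhwc = np.transpose(x_ncdhw, (0, 2, 3, 4, 1))
assert x_ndhwc.shape == (1, 100, 100, 100, 1)

# NDHWC -> NCDHW: converting the model's output back for the consumer
y_ncdhw = np.transpose(x_ndhwc, (0, 4, 1, 2, 3))
assert y_ncdhw.shape == (1, 1, 100, 100, 100)
```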
System information
- OS Platform and Distribution (e.g., Linux Ubuntu 18.04): Ubuntu 20.04
- TensorFlow Version: 2.13.1
- Python version: 3.11.9
- ONNX version (if applicable, e.g. 1.11): 1.16.2
- ONNXRuntime version (if applicable, e.g. 1.11):
To reproduce
I am new to tf2onnx, so I would first like to establish whether this is the intended behaviour. If not, I'll try to create a MWE.