
Fix ModelWrapper method to preserve order of graph inputs and outputs #186

Merged
maltanar merged 3 commits into fastmachinelearning:main from auphelia:fix/io_tensor_order
Jun 6, 2025

Conversation

@auphelia
Collaborator

This PR fixes an issue with set_tensor_shape and the order of input and output tensors in the ONNX GraphProto. When setting the shape of these tensors, they used to be removed and then re-appended, which could change their order. This wasn't a problem for internal tensors (ValueInfo is unordered), but it could reorder graph.input or graph.output.

In this PR, the code now fetches the current index of the tensor in graph.input or graph.output. It then inserts the updated tensor with its new shape back into the same spot, keeping the original order intact.

@maltanar
Collaborator

maltanar commented Jun 6, 2025

Thanks for catching and fixing this @auphelia ! I took the liberty of adding a unit test that triggers the problem addressed by this fix.

@maltanar maltanar merged commit 0630cea into fastmachinelearning:main Jun 6, 2025
2 of 3 checks passed