Bug fix for multi batch size and ONNX export support for the YOLO-World-S model #296
Open
wufei-png wants to merge 2 commits into AILab-CVC:master from
Conversation
fix: txt_feats should repeat batch_size times:

When exporting the ONNX model with batch size > 1, the `img_feats` shape is batched accordingly; however, `txt_feats` is always `torch.Size([1, 80, 512])`. It should be `torch.Size([batch_size, 80, 512])` to avoid the export error like this:

chore: support onnx export:
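The first commit's fix (tiling `txt_feats` to the image batch size) can be sketched as follows; the helper name and shapes are illustrative, not the PR's actual code:

```python
import torch

def align_text_feats(img_feats: torch.Tensor, txt_feats: torch.Tensor) -> torch.Tensor:
    """Tile txt_feats along the batch dim so it matches img_feats.

    txt_feats: [1, num_classes, embed_dim], e.g. [1, 80, 512]
    img_feats: [batch_size, ...]
    """
    batch_size = img_feats.shape[0]
    if txt_feats.shape[0] != batch_size:
        # repeat produces [batch_size, num_classes, embed_dim]
        txt_feats = txt_feats.repeat(batch_size, 1, 1)
    return txt_feats

img = torch.randn(4, 256, 20, 20)
txt = torch.randn(1, 80, 512)
print(align_text_feats(img, txt).shape)  # torch.Size([4, 80, 512])
```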
I used the YOLO-World-S model to export to the ONNX format and got the same error as in this issue. I referenced the solution in that issue and added max- and avg-pooling layers that are compatible with ONNX export.
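A sketch of that kind of workaround, assuming a fixed export resolution so kernel and stride can be computed statically (`onnx_friendly_pool` is a hypothetical helper, not the PR's actual code):

```python
import torch
import torch.nn as nn

def onnx_friendly_pool(in_hw, out_hw, mode="max"):
    """Fixed-kernel pooling that matches adaptive pooling when the
    input size is an exact multiple of the output size."""
    stride = (in_hw[0] // out_hw[0], in_hw[1] // out_hw[1])
    kernel = (in_hw[0] - (out_hw[0] - 1) * stride[0],
              in_hw[1] - (out_hw[1] - 1) * stride[1])
    pool_cls = nn.MaxPool2d if mode == "max" else nn.AvgPool2d
    return pool_cls(kernel_size=kernel, stride=stride)

x = torch.randn(1, 256, 20, 20)
fixed = onnx_friendly_pool((20, 20), (4, 4), mode="max")
adaptive = nn.AdaptiveMaxPool2d((4, 4))
print(torch.equal(fixed(x), adaptive(x)))  # True when sizes divide evenly
```

Unlike adaptive pooling, the fixed-kernel version exports cleanly because its kernel size is a constant baked into the graph.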
The ONNX export had two compatibility problems: einsum and the pooling layer. I did not want to add a new bool variable (since other compatibility problems may appear in the future), so I replaced the bool `use_einsum` with `export_onnx`.
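The einsum replacement can be sketched like this, assuming a similarity einsum of the form `'bchw,bkc->bkhw'` (image features against text embeddings); rewriting it as reshape + matmul is handled more reliably by ONNX exporters and runtimes:

```python
import torch

def similarity_einsum(img_feats, txt_feats):
    # img_feats: [B, C, H, W], txt_feats: [B, K, C]
    return torch.einsum('bchw,bkc->bkhw', img_feats, txt_feats)

def similarity_matmul(img_feats, txt_feats):
    # Equivalent export-friendly path: [B, K, C] @ [B, C, H*W] -> [B, K, H*W]
    b, c, h, w = img_feats.shape
    out = torch.matmul(txt_feats, img_feats.reshape(b, c, h * w))
    return out.reshape(b, txt_feats.shape[1], h, w)

img = torch.randn(2, 512, 20, 20)
txt = torch.randn(2, 80, 512)
print(torch.allclose(similarity_einsum(img, txt),
                     similarity_matmul(img, txt), atol=1e-5))  # True
```

In the PR, the choice between the two paths is gated by the `export_onnx` flag described above.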
The code has been tested in my local environment.
------------- comment at 5.18:

@wondervictor
I noticed that this PR had a small conflict around code formatting after the recent main branch update. It has now been resolved, and the PR is ready to merge.