Issues: onnx/tensorflow-onnx

Exporting a specific function from a TFLite model to ONNX, instead of just the "default" one (labels: enhancement, tflite)
#2066 opened Oct 20, 2022 by josephrocca

Support for multiple signatures in exported models (labels: contribution welcome, enhancement, tflite)
#1837 opened Feb 1, 2022 by leandro-gracia-gil

[BUG] Different inference results between a tflite model and the onnx model converted from it (labels: potential bug, tflite)
#1819 opened Jan 10, 2022 by zhuxiaoxuhit

Transpose optimization for hardswish in tensorflow-lite (labels: need investigation, tflite)
#1816 opened Jan 4, 2022 by Xiadalei
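
All four issues concern tf2onnx's TFLite frontend. For context, a minimal conversion sketch using the tf2onnx Python API is shown below; it assumes the documented tf2onnx.convert.from_tflite entry point, and the file paths and opset value are placeholders rather than values taken from the issues.

    # Minimal sketch: convert a TFLite flatbuffer to ONNX with the tf2onnx Python API.
    # "model.tflite", "model.onnx", and opset 13 are illustrative placeholders.
    import tf2onnx

    # from_tflite parses the TFLite model and returns an ONNX ModelProto
    # (plus external tensor storage for large models). Per #2066 and #1837,
    # only the model's default function/signature is currently exported.
    model_proto, _ = tf2onnx.convert.from_tflite(
        "model.tflite",
        opset=13,
        output_path="model.onnx",  # also writes the converted model to disk
    )

    # Inspect the converted graph's output names as a quick sanity check.
    print([out.name for out in model_proto.graph.output])

The tf2onnx README documents an equivalent command-line form, python -m tf2onnx.convert --tflite model.tflite --output model.onnx --opset 13, which is a common starting point when reproducing conversion or accuracy issues like #1819.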