Look into ONNX format for model inference #18

@stefanpantic

Description

TensorFlow model inference can be quite slow. ONNX is a format for storing and running ML models that is fast and easy to consume from different programming languages (e.g. C++). Look into it and provide a proof of concept for a conversion function from TensorFlow frozen graphs to ONNX graphs.
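A minimal PoC could lean on the community `tf2onnx` converter instead of hand-rolling the graph translation. The sketch below is an assumption about how the PoC might look; `frozen_graph.pb` and the `input:0`/`output:0` tensor names are placeholders that must be replaced with the actual file and tensor names of the model being converted.

```shell
# Install the converter and a runtime to validate the result
# (assumption: the PoC is driven from a Python environment).
pip install tf2onnx onnxruntime

# Convert a TensorFlow frozen graph to ONNX.
# --inputs/--outputs take the graph's tensor names; the ones
# below are placeholders, not real names from our models.
python -m tf2onnx.convert \
    --input frozen_graph.pb \
    --inputs input:0 \
    --outputs output:0 \
    --opset 13 \
    --output model.onnx
```

The resulting `model.onnx` can then be loaded from C++ via the ONNX Runtime C/C++ API, which is one way to get the cross-language inference mentioned above.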

Metadata

Assignees

No one assigned

    Labels

    enhancement (New feature or request), question (Further information is requested)

    Projects

    No projects

    Milestone

    No milestone
