Batch inference not faster than single inference #117

Description

@octaAIteam

When running inference with XFeat with batch sizes larger than 1, total runtime scales linearly with batch size instead of yielding the expected per-image speedup from batching.
For example, a batch of 9 takes ~9× as long as a single image, so batching is effectively no faster than running the images one at a time. This makes batching ineffective.
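A minimal timing sketch of the comparison is below. It assumes the torch.hub entry point (`verlab/accelerated_features`, `XFeat`) and the `detectAndCompute(x, top_k=...)` call as described in the project README; the input shape, device handling, and warm-up pass are illustrative and may need adapting to the actual setup.

```python
# Timing sketch: one batched forward pass vs. a loop of single-image passes.
# Assumptions: hub entrypoint and detectAndCompute() signature per the README;
# dummy 640x480 inputs stand in for real images.
import time
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
xfeat = torch.hub.load("verlab/accelerated_features", "XFeat",
                       pretrained=True, top_k=4096).to(device)

imgs = torch.rand(9, 3, 480, 640, device=device)  # dummy batch of 9 images

def timed(fn):
    """Wall-clock time of fn(), synchronizing the GPU around the call."""
    if device == "cuda":
        torch.cuda.synchronize()
    t0 = time.perf_counter()
    fn()
    if device == "cuda":
        torch.cuda.synchronize()
    return time.perf_counter() - t0

with torch.inference_mode():
    timed(lambda: xfeat.detectAndCompute(imgs, top_k=4096))  # warm-up
    t_batch = timed(lambda: xfeat.detectAndCompute(imgs, top_k=4096))
    t_loop = timed(lambda: [xfeat.detectAndCompute(im[None], top_k=4096)
                            for im in imgs])

print(f"batch of 9: {t_batch:.3f}s | 9x single: {t_loop:.3f}s")
```

With proper CUDA synchronization and a warm-up pass, `t_batch` is expected to come out well below `t_loop`; in practice the two are roughly equal.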
