Batch inference can greatly improve inference speed. Is there any plan to support it?

Reply: Not planned right now, as this is a latency-optimized, single-user implementation.
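For readers wondering why batching helps: a single forward pass over a stacked batch amortizes per-request overhead (kernel launches, data movement) across many inputs, at the cost of extra latency for the first request in the batch, which is why a latency-optimized single-user server may skip it. Below is a minimal, illustrative PyTorch sketch; the `nn.Linear` stand-in model and all shapes are hypothetical, not this project's API.

```python
# Illustrative sketch of batched vs. per-request inference.
# The model here is a hypothetical stand-in, not this repository's code.
import torch
import torch.nn as nn

model = nn.Linear(128, 10)  # stand-in for a real model
model.eval()

# Per-request inference: one forward pass per input (latency-optimized path).
single = torch.randn(1, 128)
with torch.no_grad():
    out_single = model(single)  # shape (1, 10)

# Batched inference: stack N pending requests into one tensor so a single
# forward pass amortizes kernel-launch and data-movement overhead.
requests = [torch.randn(128) for _ in range(32)]
batch = torch.stack(requests)   # shape (32, 128)
with torch.no_grad():
    out_batch = model(batch)    # shape (32, 10), one pass for all requests
```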