Does the official script of fish-speech 1.5 support batch_size > 1 during the inference stage? #941
Unanswered · Ht-zhang-xianyu asked this question in Q&A
Replies: 0 comments
Hi,
I followed the guide in fish-speech/docs/zh/inference.md to run inference with the official model. Does the inference script support batch_size > 1 during the inference stage? If so, what changes should I make to the inference code?
Thank you!
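For context on what a batch_size > 1 path generally requires, here is a minimal, hypothetical sketch of padding variable-length token prompts into a batch with an attention mask. This is not fish-speech's actual API; `PAD_ID` and `pad_batch` are illustrative names, and the padding/mask logic shown is the generic prerequisite for batched autoregressive inference.

```python
# Hypothetical sketch: batching variable-length token prompts.
# Not fish-speech's actual API; the pad token id and function
# name are assumptions for illustration only.

PAD_ID = 0  # assumed pad token id


def pad_batch(prompts):
    """Left-pad token lists to equal length and build an attention mask.

    Left-padding keeps every prompt's last real token at the end of
    its row, so generated tokens can be appended uniformly.
    """
    max_len = max(len(p) for p in prompts)
    batch, mask = [], []
    for p in prompts:
        pad = [PAD_ID] * (max_len - len(p))
        batch.append(pad + p)                      # padded token ids
        mask.append([0] * len(pad) + [1] * len(p))  # 1 = real token
    return batch, mask


tokens, mask = pad_batch([[5, 6, 7], [9, 9]])
print(tokens)  # [[5, 6, 7], [0, 9, 9]]
print(mask)    # [[1, 1, 1], [0, 1, 1]]
```

In practice the model's sampling loop must also respect the mask (e.g. ignore padded positions in attention), which is usually the part that needs changes when a script was written for batch_size = 1.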