Inference speed/requirements? #44
DavidNTompkins started this conversation in General
Replies: 2 comments 2 replies
There is another repo by Fenrir with the iSTFT/MB stuff. I'd like to add it to this repo, or make a separate extension, if it turns out to be good enough and worth the effort.
My colleague @DavidSamuell has conducted inference speed tests on CPU, GPU, ONNX CPU, and ONNX GPU. You can find the full results here.
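For anyone running similar measurements, the usual metric is the real-time factor (RTF): wall-clock synthesis time divided by the duration of the generated audio, so "15× real-time" corresponds to RTF ≈ 0.067. A minimal sketch, assuming a hypothetical `synthesize` callable that returns raw audio samples (not this repo's actual API):

```python
import time

def rtf(wall_seconds: float, num_samples: int, sample_rate: int) -> float:
    """Real-time factor: synthesis wall time / generated audio duration.
    RTF < 1 means faster than real time; speedup = 1 / RTF."""
    return wall_seconds / (num_samples / sample_rate)

def benchmark(synthesize, text: str, sample_rate: int = 22050):
    """Time one synthesis call and return (rtf, speedup).

    `synthesize` is a stand-in for whatever TTS entry point you use
    (PyTorch, ONNX Runtime, etc.); it just needs to return a 1-D
    sequence of samples at `sample_rate`.
    """
    start = time.perf_counter()
    audio = synthesize(text)
    wall = time.perf_counter() - start
    factor = rtf(wall, len(audio), sample_rate)
    return factor, 1.0 / factor
```

For example, a 15-second clip synthesized in 1 second of wall time gives `rtf(1.0, 15 * 22050, 22050)` ≈ 0.067, i.e. 15× real-time.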
Amazing project! I'm starting a training run now and was wondering if there are more details on inference speed on CPU, or minimum requirements to run inference? I saw that it's lightning fast on a V100 in the paper - any idea how it'll perform without a GPU?
Ideally, I'd like to compare it to MB-iSTFT-VITS, which claims ~15x real-time on an i7 CPU (https://github.com/MasayaKawamura/MB-iSTFT-VITS).
Thanks either way!