Replies: 1 comment
It's been a little while since I trained these models, but I believe this is to be expected. There are three different ESRGAN models - slim, medium, and thick - each with different hyperparameters that affect the size (and performance) of the resulting model. Within a given model, however, the scale factor doesn't have a big effect: scale (I believe) only affects the final output layer, so all scales of the same model should have roughly the same inference time. If you compare two different models (e.g., slim vs. thick) you should see faster inference for slim. With all that said, please feel free to open up a CodePen or similar and I'd be happy to take a deeper look.
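For reference, here is a minimal sketch of how such a head-to-head timing could be done in the browser with TensorFlow.js. The model URLs, the `source-image` element id, and the preprocessing (normalizing to 0–1) are placeholders/assumptions, not the actual setup from this thread - swap in whatever models and input pipeline you are actually using:

```js
import * as tf from '@tensorflow/tfjs';

// Placeholder URLs: point these at the model.json files for the
// slim and thick ESRGAN variants you want to compare.
const MODEL_URLS = {
  slim: 'https://example.com/esrgan-slim/2x/model.json',
  thick: 'https://example.com/esrgan-thick/2x/model.json',
};

async function timeModel(name, url, img) {
  const model = await tf.loadGraphModel(url);

  // Placeholder preprocessing: HWC pixels -> NHWC float in [0, 1].
  const input = tf.tidy(() =>
    tf.browser.fromPixels(img).expandDims(0).toFloat().div(255)
  );

  // Warm-up run so one-time setup cost (e.g. WebGL shader compilation)
  // isn't counted against the model.
  tf.dispose(model.predict(input));

  const start = performance.now();
  const output = model.predict(input);
  await output.data(); // force execution to finish before stopping the clock
  console.log(`${name}: ${(performance.now() - start).toFixed(0)} ms`);

  tf.dispose([input, output]);
}

(async () => {
  const img = document.getElementById('source-image'); // placeholder element id
  await timeModel('slim', MODEL_URLS.slim, img);
  await timeModel('thick', MODEL_URLS.thick, img);
})();
```

With a setup like this, the slim variant should report a noticeably lower time than the thick one, while two scales of the same variant should come out roughly equal.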
Here is my code. All the models I tried took 17 seconds to finish on the same image. Why?
And this is how I call it.