Curious why this creates 2GB .ckpt files #138
I'm curious what causes this to create a 2GB .ckpt file when the HuggingFace v1-4 file is 4GB. Is this expected?
Replies: 3 comments 3 replies
The 2GB file is in half precision, which is what is widely used; the quality is essentially the same and it loads faster.
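For anyone wondering where the halving comes from: each fp32 weight takes 4 bytes, and casting to fp16 cuts that to 2 bytes, so a ~4GB checkpoint becomes ~2GB. A minimal sketch of that conversion with plain PyTorch (the file names here are placeholders, not paths from this repo):

```python
import torch

# Load the full-precision checkpoint on CPU ("sd-v1-4.ckpt" is a placeholder path).
ckpt = torch.load("sd-v1-4.ckpt", map_location="cpu")
state_dict = ckpt.get("state_dict", ckpt)

# Cast every floating-point tensor to half precision; integer tensors are left alone.
for key, tensor in state_dict.items():
    if torch.is_tensor(tensor) and tensor.is_floating_point():
        state_dict[key] = tensor.half()

# The saved file is roughly half the size of the fp32 original.
torch.save({"state_dict": state_dict}, "model-fp16.ckpt")
```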
For the record, for anyone who's curious: it's not just this implementation; the JoePenna Dreambooth also produces 2GB checkpoints (and probably most others do too).
I trained a model on a standard Google Colab GPU; it maxed out GPU memory but ran without crashing. However, the resulting model is only 2GB, and the quality of the generated images is poor. When I train on a "premium" GPU, I get a 4GB file and the quality is better. Is there a correlation? Thx
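If the explanation above is right, the size difference should just reflect the precision the weights were saved in rather than the GPU tier itself. A quick sketch to check what a given checkpoint actually contains (the path is a placeholder):

```python
import torch
from collections import Counter

# Inspect the dtypes stored in the checkpoint ("my-model.ckpt" is a placeholder).
ckpt = torch.load("my-model.ckpt", map_location="cpu")
state_dict = ckpt.get("state_dict", ckpt)

# A ~2GB file should show mostly torch.float16, a ~4GB one mostly torch.float32.
dtypes = Counter(t.dtype for t in state_dict.values() if torch.is_tensor(t))
print(dtypes)
```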