Replies: 1 comment
You can use this script; it converts the model to safetensors and reshards it: https://github.com/oobabooga/text-generation-webui/blob/main/convert-to-safetensors.py
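For intuition, the core of any resharding step is grouping a checkpoint's tensors into chunks that each stay under a size limit, so no single file has to be materialized in RAM at once. The sketch below shows that grouping logic only; the function name, the size limit, and the toy sizes are illustrative and are not taken from the actual script.

```python
# Minimal sketch of the resharding idea: split a state dict into
# shards so that no single shard exceeds a maximum byte size.
# (Illustrative only; the real script handles actual tensors.)

def shard_state_dict(sizes, max_shard_bytes=2 * 1024**3):
    """Group tensor names into shards, each under max_shard_bytes.

    `sizes` maps tensor name -> size in bytes. A tensor larger than
    the limit still gets a shard of its own.
    """
    shards, current, current_bytes = [], {}, 0
    for name, nbytes in sizes.items():
        # Start a new shard if adding this tensor would overflow.
        if current and current_bytes + nbytes > max_shard_bytes:
            shards.append(current)
            current, current_bytes = {}, 0
        current[name] = nbytes
        current_bytes += nbytes
    if current:
        shards.append(current)
    return shards

# Example: three tensors sharded with a 10-byte limit.
example = {"a": 6, "b": 6, "c": 3}
print(shard_state_dict(example, max_shard_bytes=10))
# -> [{'a': 6}, {'b': 6, 'c': 3}]
```

With each shard written to its own file, a loader can read and free them one at a time instead of allocating the whole model up front.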
My computer nearly locks up trying to allocate enough RAM for the entire Llama 13B model in one go, but it has no issues with other models stored in a split format. Is there any way to convert an existing monolithic model into smaller pieces for easier loading?