FYI: the MOSS model is really good #1813
Replies: 3 comments 2 replies
Need to try this; I've been looking for something like this instead of what I'm using now (vicuna).
I'll try to get starcoder, santacoder, and CodeCapybara working :) MOSS is looking good too. @TheBloke, can you please quantize it? :)
Is it just me, or does it require a GPU? It works fine if I enable the GPU, but CPU-only it complains about running scripts (even though I pass --trust-remote-code on the command line in both cases). My laptop does not have a real GPU (Intel :( ), so I always test models both ways on my server.
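For reference, a CPU-only launch would look roughly like this. This is a sketch assuming the UI in question is text-generation-webui; the model directory name is a placeholder:

```shell
# Sketch of a CPU-only launch (assumed: text-generation-webui's server.py;
# the model name below is a placeholder, not confirmed by this thread).
# --trust-remote-code is needed because MOSS ships custom modeling code.
python server.py --model moss-moon-003-sft --cpu --trust-remote-code
```

If the script-execution complaint appears only without a GPU, it may be worth checking that the flag is actually reaching the CPU-only invocation.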
So far, I had been using llama-30b + chansung/alpaca-lora-30b for coding questions, loaded with the monkey patch.
But MOSS seems to be better: it gives verbose replies and detailed code, and in my quick tests it also seems more detailed than oasst-sft-6-llama-30b.
To load it:
It uses about 20 GB of VRAM in 8-bit mode.
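As a rough sketch, 8-bit loading along these lines can be done with Hugging Face transformers and bitsandbytes; the checkpoint name and flags below are my assumptions, not taken from the original post:

```python
# Hypothetical sketch: loading a MOSS checkpoint in 8-bit via
# transformers + bitsandbytes. Assumes a CUDA GPU with roughly
# 20 GB of free VRAM, matching the figure quoted above.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "fnlp/moss-moon-003-sft"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    load_in_8bit=True,        # int8 weight quantization at load time
    device_map="auto",        # place layers on available GPUs
    trust_remote_code=True,   # MOSS ships custom modeling code
)
```

The 8-bit path trades a little generation quality for roughly halving the VRAM footprint versus fp16, which is what makes a 16B-class model fit in ~20 GB.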