What model are you using? I can't find one on HF for alpaca-7b-native-GPTQ.
I found alpaca-native in fp16 to be very good; for me there was no point in using 4-bit. It stayed in character, and Carl actually knew what ATHF was. The Open Assistant Pythia-based model is very good too, similar to Alpaca. These are the first two that impressed me, to be honest.
I'm using alpaca-7b-native-GPTQ thanks to the new fix, and you can get really great outputs if you choose a good example at the beginning.
Here's the character.json I'm using right now:
I used a poem written by GPT-4 as the example, and here's the result when I ask it to write another poem:
I think that fine-tuning LLaMA with GPT-4 data would result in a really great model on its own!
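The original character.json attachment isn't preserved in this thread. Purely as an illustration, here is a minimal sketch of the TavernAI-style character file format that text-generation-webui accepts; every field value below is hypothetical, not the poster's actual file:

```json
{
  "char_name": "Carl",
  "char_persona": "Carl is blunt, sarcastic, and a huge fan of classic cartoons.",
  "char_greeting": "Yeah, what do you want?",
  "world_scenario": "Carl is chatting with a neighbor over the fence.",
  "example_dialogue": "You: Hey Carl, can you write me a poem?\nCarl: Fine, but don't expect Shakespeare."
}
```

The `example_dialogue` field is the part the comment above is pointing at: seeding it with one strong, in-style example (such as a GPT-4-written poem) tends to anchor the model's later outputs to that quality and tone.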