Long responses get cut off before they finish. For example, just to test it, if I ask it to write me a story with 1000 words, the response just cuts off at a certain point without reaching the word count. To test it, I installed the GPT4ALL desktop version with the same model, and that one works without issues and writes the story in full. Any solution?

Replies: 1 comment

I had the same issue: long responses would just get cut off. After following the instructions here (manually setting max_new_tokens to 1024), responses to long prompts no longer get cut off for me. Hope that helps.
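For anyone landing here later, here is a minimal sketch of what that setting controls, assuming a Hugging Face transformers-style backend. The model name, prompt, and generation call below are placeholders, since the thread doesn't say which frontend or config file is in use; the exact place to set max_new_tokens depends on your setup.

```python
# Minimal sketch, assuming a Hugging Face transformers-style backend.
# max_new_tokens caps how many tokens the model may generate, so a small
# default will truncate a long story mid-sentence regardless of the prompt.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder; use whichever model your setup loads
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "Write me a story with 1000 words."
inputs = tokenizer(prompt, return_tensors="pt")

# Raising max_new_tokens gives the model room to finish. Note that 1024
# tokens is roughly 700-800 English words, so go higher if you need a
# full 1000-word story.
outputs = model.generate(**inputs, max_new_tokens=1024)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

If your frontend exposes this through a settings file or UI slider rather than code, the same idea applies: the truncation is a generation-length limit, not a model limitation, which is why the GPT4ALL desktop app (with a higher limit) writes the story fully.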