Replies: 1 comment
Appreciate you flagging this one; it's on our radar.
I'm seeing a token counting inconsistency when using the latest DeepSeek v3.1 model with Crush. The API returns a 400 error saying the request exceeds the model's context limit, but the token statistics shown in the sidebar report different (presumably lower) numbers.

Note: I'm not sure whether this is a general problem affecting all providers or specific to DeepSeek v3.1.
```
POST "https://api.deepseek.com/v1/chat/completions": 400 Bad Request {"message":"This model's maximum context length is 131072 tokens. However, you requested 143518 tokens (138518 in the messages, 5000 in the completion). Please reduce the length of the messages or
```
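For reference, the arithmetic in the error message: the API counts both the message tokens and the requested completion budget against the 131072-token window, so 138518 + 5000 = 143518 overflows it. A minimal sketch of that pre-flight check (the names here are hypothetical, not actual Crush or DeepSeek API identifiers):

```python
# Illustrative pre-flight check for the arithmetic behind the 400 error.
# CONTEXT_LIMIT and MAX_COMPLETION come from the error message above;
# fits_in_context is a hypothetical helper, not a real API call.

CONTEXT_LIMIT = 131_072   # DeepSeek v3.1 maximum context length (per the error)
MAX_COMPLETION = 5_000    # completion budget requested in the failing call

def fits_in_context(message_tokens: int,
                    max_completion: int = MAX_COMPLETION,
                    limit: int = CONTEXT_LIMIT) -> bool:
    """The API counts prompt AND completion tokens against the limit."""
    return message_tokens + max_completion <= limit

# The failing request from the error message:
print(fits_in_context(138_518))  # 138518 + 5000 = 143518 > 131072 -> False
```

Whatever the sidebar is counting, the server-side check is against this combined total, which may explain why the displayed numbers look lower than what the API rejects.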