Apologies if this has been opened elsewhere, but I couldn't find it.
The `deepseek-chat` context window is 131,072 tokens, not 1,280,000.
Here's the error:

```json
{
  "error": {
    "message": "This model's maximum context length is 131072 tokens. However, you requested 131749 tokens (131749 in the messages, 0 in the completion). Please reduce the length of the messages or completion.",
    "type": "invalid_request_error",
    "param": null,
    "code": "invalid_request_error"
  }
}
```
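As a workaround until the listed limit is corrected, requests can be kept under the real 131,072-token ceiling client-side. The sketch below is a minimal illustration, not DeepSeek's own tokenizer: it uses a rough 4-characters-per-token heuristic (so a safety margin is essential) and drops the oldest messages until the estimate fits.

```python
# Sketch: keep a chat message list under the model's real context limit.
# Assumption: ~4 characters per token is only a crude estimate; the exact
# count comes from the model's own tokenizer, hence the safety margin.

MAX_CONTEXT_TOKENS = 131072  # limit reported in the error above


def estimate_tokens(text: str) -> int:
    # Crude heuristic: roughly 4 characters per token for English text.
    return max(1, len(text) // 4)


def trim_messages(messages, limit=MAX_CONTEXT_TOKENS, margin=1024):
    """Drop the oldest messages until the estimate fits under limit - margin."""
    budget = limit - margin
    kept = list(messages)
    while kept and sum(estimate_tokens(m["content"]) for m in kept) > budget:
        kept.pop(0)  # drop the oldest message first
    return kept


messages = [
    {"role": "user", "content": "x" * 600_000},  # oversized history entry
    {"role": "user", "content": "short question"},
]
print(len(trim_messages(messages)))  # the oversized oldest message is dropped
```

In practice the margin should also cover the requested completion length, since the error counts messages and completion against the same limit.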