When chatting with any of the Mistral models hosted on Foundry using a .NET SDK, you get the following error message when using streaming. Tested on:

All 3 of the above SDKs use the underlying OpenAI.Chat library, which hardcodes includeUsage: true, as per https://github.com/openai/openai-dotnet/blob/5f13e29de395d2824c0e9c694a4ce5c65edb5392/src/Custom/Chat/ChatClient.cs#L33

Here are the request/response pairs

Endpoint:

Any ideas for how to get around this, or is there a more appropriate repo to post this to?
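For reference, this is roughly the shape of the call that triggers it (a sketch from memory; the endpoint, deployment name, and key below are placeholders rather than my actual values):

```csharp
using System;
using System.ClientModel;
using OpenAI;
using OpenAI.Chat;

// Placeholder endpoint, deployment, and key -- substitute your own Foundry values.
var client = new ChatClient(
    model: "Mistral-Large-2411",
    credential: new ApiKeyCredential(Environment.GetEnvironmentVariable("FOUNDRY_API_KEY")!),
    options: new OpenAIClientOptions
    {
        Endpoint = new Uri("https://your-foundry-resource.services.ai.azure.com/openai/v1")
    });

// CompleteChatStreamingAsync is the call where the SDK forces
// stream_options.include_usage = true on the wire.
await foreach (StreamingChatCompletionUpdate update in client.CompleteChatStreamingAsync(
    new ChatMessage[] { ChatMessage.CreateUserMessage("Hello") }))
{
    foreach (ChatMessageContentPart part in update.ContentUpdate)
    {
        Console.Write(part.Text);
    }
}
```
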
Replies: 3 comments · 1 reply
The ironic thing is that if you remove

Hi @davidames - thanks for the detail above. Can I ask a couple of clarifying questions so we know how best to route this discussion? Was the issue occurring only with Mistral models? (Have other models been tested, to validate whether it's the SDK?)

Hi @davidames

Thanks for raising this. Yes, it only triggers this error when using Mistral models; I tested with Mistral-Nemo and Mistral-Large-2411. It seems to work fine with the GPT, DeepSeek, and Phi models.

I'd suggest filing an issue on the official OpenAI client: https://github.com/openai/openai-dotnet/issues
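In the meantime, one possible workaround (purely an untested sketch on my side, not an official escape hatch): OpenAIClientOptions derives from System.ClientModel's ClientPipelineOptions, so you should be able to register a custom PipelinePolicy that strips the stream_options field out of the request body before it reaches the Mistral deployment. Something like:

```csharp
using System;
using System.ClientModel;
using System.ClientModel.Primitives;
using System.Collections.Generic;
using System.IO;
using System.Text.Json.Nodes;
using System.Threading.Tasks;

// Untested sketch: a pipeline policy that removes "stream_options" from the
// outgoing JSON body so deployments that reject include_usage can still stream.
public class StripStreamOptionsPolicy : PipelinePolicy
{
    public override void Process(
        PipelineMessage message, IReadOnlyList<PipelinePolicy> pipeline, int currentIndex)
    {
        RewriteBody(message);
        ProcessNext(message, pipeline, currentIndex);
    }

    public override async ValueTask ProcessAsync(
        PipelineMessage message, IReadOnlyList<PipelinePolicy> pipeline, int currentIndex)
    {
        RewriteBody(message);
        await ProcessNextAsync(message, pipeline, currentIndex).ConfigureAwait(false);
    }

    private static void RewriteBody(PipelineMessage message)
    {
        if (message.Request.Content is null)
        {
            return;
        }

        // Buffer the serialized request so it can be edited as JSON.
        using var buffer = new MemoryStream();
        message.Request.Content.WriteTo(buffer, default);

        if (JsonNode.Parse(buffer.ToArray()) is JsonObject body && body.Remove("stream_options"))
        {
            message.Request.Content = BinaryContent.Create(BinaryData.FromString(body.ToJsonString()));
        }
    }
}
```

Register it on the options you pass to the ChatClient, e.g. `options.AddPolicy(new StripStreamOptionsPolicy(), PipelinePosition.PerCall);`. The trade-off is that you lose the usage chunk at the end of the stream, but that's the part the Mistral deployments appear to be rejecting anyway.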
Best