Description
When prompting the LLM through the proxy with something that does not require any of the tools (e.g. "Write me a poem about the moon" or "How are you today?"), I get `Error: 500 {"tool_calls":[]}`.
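
For reference, here is roughly how I'm hitting the proxy. The base URL, API key, model name, and `get_weather` tool are just placeholders for my local setup, not anything from the project itself:

```python
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/proxy/groq",  # placeholder proxy endpoint
    api_key="YOUR_GROQ_API_KEY",                  # placeholder key
)

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # placeholder tool, irrelevant to the prompt
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

# A prompt that needs none of the tools -- this is what triggers
# Error: 500 {"tool_calls":[]}
response = client.chat.completions.create(
    model="llama3-70b-8192",  # placeholder model name
    messages=[{"role": "user", "content": "Write me a poem about the moon"}],
    tools=tools,
)
print(response.choices[0].message.content)
```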
I see there is logic in place for `is_normal_chat`, but as long as the user has any function defined, this variable can never be true.
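
To illustrate what I mean, this is my reading of the routing behaviour, paraphrased with made-up helper names (not the actual source): the normal-chat branch only fires when no tools are supplied at all, so any request that defines even one function goes down the tool path and comes back with the empty `tool_calls` payload:

```python
# Paraphrase of how the routing seems to behave, with stand-in helpers --
# not the project's actual code.
def handle_normal_chat(messages):
    return {"role": "assistant", "content": "(plain completion)"}

def handle_tool_call(messages, tools):
    return {"tool_calls": []}  # what ends up coming back as the 500 body

def route_request(messages, tools):
    is_normal_chat = not tools  # never True once any function is defined
    if is_normal_chat:
        return handle_normal_chat(messages)
    return handle_tool_call(messages, tools)

print(route_request(
    [{"role": "user", "content": "How are you today?"}],
    tools=[{"name": "get_weather"}],
))
# -> {'tool_calls': []}
```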
Also, if I force `is_normal_chat` to true, I get a different error: `Error calling GroqProvider: Object of type ChatCompletion is not JSON serializable`.
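
If it helps, a possible fix for the serialization part, assuming the provider hands back a `ChatCompletion` object from the openai/groq Python SDK (a Pydantic model), would be to convert it to a plain dict before it reaches the JSON response layer. Just a sketch:

```python
import json

def to_jsonable(completion):
    """Turn an SDK ChatCompletion (Pydantic model) into plain JSON-safe data."""
    if hasattr(completion, "model_dump"):  # Pydantic v2
        return completion.model_dump()
    if hasattr(completion, "dict"):        # Pydantic v1 fallback
        return completion.dict()
    return completion

# json.dumps(to_jsonable(completion)) then no longer raises
# "Object of type ChatCompletion is not JSON serializable".
```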
And I can see from some debug tests that the LLM is returning a good response, but the proxy is messing up when parsing it.
I'm testing and trying to find a different check we could implement so that tools stay defined but a normal question still works (a rough idea is sketched below), but I'm not even sure it's possible without major code changes, so it might take a while.
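
One direction I'm playing with (helper names are hypothetical): instead of deciding "normal chat vs. tool call" up front from the request, let the model answer with the tools attached and fall back to a plain reply whenever it returns no tool calls:

```python
def complete(client, model, messages, tools=None):
    """Hypothetical wrapper: only treat the reply as a tool call if the
    model actually returned tool calls."""
    kwargs = {"model": model, "messages": messages}
    if tools:
        kwargs["tools"] = tools
    response = client.chat.completions.create(**kwargs)
    message = response.choices[0].message
    if not message.tool_calls:  # tools were defined, but the model didn't use them
        return {"role": "assistant", "content": message.content}
    return {"tool_calls": [tc.model_dump() for tc in message.tool_calls]}
```

No idea yet whether that fits the rest of the proxy's flow, though.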
Maybe @unclecode or someone more skilled than me can find a solution first 👍