fix(llm): improve message handling to support LLMs without content/tool_calls (#635)
Conversation
This commit improves the message handling in the `LLM` class to gracefully handle messages without `content` or `tool_calls` fields. Previously, the system would raise a `ValueError` when encountering such messages, causing crashes when working with models like Google's Gemini that sometimes return messages with different structures.

Key changes:
- Reordered message processing to check for `Message` objects first
- Changed validation to silently skip malformed messages instead of crashing
- Removed the strict `ValueError` raised when `content`/`tool_calls` are missing

This change maintains compatibility with correctly formatted messages while improving robustness when working with various LLM providers.
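The skip-instead-of-raise behavior described above can be sketched roughly as follows. Note this is a minimal illustration, not the actual `app/llm.py` code: `Message` and `format_messages` here are hypothetical stand-ins for the project's real classes.

```python
from dataclasses import dataclass, asdict
from typing import Optional

# Hypothetical stand-in for the project's Message class.
@dataclass
class Message:
    role: str
    content: Optional[str] = None
    tool_calls: Optional[list] = None

def format_messages(messages):
    """Convert Message objects and dicts to API-ready dicts, skipping
    entries that carry neither 'content' nor 'tool_calls'."""
    formatted = []
    for message in messages:
        # Check for Message objects first, then fall back to plain dicts.
        if isinstance(message, Message):
            message = {k: v for k, v in asdict(message).items() if v is not None}
        if not isinstance(message, dict):
            raise TypeError(f"Unsupported message type: {type(message)}")
        # Previously this case raised ValueError; malformed messages are
        # now skipped so providers like Gemini don't crash the loop.
        if "content" not in message and "tool_calls" not in message:
            continue
        formatted.append(message)
    return formatted
```

A message such as `{"role": "assistant"}` is now dropped silently instead of aborting the whole request.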
LGTM

Please run pre-commit to format code
Ran pre-commit. I was so sure that I had already committed the changes; I even mentioned running pre-commit in the description. But I have now pushed the changes.
Perhaps, but I didn't make that change, and it has been like that for a long time. Changing it is a bigger, more involved task, since we would need to check whether it breaks something elsewhere given that the name has been in use for a while. Probably more suitable for another pull request?
That's good! Thank you ~
Per https://platform.openai.com/docs/guides/function-calling?api-mode=chat#additional-configurations

My code should also allow messages without `content`.
```
app.llm:ask_tool:763 - OpenAI API error: Error code: 400 - [{'error': {'code': 400, 'message': '* GenerateContentRequest.contents: contents is not specified\n', 'status': 'INVALID_ARGUMENT'}}]
```
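The 400 above suggests that every message was filtered out before the request was built, leaving the request's `contents` empty, which Gemini rejects with `INVALID_ARGUMENT`. A minimal sketch of a guard against this; the function name and error text are assumptions for illustration, not the project's actual code:

```python
def nonempty_payload(messages):
    """Drop messages with neither 'content' nor 'tool_calls', and fail
    early if nothing remains -- Gemini rejects a request whose contents
    end up empty with HTTP 400 INVALID_ARGUMENT."""
    payload = [m for m in messages if m.get("content") or m.get("tool_calls")]
    if not payload:
        # Raising locally gives a clearer error than the provider's 400.
        raise ValueError("refusing to send a request with no contents")
    return payload
```

Failing locally with a descriptive message is easier to debug than the provider-side `GenerateContentRequest.contents: contents is not specified` response.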
Features
Improved message handling in the `LLM` class to gracefully handle messages without `content` or `tool_calls` fields:
- Reordered message processing to check for `Message` objects first.
- Removed the strict `ValueError`; messages missing `content` or `tool_calls` fields are now silently skipped.

This fixes:
Feature Docs
N/A (Minor change to improve compatibility with Google's Gemini LLM.)
Influence
This change ensures compatibility with LLMs like Google's Gemini, which may occasionally return messages without `content` or `tool_calls` fields; such messages carry no useful payload, so skipping them is safe. It improves robustness and prevents crashes when working with such models.

Result
Formatted `app/llm.py` with `black`.

Other
This is a small but impactful change that unlocks support for Google's Gemini LLM while maintaining compatibility with existing models, and it resolves multiple open issues.