fixV2(core): Resolve PydanticSerializationError for Google FunctionCall #20059
Conversation
Did you mean to commit this file?
No, it was intended for local testing only. I have deleted the file now.
from llama_index.core.schema import ImageDocument
from llama_index.core.utils import resolve_binary

GOOGLE_FUNCTION_CALL_AVAILABLE = False
There's no need to import types from google here
serializable_tool_calls = []
for tc in original_tool_calls:
    # If we find a FunctionCall object, convert it to a dict
    if isinstance(tc, FunctionCall):
We don't need to check for FunctionCall; we should be checking for whatever the root type of FunctionCall is.
Looking at the source code, it's a BaseModel, which is actually already being checked in self._recursive_serialization(value) -- but I think it's not recursing properly into the pydantic object?
Solution: I fixed a recursion bug in the _recursive_serialization helper function. The corrected logic now properly handles any nested BaseModel, making the serialization robust and generic. This is a more durable fix than adding a special case for a single type.
Solution: I updated the validator logic in mem0/base.py to work with a dict (changing values.data.get() to values.get()), aligning the code with its actual runtime behavior and resolving the crash.
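A sketch of the validator pattern described above, assuming pydantic v2: in a `mode="before"` model validator the input is a plain dict, so `values.get()` is correct and `values.data.get()` would crash. The class and field names here are illustrative, not the actual mem0/base.py contents.

```python
from typing import Any, Optional

from pydantic import BaseModel, model_validator


class Mem0Config(BaseModel):
    """Hypothetical config model mirroring the fix described above."""

    api_key: Optional[str] = None
    host: Optional[str] = None

    @model_validator(mode="before")
    @classmethod
    def check_connection_params(cls, values: Any) -> Any:
        # In "before" mode `values` is the raw input dict, so use
        # values.get(...) -- values.data.get(...) assumes a wrapper
        # object that does not exist here and raises at runtime.
        if isinstance(values, dict):
            if not values.get("api_key") and not values.get("host"):
                raise ValueError("either api_key or host must be provided")
        return values
```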
Solution: I added all missing development dependencies (markdownify, lxml[html_clean], playwright, selenium, oxylabs, etc.) to the package's pyproject.toml, which are now tracked in the updated poetry.lock file. To fix the PackageNotFoundError, I made the OxylabsWebReader code more robust by wrapping the version() call in a try...except block. It now defaults to a "local" string instead of crashing when run in a local test environment.
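The defensive version lookup could be sketched like this (the function name and package name are illustrative; the real OxylabsWebReader code may differ):

```python
from importlib.metadata import PackageNotFoundError, version


def get_package_version(package_name: str = "llama-index-readers-oxylabs") -> str:
    """Return the installed distribution version, or "local" as a fallback.

    Hypothetical sketch: when running from a local checkout the distribution
    metadata may not be installed, so version() raises PackageNotFoundError;
    falling back to a sentinel string avoids crashing local test runs.
    """
    try:
        return version(package_name)
    except PackageNotFoundError:
        return "local"
```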
Solution: Since we can't fix the external dependency, I pinned cognee-python to an older, stable version (0.1.26) in the package's pyproject.toml to avoid the buggy code.
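Hypothetically, the pin in the package's pyproject.toml would look something like the following (section layout assumes a Poetry-managed package; the exact table may differ):

```toml
[tool.poetry.dependencies]
# Pin to the last known-good release to sidestep the upstream bug.
cognee-python = "0.1.26"
```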
The last fix passed all the checks, but after the "Merge branch 'main' into fix/postgresV2" commit, this error showed up in every unit test.
Description
The core fix is to enhance the Pydantic serialization within the base ChatMessage class to explicitly handle the Google SDK's FunctionCall object. The _recursive_serialization method in llama_index/core/base/llms/types.py is updated to check for this specific external object type and convert it into a standard Python dictionary using its internal .to_dict() method. This prevents the PydanticSerializationError from being thrown when the ChatMessage is dumped to JSON for storage.
Fixes # (issue)
New Package?
Did I fill in the tool.llamahub section in the pyproject.toml and provide a detailed README.md for my new integration or package?
Version Bump?
Did I bump the version in the pyproject.toml file of the package I am updating? (Except for the llama-index-core package)
Type of Change
Please delete options that are not relevant.
How Has This Been Tested?
Your pull-request will likely not be merged unless it is covered by some form of impactful unit testing.
Suggested Checklist:
Ran uv run make format; uv run make lint to appease the lint gods