
bug: Issue rendering tool calls made with LangChain 1.0 #11850

@dahnny012

Description


Describe the bug

Summary

After upgrading to LangChain 1.0, tools are no longer serialized in the expected LangGraph / OpenAI “function tool” format. As a result, downstream consumers (e.g. LangGraph adapters / tracing UIs) fail to recognize tools correctly unless custom patching is applied.

This appears to be caused by a change in how tools are represented internally and passed through callbacks.

Expected behavior

Tools should be serialized in a standard function-tool format that downstream systems can reliably detect and render, for example:

{
  "type": "function",
  "function": {
    "name": "example_tool",
    "description": "Example tool description",
    "parameters": {
      "type": "object",
      "properties": {
        "query": { "type": "string" }
      },
      "required": ["query"]
    }
  }
}

This format is relied upon by LangGraph-style adapters and tracing/observability tools to:

  • Detect tool definitions
  • Associate tool calls with tool results
  • Render tools distinctly from normal messages

Actual behavior in LangChain 1.0

With LangChain 1.0, tools (e.g. StructuredTool) are now serialized roughly as:

{
  "name": "example_tool",
  "description": "Example tool description",
  "args_schema": { ... },
  "return_direct": false,
  "verbose": false,
  ...
}

Notably:

  • The type: "function" field is missing
  • The function container is missing
  • Tool schemas are exposed via args_schema instead of parameters

When these objects are forwarded as-is through callbacks (e.g. via invocation_params["tools"]), downstream systems do not recognize them as tools and instead treat them as generic JSON payloads.
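
To illustrate the mismatch, here is a plain-Python sketch (the `wrap_as_function_tool` helper is hypothetical, not part of LangChain) that re-wraps a LangChain-1.0-style tool dict into the expected envelope:

```python
from typing import Any, Dict

def wrap_as_function_tool(raw: Dict[str, Any]) -> Dict[str, Any]:
    """Re-wrap a LangChain-1.0-style tool dict into the OpenAI function-tool envelope."""
    return {
        "type": "function",
        "function": {
            "name": raw.get("name", ""),
            "description": raw.get("description", ""),
            # args_schema (a JSON schema) maps onto "parameters"
            "parameters": raw.get("args_schema") or {"type": "object", "properties": {}},
        },
    }

# A dict shaped like the LangChain 1.0 serialization above
raw_tool = {
    "name": "example_tool",
    "description": "Example tool description",
    "args_schema": {
        "type": "object",
        "properties": {"query": {"type": "string"}},
        "required": ["query"],
    },
    "return_direct": False,
}
wrapped = wrap_as_function_tool(raw_tool)
```

Note the wrapper also drops fields like `return_direct` that have no place in the function-tool schema.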

Solution

I have temporarily patched the handler like so:

from typing import Any, Dict

# Langfuse Python SDK v3 import path; adjust if you are on v2
from langfuse.langchain import CallbackHandler

try:
    from langchain_core.tools import StructuredTool
except ImportError:
    StructuredTool = None


class PatchedLangfuseCallbackHandler(CallbackHandler):
    def _patch_kwargs(self, kwargs: Dict[str, Any]) -> Dict[str, Any]:
        # 1) Shallow-copy kwargs so we don't mutate the original dict
        patched_kwargs = dict(kwargs)

        # 2) Copy invocation_params if present
        invocation_params = patched_kwargs.get("invocation_params")
        if isinstance(invocation_params, dict):
            invocation_params = dict(invocation_params)  # copy nested dict
            patched_kwargs["invocation_params"] = invocation_params

            tools = invocation_params.get("tools")
            if isinstance(tools, list):
                invocation_params["tools"] = [
                    self._serialize_tool(t) for t in tools
                ]

        return patched_kwargs

    def on_llm_start(self, *args, **kwargs):
        # Call upstream logic with patched kwargs
        return super().on_llm_start(*args, **self._patch_kwargs(kwargs))

    def on_chat_model_start(self, *args, **kwargs):
        # Must forward to on_chat_model_start, not on_llm_start
        return super().on_chat_model_start(*args, **self._patch_kwargs(kwargs))

    def _serialize_tool(self, tool: Any) -> Dict[str, Any]:
        """
        Output the exact "LangGraph tool definition" schema that your adapter expects:

        {
          "type": "function",
          "function": { "name": ..., "description": ..., "parameters": ... }
        }
        """
        # Case 1: StructuredTool
        if StructuredTool is not None and isinstance(tool, StructuredTool):
            return self._serialize_structured_tool(tool)

        # Case 2: APITool (custom tool class in my codebase; serializer not shown)
        if isinstance(tool, APITool):
            return self._serialize_api_tool(tool)

        # Last resort for unknown tool objects
        return {
            "type": "function",
            "function": {
                "name": getattr(tool, "name", tool.__class__.__name__),
                "description": getattr(tool, "description", "") or "",
                "parameters": getattr(tool, "parameters", None)
                or {"type": "object", "properties": {}},
            },
        }

    def _serialize_structured_tool(self, tool: "StructuredTool") -> Dict[str, Any]:
        name = getattr(tool, "name", "")
        description = getattr(tool, "description", None)
        parameters = tool.args  # JSON-schema properties derived from args_schema

        return {
            "type": "function",
            "function": {
                "name": name,
                **({"description": description} if description else {}),
                "parameters": parameters,
            },
        }

Steps to reproduce

Use langchain 1.0 and langgraph with tool calling. When tools are called, invocation_params contains StructuredTool objects rather than OpenAI-format function definitions.

Langfuse Cloud or self-hosted?

Self-hosted

If self-hosted, what version are you running?

3.148

SDK and integration versions

3.11

Additional information

No response

Are you interested in contributing a fix for this bug?

No
