[Bug]: Tool call arguments are being appended to the end of the answer. #1225

@ricac1

Description

OpenRAG Version

0.3.2

Deployment Method

Docker

Operating System

macOS Tahoe 26.2

Python Version

3.13.12

Affected Area

Chat (chat interface, conversations, AI responses)

Bug Description

The model (meta-llama/llama-3-405b-instruct) returns correct natural-language answers; however, the tool-call arguments (for example, {"search_query": "..."}) are being appended to the end of the response text.
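As a temporary client-side workaround (not a fix for the underlying bug), one could post-process the answer and strip a trailing JSON object if the leaked arguments always land at the very end of the text. This is a minimal sketch under that assumption; the function name and behavior are illustrative, not part of OpenRAG:

```python
import json

def strip_trailing_tool_args(text: str) -> str:
    """Remove a trailing JSON object (e.g. leaked tool-call arguments)
    from the end of a model response, if one is present and valid."""
    stripped = text.rstrip()
    end = stripped.rfind("}")
    if end != len(stripped) - 1:
        return text  # response does not end with a JSON object
    # Walk backwards to find the matching opening brace.
    depth = 0
    for i in range(end, -1, -1):
        if stripped[i] == "}":
            depth += 1
        elif stripped[i] == "{":
            depth -= 1
            if depth == 0:
                candidate = stripped[i:end + 1]
                try:
                    json.loads(candidate)  # only strip if it parses as JSON
                except json.JSONDecodeError:
                    return text
                return stripped[:i].rstrip()
    return text  # unbalanced braces; leave the text untouched
```

For example, `strip_trailing_tool_args('The answer is 42. {"search_query": "..."}')` returns `'The answer is 42.'`, while text without a trailing JSON object passes through unchanged.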

Examples:

(see attached screenshots)

Steps to Reproduce

I uploaded a document to ingest, waited for ingestion to complete, and started chatting with the document by filtering for that specific one.

Expected Behavior

The response text should not have the JSON query appended at the end.

Actual Behavior

When I ask a question in Chat, the tool-call JSON arguments are appended to every response.

Relevant Logs

Screenshots

No response

Additional Context

No response

Checklist

  • I have searched existing issues to ensure this bug hasn't been reported before.
  • I have provided all the requested information.

Metadata

Assignees

No one assigned

    Labels

bug 🔴: Something isn't working.

    Type

    No type

    Projects

    No projects

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests
