
Conversation


@satyadevai commented Dec 16, 2025

Closes #2140


Note

Emit tool JSON schemas on LLM_TOOLS for BeeAI chat spans and add examples, tests, and CI/deps updates.

  • BeeAI Instrumentation:
    • Emit per-tool JSON schema on LLM_TOOLS.{index}.TOOL_JSON_SCHEMA via new helpers get_tool_parameters/get_tools in processors/chat.py, replacing the prior tool-name list (see the attribute sketch after this list).
    • Leave invocation-parameter and message processing unchanged; tighten typing and add safe JSON handling.
  • Examples:
    • Add examples/beeai_ax.py (Arize OTEL) and examples/beeai_phoenix.py (OTLP -> Phoenix) demonstrating multi-agent tool handoffs and instrumentation setup (a minimal setup sketch follows this list).
  • Tests:
    • Add tests/.../test_instrumentor.py asserting token/cost metrics, tool-call attributes, and emitted tool schemas (an illustrative assertion helper follows this list).
    • Add VCR cassettes for LLM and tool-call scenarios.
  • Build/CI:
    • Update pyproject.toml test extras (pytest, pytest-asyncio, pytest-recording, vcrpy, duckduckgo extra, OTLP exporter).
    • Adjust tox.ini envlist for BeeAI to Python 3.11/3.13.
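The emitted attribute layout might look like the following. This is a minimal sketch, not the PR's actual get_tool_parameters/get_tools implementation: the set_tool_attributes helper name and the assumed tools input shape (dicts with "name", "description", "parameters") are illustrative; only the flattened llm.tools.{index}.tool.json_schema key pattern is taken from the summary above.

```python
import json
from typing import Any, Mapping, Sequence

from opentelemetry import trace

# Flattened key pattern described above: llm.tools.{index}.tool.json_schema
LLM_TOOLS = "llm.tools"
TOOL_JSON_SCHEMA = "tool.json_schema"


def set_tool_attributes(span: trace.Span, tools: Sequence[Mapping[str, Any]]) -> None:
    """Attach one JSON-schema string per tool to the LLM span (illustrative helper)."""
    for index, tool in enumerate(tools):
        try:
            schema = json.dumps(
                {
                    "name": tool.get("name"),
                    "description": tool.get("description"),
                    "parameters": tool.get("parameters", {}),
                },
                default=str,  # fall back to str() for non-serializable values
            )
        except (TypeError, ValueError):
            continue  # skip tools whose schema cannot be serialized safely
        span.set_attribute(f"{LLM_TOOLS}.{index}.{TOOL_JSON_SCHEMA}", schema)
```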
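For context, the Phoenix example presumably wires up a tracer provider along these lines. A minimal sketch, assuming the default local Phoenix OTLP endpoint (http://localhost:6006/v1/traces) and the BeeAIInstrumentor entry point; see examples/beeai_phoenix.py for the actual setup.

```python
from openinference.instrumentation.beeai import BeeAIInstrumentor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

# Assumed default Phoenix collector endpoint; adjust if Phoenix runs elsewhere.
tracer_provider = TracerProvider()
tracer_provider.add_span_processor(
    BatchSpanProcessor(OTLPSpanExporter(endpoint="http://localhost:6006/v1/traces"))
)

# Instrument BeeAI so chat spans carry the llm.tools.*.tool.json_schema attributes.
BeeAIInstrumentor().instrument(tracer_provider=tracer_provider)
```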
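The new tests assert on span attributes roughly as follows. This is an illustrative assertion helper only: the real test module replays VCR cassettes against BeeAI agents and also checks token/cost metrics, which are omitted here, and the expected schema shape is an assumption.

```python
import json
from typing import Any, Mapping


def assert_tool_schemas_emitted(attributes: Mapping[str, Any], expected_tool_count: int) -> None:
    """Check that each tool's JSON schema was flattened onto the LLM span."""
    for index in range(expected_tool_count):
        key = f"llm.tools.{index}.tool.json_schema"
        assert key in attributes, f"missing {key}"
        schema = json.loads(attributes[key])  # value must be valid JSON
        assert "name" in schema  # illustrative: exact schema shape is an assumption
```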

Written by Cursor Bugbot for commit 8bc8bc6. This will update automatically on new commits.

@satyadevai force-pushed the 2140-beeai-tools-issue branch from f214684 to a52008f on December 18, 2025 19:16
@satyadevai marked this pull request as ready for review on December 18, 2025 20:10
@dosubot added the size:XXL label (This PR changes 1000+ lines, ignoring generated files) on Dec 18, 2025
@satyadevai force-pushed the 2140-beeai-tools-issue branch from f118e45 to 6e51576 on December 19, 2025 15:17

Labels

size:XXL This PR changes 1000+ lines, ignoring generated files.


Development

Successfully merging this pull request may close these issues.

[bug] Beeai - LLM spans are not properly capturing tools in llm attribute
