Description
The `create_alert_rule` and `update_alert_rule` tools emit bare boolean JSON Schema values (e.g., `"model": true`) in their tool definitions. While valid per the JSON Schema specification (`true` means "accept any value"), LLM providers such as Fireworks AI reject these with a 500 `internal_server_error`.
Reproduction
- Connect mcp-grafana to any LLM gateway that routes to Fireworks AI
- Include `create_alert_rule` or `update_alert_rule` in the tool list
- Send a chat completion request with 26+ tools (including the alert rule tools)
- Fireworks returns:
```json
{"error":{"object":"error","type":"internal_server_error","code":"invalid_request_error","message":"server had an error while processing your request, please retry again after a brief wait"}}
```
With 25 tools (excluding alert rule tools): succeeds.
With just create_alert_rule alone: fails.
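For concreteness, a minimal sketch of the kind of request body that triggers the failure, assuming an OpenAI-compatible `tools` payload (the model id and message content are placeholders, not taken from the original report):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// buildBody constructs a minimal OpenAI-style chat-completion body carrying
// the problematic tool schema. The bare boolean under "properties" mirrors
// what mcp-grafana emits for create_alert_rule.
func buildBody() map[string]any {
	return map[string]any{
		"model":    "accounts/fireworks/models/some-model", // placeholder model id
		"messages": []map[string]any{{"role": "user", "content": "create an alert"}},
		"tools": []map[string]any{{
			"type": "function",
			"function": map[string]any{
				"name": "create_alert_rule",
				"parameters": map[string]any{
					"type": "object",
					"properties": map[string]any{
						"model": true, // bare boolean schema that Fireworks rejects
					},
				},
			},
		}},
	}
}

func main() {
	out, _ := json.MarshalIndent(buildBody(), "", "  ")
	fmt.Println(string(out))
}
```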
Root Cause
The alert rule params include Go `interface{}` fields (likely `QueryData` or the alert rule model). When reflected to JSON Schema via the jsonschema library, `interface{}` produces a bare `true`. Fireworks AI's schema parser does not handle bare boolean schemas.
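Illustratively, the reflected parameters schema ends up containing a fragment like this (surrounding fields trimmed; the exact shape is inferred from the error, not copied from the repo):

```json
{
  "type": "object",
  "properties": {
    "model": true
  }
}
```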
Affected Tools
- `create_alert_rule` — has `"model": true` in its schema
- `update_alert_rule` — same
Expected Behavior
Bare boolean schemas should be replaced with equivalent object schemas:
- `true` → `{"type": "object"}` (or more specific if the type is known)
- `false` → `{"not": {}}`
Workaround
We applied a `sanitizeSchema()` function in our MCP adapter layer that recursively converts bare booleans before passing schemas to the LLM provider. See: androidStern-personal/openclaw-mcp-adapter#9
Environment
- mcp-grafana: Docker image `grafana/mcp-grafana` (latest as of 2026-02-23)
- Grafana: 11.x with Editor-role service account
- LLM Provider: Fireworks AI (Kimi K2.5 model)
- Gateway: LiteLLM v1.81.13 + OpenClaw v2026.2.22