Description
When using LiteLLM as a gateway with MCP servers that expose tools containing $ref references in their JSON schemas (e.g. DevRev MCP), requests to Anthropic and Fireworks providers fail with schema resolution errors. The same request succeeds on OpenAI/GPT models.
Error Messages
Anthropic (e.g. claude-haiku, custom models routed to Anthropic):

```
litellm.BadRequestError: AnthropicException - {"error":{"type":"invalid_request_error","message":"Error resolving schema reference '#/definitions/_gen:tags': PointerToNowhere(ref='/definitions/_gen:tags', resource=Resource(contents={...tool schema without definitions block...}))"}}
```

Fireworks (e.g. kimi-k2p5, kimi-k2p6, minimax-m2p7):

```
litellm.BadRequestError: Fireworks_aiException - {"error":{"type":"invalid_request_error","message":"Error resolving schema reference '#/definitions/_gen:tags': AttributeError(\"'NoneType' object has no attribute 'lookup'\")"}}
```
Root Cause
MCP servers can return tool schemas that use $ref with a top-level definitions block, e.g.:
```json
{
  "name": "update_contact",
  "input_schema": {
    "type": "object",
    "properties": {
      "tags": {
        "$ref": "#/definitions/_gen:tags",
        "description": "Tags associated with the contact"
      }
    },
    "definitions": {
      "_gen:tags": {
        "type": "object",
        "properties": { ... }
      }
    }
  }
}
```
When LiteLLM forwards tool definitions to downstream providers, the definitions block is either not included or not resolved inline. Anthropic and Fireworks validate schemas strictly and reject unresolved $ref pointers. OpenAI is more permissive and resolves them server-side.
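A minimal sketch of the failure mode. The schema mirrors the example above; the stripping step is an assumption about what the gateway effectively forwards, not LiteLLM's actual code path:

```python
# What the MCP server returns: a $ref plus the definitions block it points to.
mcp_schema = {
    "type": "object",
    "properties": {
        "tags": {"$ref": "#/definitions/_gen:tags"},
    },
    "definitions": {
        "_gen:tags": {"type": "object", "properties": {}},
    },
}

# Hypothetical illustration of the bug: if the definitions block is dropped
# (or never forwarded), "#/definitions/_gen:tags" points to nowhere, which is
# exactly what Anthropic's PointerToNowhere error reports.
forwarded = {k: v for k, v in mcp_schema.items() if k != "definitions"}
```

A strict provider validating `forwarded` has no `definitions` key to resolve the pointer against; a permissive one may simply ignore the dangling ref.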
Reproduction
- Set up a LiteLLM gateway with an MCP server that exposes tools with $ref in their schemas (e.g. DevRev MCP)
- Make a request routed to an Anthropic-backed or Fireworks-backed model group
- LiteLLM forwards tool schemas with unresolved $ref → the provider rejects the request with a schema resolution error
- The same request to an OpenAI-backed model group succeeds
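The repro above can be reduced to a single request payload. The model-group name and tool contents below are illustrative stand-ins, not the actual DevRev tool definitions; any `$ref`-bearing tool schema routed to a strict provider triggers the error:

```python
import json

# Hypothetical chat-completions payload sent to the LiteLLM gateway.
# "anthropic-model-group" is an assumed routing name for this sketch.
payload = {
    "model": "anthropic-model-group",
    "messages": [{"role": "user", "content": "Tag this contact as vip"}],
    "tools": [{
        "type": "function",
        "function": {
            "name": "update_contact",
            "parameters": {
                "type": "object",
                "properties": {
                    "tags": {"$ref": "#/definitions/_gen:tags"},
                },
                "definitions": {
                    "_gen:tags": {"type": "array", "items": {"type": "string"}},
                },
            },
        },
    }],
}

body = json.dumps(payload)  # this serialized body reproduces the failure
```

POSTing `body` to the gateway's chat-completions endpoint with an Anthropic- or Fireworks-backed model group reproduces the errors above; the same body against an OpenAI-backed group succeeds.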
Expected Behavior
LiteLLM should resolve/flatten all $ref references in a tool's input_schema before forwarding to providers that don't support them. For local #/definitions/... refs this needs only a small recursive helper (jsonschema's RefResolver could also be used, though it is deprecated in recent versions):

```python
import copy

def resolve_refs(schema: dict) -> dict:
    """Flatten local #/definitions/... $refs inline (sketch: no cycle detection)."""
    defs = schema.get("definitions", {})

    def inline(node):
        if isinstance(node, dict):
            if node.get("$ref", "").startswith("#/definitions/"):
                target = copy.deepcopy(defs[node["$ref"].rsplit("/", 1)[-1]])
                # keep sibling keys such as "description" next to the inlined target
                return inline({**target, **{k: v for k, v in node.items() if k != "$ref"}})
            return {k: inline(v) for k, v in node.items()}
        if isinstance(node, list):
            return [inline(v) for v in node]
        return node

    # drop the now-redundant definitions block from the flattened schema
    return inline({k: v for k, v in schema.items() if k != "definitions"})
```
Alternatively, apply this only for providers known to be strict (Anthropic, Fireworks) and skip for OpenAI which handles refs natively.
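The provider-gated variant could look like the sketch below. The provider list comes from this report; the `prepare_tool_schema` hook name is an assumption, not a LiteLLM internal, and the `_inline` helper is a minimal flattener kept self-contained for illustration:

```python
import copy

# Providers observed (in this report) to reject unresolved local $refs.
STRICT_REF_PROVIDERS = {"anthropic", "fireworks_ai"}

def _inline(node, defs):
    # Minimal local-$ref flattener; assumes #/definitions/... refs and no cycles.
    if isinstance(node, dict):
        if node.get("$ref", "").startswith("#/definitions/"):
            return _inline(copy.deepcopy(defs[node["$ref"].split("/")[-1]]), defs)
        return {k: _inline(v, defs) for k, v in node.items()}
    if isinstance(node, list):
        return [_inline(v, defs) for v in node]
    return node

def prepare_tool_schema(schema: dict, provider: str) -> dict:
    """Flatten $refs only for strict providers; pass through otherwise."""
    if provider not in STRICT_REF_PROVIDERS:
        return schema  # e.g. openai resolves local refs server-side
    defs = schema.get("definitions", {})
    return _inline({k: v for k, v in schema.items() if k != "definitions"}, defs)
```

The gate keeps the OpenAI path byte-identical to today's behavior while fixing only the providers that actually fail.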
Environment
- LiteLLM version: latest (via Razorpay internal gateway)
- Failing providers: anthropic, fireworks_ai
- Working providers: openai
- MCP source: DevRev MCP (tools with $ref to #/definitions/_gen:* types)
- Client: opencode (sst/opencode) using @ai-sdk/openai provider pointed at LiteLLM gateway
Related
- fix(mcp): resolve $ref params in OpenAPI spec — handles $ref in OpenAPI spec parsing, but not in tool input_schema forwarding
- fix: strip $schema from tool input_schema — adjacent schema-handling work
- $ref resolution failure in structured outputs (closed)

This issue affects any MCP server that uses $ref with definitions in its tool schemas, not just DevRev. It is a systematic gap in LiteLLM's provider compatibility layer.