
[Bug]: Failed to call /v1/messages with Non-Anthropic Models #26697

@Fuyubai

Description

Check for existing issues

  • I have searched the existing issues and checked that my issue is not a duplicate.

What happened?

I want to proxy a local OpenAI-style model with LiteLLM and call it through the Anthropic-style endpoint, i.e. /v1/messages.

Normally, LiteLLM should translate /v1/messages to /v1/chat/completions, but it does not appear to perform this translation, and I receive the error {"detail":"Not Found"}.
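For context, the translation I expect the proxy to perform can be sketched roughly as follows. This is an illustrative subset only, not LiteLLM's actual implementation; the field names follow the public Anthropic Messages and OpenAI Chat Completions request shapes:

```python
def anthropic_to_openai(payload: dict) -> dict:
    """Sketch: convert an Anthropic /v1/messages request body into an
    OpenAI /v1/chat/completions body (illustrative subset only)."""
    messages = []
    # Anthropic carries the system prompt as a top-level "system" field;
    # OpenAI expects it as the first message in the list.
    if "system" in payload:
        messages.append({"role": "system", "content": payload["system"]})
    for msg in payload.get("messages", []):
        content = msg["content"]
        # Anthropic content may be a list of typed blocks; flatten the
        # text blocks into a single string for OpenAI.
        if isinstance(content, list):
            content = "".join(
                block["text"] for block in content if block.get("type") == "text"
            )
        messages.append({"role": msg["role"], "content": content})
    return {
        "model": payload["model"],
        "messages": messages,
        # "max_tokens" is required by Anthropic and maps directly.
        "max_tokens": payload.get("max_tokens", 1024),
    }
```

Instead of a translated request like this, the proxy forwards to /v1/responses on the backend (see the log output below).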

Steps to Reproduce

LiteLLM Configuration File

model_list:
  - model_name: Qwen3-32B
    litellm_params:
      model: openai/Qwen3-32B
      api_base: "http://ip:port/v1"
      api_key: "sk-xx"

Initialization Command
litellm --config config.yaml --detailed_debug
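A request of the following shape triggers the error. The host, port, and key are placeholders for my deployment (LiteLLM's default proxy port is 4000):

```shell
curl -X POST http://localhost:4000/v1/messages \
  -H 'Authorization: Bearer sk-xx' \
  -H 'Content-Type: application/json' \
  -d '{
        "model": "Qwen3-32B",
        "max_tokens": 100,
        "messages": [{"role": "user", "content": "Hello, can you tell me a short joke?"}]
      }'
```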

LiteLLM Version

litellm                            1.82.3
litellm-enterprise                 0.1.39
litellm-proxy-extras               0.4.69

Relevant log output

POST Request Sent from LiteLLM:
curl -X POST \
http://IP:PORT/v1/responses \
-H 'Authorization: Be****h7' \
-d '{'model': 'Qwen3-32B', 'input': [{'type': 'message', 'role': 'user', 'content': [{'type': 'input_text', 'text': 'Hello, can you tell me a short joke?'}]}], 'max_output_tokens': 100}'

litellm.exceptions.NotFoundError: litellm.NotFoundError: NotFoundError: OpenAIException - {"detail":"Not Found"}.

What part of LiteLLM is this about?

Proxy

What LiteLLM version are you on?

1.82.3

Twitter / LinkedIn details

No response
