
Calling open-interpreter from Python raises a litellm.exceptions.BadRequestError #1603

Open
@jqsl2012

Description

Describe the bug

  1. Command-line mode works: both OpenAI and non-OpenAI models succeed.
  2. Python mode fails: OpenAI models still work, but every non-OpenAI model raises litellm.exceptions.BadRequestError.

In both cases my prompt was trivial: What's 34/24?

Reproduce

  1. For both command-line and Python mode I followed the setup from Does it support Qwen series hosted model? #1572, except that I use stream=True.
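For reference, a minimal sketch of the Python-mode call described above, based on the setup from #1572. The model string, API key, and endpoint below are placeholders for illustration, not the values from the original report, and the exact attribute names assume the 0.4.x interpreter API:

```python
from interpreter import interpreter

# Placeholder configuration for a non-OpenAI (DeepSeek-hosted) model;
# litellm expects a provider-prefixed model string such as "deepseek/deepseek-chat".
interpreter.llm.model = "deepseek/deepseek-chat"
interpreter.llm.api_key = "sk-..."                    # placeholder key
interpreter.llm.api_base = "https://api.deepseek.com"  # placeholder endpoint

# Streaming request, matching the stream=True mode used in the report.
for chunk in interpreter.chat("What's 34/24?", stream=True, display=False):
    print(chunk)
```

Note the bare model name "deepseek-chat" in the error below, without the provider prefix, which may be why the backend rejects it as "not a valid model ID".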

Expected behavior

The Python-mode request should succeed, just as the command-line request does.

Screenshots

Command-line mode requesting a non-OpenAI model succeeds:


The same request from Python fails:

litellm.llms.openai.common_utils.OpenAIError: {"error":{"message":"deepseek-chat is not a valid model ID","code":400},"user_id":"user_2dzIMQc2DXdaKhwdKz81nPS4sqH"}

litellm.exceptions.BadRequestError: litellm.BadRequestError: DeepseekException - {"error":{"message":"deepseek-chat is not a valid model ID","code":400},"user_id":"user_2dzIMQc2DXdaKhwdKz81nPS4sqH"}

Open Interpreter version

0.4.3

Python version

Python 3.9.13

Operating System name and version

CentOS 7

Additional context

No response
