-
For some reason, I have to use pr-agent with a self-hosted LiteLLM proxy. I tried the following configuration, but it didn't work. Moreover, I've confirmed that my network connection is working properly.
Replies: 3 comments 1 reply
-
I dug into the source code of pr-agent and made the following configuration changes, but it still doesn't work:
-
I don't know what "litellm_proxy" is. Maybe you mean "openai proxy"? Anyway, thoroughly review this page: https://qodo-merge-docs.qodo.ai/usage-guide/changing_a_model/#custom-models. You cannot invent categories like [litellm_proxy] in the configuration; they won't be used. You can give additional parameters as env variables, and open a PR to better support them in configuration.
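Since pr-agent reads its settings via Dynaconf, TOML keys can typically also be overridden with SECTION.KEY environment variables, for example when running the Docker image. The following is a rough sketch of that approach; the proxy address, model name, and key are placeholders, and whether a given setting is honored this way depends on your pr-agent version:

```shell
# Hypothetical sketch: passing settings as env variables instead of
# inventing a new [litellm_proxy] section in the configuration file.
docker run --rm -it \
  -e OPENAI.KEY="sk-..." \
  -e OPENAI.API_BASE="http://my-litellm-proxy:4000" \
  -e CONFIG.MODEL="openai/my-model" \
  codiumai/pr-agent:latest --pr_url "$PR_URL" review
```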
-
Finally, I found out what the problem was: the custom_model_max_tokens parameter was not set in my configuration file. However, in the code flow, the exception thrown in get_max_tokens was not caught, which really confused me. Here is my final version of the configuration file, just for reference:
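The configuration file itself was not captured in this thread. As a rough sketch of what a pr-agent configuration pointing at a self-hosted LiteLLM proxy might look like, with the custom_model_max_tokens setting the thread identifies as the fix (all other section names, keys, and values below are assumptions based on pr-agent's TOML layout, not the author's actual file):

```toml
# Hypothetical sketch -- adapt names and values to your pr-agent version.
[config]
model = "openai/my-model"          # assumed model identifier behind the proxy
custom_model_max_tokens = 8192     # the missing parameter that caused the failure

[openai]
key = "sk-..."                     # placeholder: the proxy's key
api_base = "http://localhost:4000" # assumed address of the LiteLLM proxy
```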