LiteLLM integration in Prism #629
Unanswered
agilan-selvam-zoomrx asked this question in Q&A
Replies: 2 comments

Hi everyone,
We have LiteLLM as the gateway for all the LLM providers in our organization. Does Prism support LiteLLM? If yes, can someone shed light on how to configure Prism for LiteLLM?
Thanks in advance.
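LiteLLM's proxy exposes an OpenAI-compatible API, so the usual approach is to point Prism's OpenAI provider at the gateway rather than at api.openai.com. A minimal sketch of the published `config/prism.php`, assuming the provider entry takes `url` and `api_key` keys as in recent Prism releases; the gateway address below is a placeholder for wherever your LiteLLM proxy runs:

```php
<?php

// config/prism.php — point Prism's OpenAI provider at the LiteLLM proxy.
// The base URL below is a placeholder; use your gateway's address and a
// LiteLLM virtual key (or master key) in place of a real OpenAI key.
return [
    'providers' => [
        'openai' => [
            // LiteLLM serves its OpenAI-compatible API under /v1.
            'url' => env('OPENAI_URL', 'http://litellm.internal:4000/v1'),
            'api_key' => env('OPENAI_API_KEY'),
        ],
    ],
];
```

The model name you then pass to Prism has to match an alias that the LiteLLM config routes, not necessarily a real OpenAI model name.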
I am getting an "OpenAI: unknown finish reason" error when I try to call the models via LiteLLM using Prism.
This is how I have implemented it.
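Purely as a hypothetical sketch (not the snippet from the comment above), a call through the gateway would look roughly like this; class and method names assume a recent Prism release and may differ in older versions:

```php
<?php

use Prism\Prism\Prism;
use Prism\Prism\Enums\Provider;

// 'gpt-4o' is a placeholder for whatever model alias the LiteLLM
// gateway routes; Prism still talks to it through the OpenAI driver.
$response = Prism::text()
    ->using(Provider::OpenAI, 'gpt-4o')
    ->withPrompt('Say hello from behind the LiteLLM gateway.')
    ->asText();

echo $response->text;
```

A finish_reason value coming back through the gateway that the OpenAI driver does not recognize is one plausible source of the "unknown finish reason" error mentioned earlier in the thread.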