Custom OpenAI API Support for AI Foundry #207
Unanswered · DanielQ-CV asked this question in Ask Me Anything (AMA)
Add Support for Custom OpenAI-Compatible APIs in AI Foundry/Copilot Studio
Summary
Custom OpenAI-compatible endpoints (e.g., LiteLLM, OpenRouter) can't be used in AI Foundry/Copilot Studio, even though they fully follow the OpenAI API standard.
Current Gap
Model connections in AI Foundry/Copilot Studio are limited to the built-in providers; there is no option to register an arbitrary OpenAI-compatible base URL.
Request
Add the ability to:
- Configure a custom OpenAI-compatible base URL and API key
- Discover available models via the /v1/models endpoint (see the sketch after this list)
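For context, this is what that discovery step looks like against any OpenAI-compatible server today. A minimal sketch using the official openai Python package; the base URL and API key are placeholder values for a hypothetical LiteLLM/OpenRouter-style endpoint.

```python
from openai import OpenAI

# Point the standard OpenAI client at a custom OpenAI-compatible server.
# Base URL and key are placeholders; any compliant endpoint works the same way.
client = OpenAI(
    base_url="http://localhost:4000/v1",  # hypothetical LiteLLM proxy address
    api_key="sk-placeholder",             # virtual key issued by the proxy
)

# GET /v1/models -- the discovery endpoint the request above refers to.
for model in client.models.list():
    print(model.id)
```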
Use Case
We use a LiteLLM Proxy to access multiple LLM providers through a single unified interface, and we want to use those models in Copilot Studio without workaround code.
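To illustrate the workaround-free flow being asked for, here is a sketch of the call Copilot Studio would effectively need to make: a standard chat completion routed through a LiteLLM proxy. The model alias, URL, and key below are assumptions, not real values.

```python
from openai import OpenAI

# Same client as above: only base_url and api_key differ from a stock
# OpenAI setup, which is why native support should be straightforward.
client = OpenAI(
    base_url="http://localhost:4000/v1",  # hypothetical proxy address
    api_key="sk-placeholder",
)

# LiteLLM forwards this to whichever provider backs the requested model.
response = client.chat.completions.create(
    model="gpt-4o",  # assumed model alias configured on the proxy
    messages=[{"role": "user", "content": "Say hello from a custom endpoint."}],
)
print(response.choices[0].message.content)
```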
Is this possible or planned?