Replies: 4 comments 1 reply
-
@lenzenc Currently, promptflow only supports models deployed to AML through the model catalog; local models are not supported. + @prakharg-msft
-
Any update on whether Promptflow has support for locally deployed models?
-
@lenzenc I think you can use an API proxy system such as one-api or new-api; that is what I use now.
-
For anyone else stumbling across this topic: you could use a simple proxy script, then create a Custom Connection in Prompt Flow and set the Base URL to the URL of the proxy server (e.g. localhost:8080). A sketch of such a proxy is shown below.
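This is just a minimal sketch, not an official example. It assumes Ollama is running locally and exposing its OpenAI-compatible API on port 11434, and it listens on port 8080 so it matches the Base URL mentioned above; the FastAPI/httpx stack and the single forwarded endpoint are only one possible way to do it.

```python
# proxy.py - minimal proxy sketch (assumption: Ollama on localhost:11434
# with its OpenAI-compatible /v1 API enabled; port 8080 is an example).
from fastapi import FastAPI, Request
from fastapi.responses import JSONResponse
import httpx

app = FastAPI()
OLLAMA_BASE = "http://localhost:11434/v1"  # local model server (assumption)


@app.post("/v1/chat/completions")
async def chat_completions(request: Request):
    # Forward the incoming OpenAI-style request body to the local model server
    # and return its response unchanged.
    body = await request.json()
    async with httpx.AsyncClient(timeout=120) as client:
        resp = await client.post(f"{OLLAMA_BASE}/chat/completions", json=body)
    return JSONResponse(content=resp.json(), status_code=resp.status_code)


# Run with: uvicorn proxy:app --port 8080
```

With something like this running, the Custom Connection's Base URL would point at http://localhost:8080 and the rest of the flow configuration stays unchanged.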
-
Is there any support for or example of custom code that can be used to call local models via Ollama?
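Not an official answer, but one possible approach is a custom Python tool that calls Ollama's REST API directly. The sketch below assumes Ollama is serving on localhost:11434, that the model (here llama3) has already been pulled, and that the @tool import path matches your promptflow version.

```python
# Minimal sketch: a Prompt Flow Python tool calling a local Ollama server.
# Assumptions: Ollama runs on localhost:11434, the "llama3" model is pulled,
# and `from promptflow import tool` is valid for your promptflow version.
import requests
from promptflow import tool


@tool
def call_ollama(prompt: str, model: str = "llama3") -> str:
    # Ollama's /api/generate endpoint; stream=False returns one JSON object.
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]
```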