In some corporate environments, there is no direct access to remote APIs.

One option is to provide support for proxies (see #518), e.g. by letting `llm` specify an alternate endpoint. When adding a new model via a plugin, this would mean asking for the API key AND an optional endpoint.

Another option is to create a new plugin for each model (or family of models) that requires proxy access, with the plugin's Python code pointing to the proxy instead of the default endpoint.
@simonw
Any recommendation about the best way to proceed?