
Best practice for remote APIs that require a proxy #519

Open
sahuguet opened this issue Jun 21, 2024 · 3 comments

Comments

@sahuguet

In some corporate environments, there is no direct access to remote APIs.

One option is to provide support for proxies (see #518), e.g. by letting llm specify an alternate endpoint. When adding a new model via a plugin, this would mean asking for the API key AND an optional endpoint.
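
A minimal sketch of what "API key AND an optional endpoint" could look like, using the `openai` client that llm's OpenAI models build on; the proxy URL and model name are placeholders, not anything llm currently supports:

```python
# Sketch: the same client call as usual, but with base_url pointed
# at a hypothetical corporate proxy instead of the default endpoint.
from openai import OpenAI

client = OpenAI(
    api_key="sk-...",                             # the usual API key
    base_url="https://llm-proxy.example.com/v1",  # hypothetical alternate endpoint
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```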

Another option is to create a new plugin for each model (or family of models) that requires proxy access; the plugin's Python code would point to the proxy instead of the default endpoint.
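
A rough skeleton of such a plugin, assuming the hooks from llm's plugin tutorial (`register_models`, `llm.Model.execute`, `needs_key`/`get_key`); the model id and proxy URL are made up, and details may differ between llm versions:

```python
import llm
from openai import OpenAI

# Hypothetical proxy endpoint; substitute whatever the corporate proxy exposes.
PROXY_BASE_URL = "https://llm-proxy.internal.example.com/v1"

@llm.hookimpl
def register_models(register):
    register(ProxiedGPT())

class ProxiedGPT(llm.Model):
    model_id = "proxied-gpt-4o-mini"  # hypothetical model id
    needs_key = "openai"
    key_env_var = "OPENAI_API_KEY"
    can_stream = False

    def execute(self, prompt, stream, response, conversation):
        # Same OpenAI-compatible call, but routed through the proxy.
        client = OpenAI(api_key=self.get_key(), base_url=PROXY_BASE_URL)
        completion = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[{"role": "user", "content": prompt.prompt}],
        )
        yield completion.choices[0].message.content
```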

@simonw
Any recommendation about the best way to proceed?

@flexchar

Could you run something like LiteLLM alongside, do all the complex configuration there, and use it with LLM?
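
For context: LiteLLM's proxy speaks the OpenAI API, so anything that accepts an OpenAI-compatible base URL can be routed through it. A sketch, assuming the proxy is already running locally (4000 is, I believe, LiteLLM's default port):

```python
# Sketch: route requests through a locally running LiteLLM proxy.
# Assumes something like `litellm --model gpt-4o-mini` (or a config
# file) is already serving on localhost:4000.
from openai import OpenAI

client = OpenAI(
    api_key="sk-anything",             # LiteLLM can enforce its own virtual keys
    base_url="http://localhost:4000",  # the proxy, not api.openai.com
)

reply = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "ping"}],
)
print(reply.choices[0].message.content)
```

If I recall correctly, llm can also register OpenAI-compatible endpoints like this via extra-openai-models.yaml with an api_base setting, which would avoid writing any plugin code at all.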

@AlexanderYastrebov

See also related #382

@AlexanderYastrebov

> Could you run something like LiteLLM alongside, do all the complex configuration there, and use it with LLM?

Yes, see #518 (comment)
