
Support for proxies when talking to remote APIs by overriding the endpoint #518

Open
sahuguet opened this issue Jun 20, 2024 · 2 comments

@sahuguet

Most serious companies block direct access to LLM APIs (OpenAI, Anthropic, etc.) and mandate the use of an internal proxy.

Registering a new model by specifying only the key is not enough:
you should be able to provide both the API key AND the endpoint to be used.

Libraries like curl or requests are good examples of proxy support.
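As a sketch of the kind of proxy support being requested, Python's standard library can already route traffic through an explicit proxy (the proxy URL below is a hypothetical placeholder):

```python
import urllib.request

# Hypothetical internal proxy -- replace with your company's proxy URL.
proxy = urllib.request.ProxyHandler({
    "http": "http://proxy.internal.example.com:3128",
    "https": "http://proxy.internal.example.com:3128",
})
opener = urllib.request.build_opener(proxy)

# Any request made via this opener is routed through the proxy, e.g.:
# opener.open("https://api.openai.com/v1/models")
```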

@sahuguet
Author

Here are the ways the three main model providers support alternate endpoints.

OpenAI

For OpenAI, the AzureOpenAI client offers a parameter to select the endpoint.

import os
from openai import AzureOpenAI
    
client = AzureOpenAI(
    api_key=os.getenv("AZURE_OPENAI_API_KEY"),  
    api_version="2023-12-01-preview",
    azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT")
)

Anthropic

For Anthropic, you can use the base_url parameter.

import httpx
from anthropic import Anthropic, DefaultHttpxClient

client = Anthropic(
    # Or use the `ANTHROPIC_BASE_URL` env var
    base_url="http://my.test.server.example.com:8083",
    http_client=DefaultHttpxClient(
        proxies="http://my.test.proxy.example.com",
        transport=httpx.HTTPTransport(local_address="0.0.0.0"),
    ),
)

Source: https://github.com/anthropics/anthropic-sdk-python?tab=readme-ov-file#configuring-the-http-client

Gemini

For Gemini, this is a bit more complicated.

from google.api_core.client_options import ClientOptions
import google.generativeai as genai
import PIL.Image
import os

def get_client_cert():
    # Code to load the client certificate and private key.
    return client_cert_bytes, client_private_key_bytes

options = ClientOptions(api_endpoint="foo.googleapis.com",
                        client_cert_source=get_client_cert)

genai.configure(api_key=os.environ["GOOGLE_API_KEY"], client_options=options)
img = PIL.Image.open('path/to/image.png')

model = genai.GenerativeModel(model_name="gemini-1.5-flash")
response = model.generate_content(["What is in this photo?", img])
print(response.text)

Sources:
https://ai.google.dev/
https://googleapis.dev/python/google-api-core/latest/client_options.html
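Alternatively, since the HTTP stacks these SDKs use (httpx, requests) honor the standard proxy environment variables by default, an environment-level override may be enough in some setups. The proxy URL here is a placeholder:

```shell
# Hypothetical internal proxy; both variables point at it.
export HTTP_PROXY="http://proxy.internal.example.com:3128"
export HTTPS_PROXY="http://proxy.internal.example.com:3128"
```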

@AlexanderYastrebov

See https://llm.datasette.io/en/stable/other-models.html#openai-compatible-models

llm works with LiteLLM + extra-openai-models.yaml config:

- model_id: foobar
  model_name: bazqux
  api_base: https://mylitellm.example.org/v1/
  api_key_name: mykey

If your environment requires a tool to obtain the API key, you can use:

llm keys set mykey --value=$(get_token)

or define an alias:

alias llm='llm --key=$(get_token)'
