Closed
Description
We have an OpenAI proxy deployed within our organization. It exposes the same API as OpenAI, but at a different endpoint. Currently, EmbedderConfig has no option to provide a custom endpoint for OpenAI.
Current implementation
embedder_config=EmbedderConfig(embedding_provider='openai', embedding_api_key=xxxxxxxxxxxxx, embedding_model_name='custom_model_name')
which then creates the following client
@requires_dependencies(["openai"], extras="openai")
def get_client(self) -> "OpenAI":
    from openai import OpenAI

    return OpenAI(api_key=self.api_key.get_secret_value())
Suggested improvement
Add the ability to construct the OpenAI client with a custom base_url.
embedder_config=EmbedderConfig(embedding_provider='openai', embedding_api_key=xxxxxxxxxxxxx, embedding_model_name='custom_model_name', base_url='llmproxy.organization.com')
which then creates the following client
@requires_dependencies(["openai"], extras="openai")
def get_client(self) -> "OpenAI":
    from openai import OpenAI

    return OpenAI(api_key=self.api_key.get_secret_value(), base_url=self.base_url)
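To keep the default OpenAI endpoint working for existing users, base_url would need to be optional and only forwarded when set. A minimal sketch of that pattern, using a simplified stand-in config rather than the actual EmbedderConfig class (field names and the client_kwargs helper are hypothetical):

```python
# Sketch only: a simplified stand-in for EmbedderConfig showing an optional
# base_url that is forwarded to the client only when the user provides one.
from dataclasses import dataclass
from typing import Optional


@dataclass
class EmbedderConfigSketch:
    embedding_api_key: str
    embedding_model_name: str
    base_url: Optional[str] = None  # e.g. "https://llmproxy.organization.com/v1"

    def client_kwargs(self) -> dict:
        # Only include base_url when set, so OpenAI's default
        # endpoint remains the fallback behavior.
        kwargs = {"api_key": self.embedding_api_key}
        if self.base_url is not None:
            kwargs["base_url"] = self.base_url
        return kwargs


default_cfg = EmbedderConfigSketch("sk-xxx", "custom_model_name")
proxy_cfg = EmbedderConfigSketch(
    "sk-xxx", "custom_model_name",
    base_url="https://llmproxy.organization.com/v1",
)
print(default_cfg.client_kwargs())
print(proxy_cfg.client_kwargs())
```

The resulting kwargs dict would then be splatted into OpenAI(**kwargs), so configs without a base_url behave exactly as today.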
Activity
dutchfarao commented on Mar 2, 2025
#406