
Support for proxies when talking to remote APIs by overriding the endpoint #518

Open
@sahuguet

Description


Most serious companies block direct access to LLM APIs (OpenAI, Anthropic, etc.) and mandate the use of an internal proxy.

Registering a new model by just specifying the API key is not enough.
You should be able to provide both the API key AND the endpoint to be used.

Libraries like curl or requests are good examples in terms of proxy support.
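
For illustration, here is a minimal sketch of the kind of configuration this request implies, built on the requests library mentioned above. The `RemoteModel` wrapper, its parameters, and the internal proxy/endpoint URLs are all hypothetical, not part of any existing API:

```python
import os
import requests

class RemoteModel:
    """Hypothetical wrapper: talks to an OpenAI-compatible API through
    a configurable endpoint and, optionally, a corporate HTTP proxy."""

    def __init__(self, api_key, base_url, proxies=None):
        self.api_key = api_key
        self.base_url = base_url.rstrip("/")
        # requests accepts a dict mapping scheme -> proxy URL
        self.proxies = proxies or {}

    def complete(self, prompt):
        response = requests.post(
            f"{self.base_url}/v1/chat/completions",
            headers={"Authorization": f"Bearer {self.api_key}"},
            json={
                "model": "gpt-4o-mini",
                "messages": [{"role": "user", "content": prompt}],
            },
            proxies=self.proxies,
            timeout=30,
        )
        response.raise_for_status()
        return response.json()["choices"][0]["message"]["content"]

# Registration needs both the key and the endpoint, plus an
# optional proxy mandated by the company (URLs are placeholders):
model = RemoteModel(
    api_key=os.environ["OPENAI_API_KEY"],
    base_url="https://llm-gateway.internal.example.com",
    proxies={"https": "http://proxy.internal.example.com:8080"},
)
```

The point is that both the endpoint (`base_url`) and the proxy settings are first-class configuration, exactly as curl (via `--proxy` / `HTTPS_PROXY`) and requests (via the `proxies` argument) already allow.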
