
Conversation

@xsharov xsharov commented Apr 10, 2025

This pull request introduces a proxy server that emulates the Ollama API while forwarding requests to OpenRouter. It is aimed at users who prefer not to run LLMs locally: by running the proxy, they can use remote models through the familiar Ollama-compatible API.
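
Conceptually, the translation looks like the minimal sketch below (simplified, not the exact code in this PR): a single-endpoint server that accepts Ollama's non-streaming `/api/chat` request shape and forwards it to OpenRouter's OpenAI-compatible `/api/v1/chat/completions` endpoint. The choice of port, the `OPENROUTER_API_KEY` environment variable, and the omission of streaming and the other Ollama endpoints are all assumptions made for brevity.

```python
# Minimal sketch of an Ollama -> OpenRouter proxy (stdlib only, non-streaming).
# Endpoint shapes follow the public Ollama and OpenRouter API docs.
import json
import os
import urllib.request
from datetime import datetime, timezone
from http.server import BaseHTTPRequestHandler, HTTPServer

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"
API_KEY = os.environ["OPENROUTER_API_KEY"]  # assumed: set by the user


class OllamaProxy(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/api/chat":
            self.send_error(404)
            return
        body = json.loads(self.rfile.read(int(self.headers["Content-Length"])))

        # Ollama's chat request is already close to the OpenAI shape:
        # {"model": ..., "messages": [...]} maps straight across.
        payload = json.dumps({
            "model": body["model"],
            "messages": body["messages"],
            "stream": False,  # streaming left out of this sketch
        }).encode()
        req = urllib.request.Request(
            OPENROUTER_URL,
            data=payload,
            headers={
                "Content-Type": "application/json",
                "Authorization": f"Bearer {API_KEY}",
            },
        )
        with urllib.request.urlopen(req) as resp:
            upstream = json.load(resp)

        # Re-wrap the OpenAI-style response as an Ollama non-streaming reply.
        reply = json.dumps({
            "model": body["model"],
            "created_at": datetime.now(timezone.utc).isoformat(),
            "message": upstream["choices"][0]["message"],
            "done": True,
        }).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(reply)))
        self.end_headers()
        self.wfile.write(reply)


if __name__ == "__main__":
    # Ollama clients default to http://localhost:11434.
    HTTPServer(("127.0.0.1", 11434), OllamaProxy).serve_forever()
```

With such a proxy running, an unmodified Ollama client pointed at `http://localhost:11434` would transparently hit OpenRouter, e.g. `curl http://localhost:11434/api/chat -d '{"model": "openai/gpt-4o", "messages": [{"role": "user", "content": "hi"}], "stream": false}'` (the model name here is just an example OpenRouter identifier).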

@olegshulyakov

@xsharov If it's needed, there is already a proxy that converts the OpenAI API to the Ollama one: https://github.com/talyguryn/openai-api-server-for-enchanted-app
