Feature Request: Add Ollama API Support
Description
Currently the project supports the OpenAI API for LLM interactions. I'd like to request adding support for the Ollama API as an alternative, local LLM option. This would give users more flexibility in choosing their preferred language model backend.
Expected Behavior
- Add a configuration option for the Ollama API endpoint (similar to the existing OpenAI config; a rough config sketch follows this list)
- Implement equivalent API calls against the Ollama backend
- Maintain backward compatibility with the existing OpenAI implementation
- Add documentation for Ollama setup/usage
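A minimal sketch of what such a configuration could look like, in Python. All field names here (llm_backend, openai_api_key, ollama_base_url, ollama_model) are illustrative assumptions, not names from the existing codebase; 11434 is Ollama's default local port.

```python
from dataclasses import dataclass

@dataclass
class LLMConfig:
    # Hypothetical fields -- not taken from the current code.
    # "openai" keeps today's behaviour; "ollama" would select the new backend.
    llm_backend: str = "openai"

    # Existing OpenAI settings
    openai_api_key: str = ""
    openai_model: str = "gpt-4o-mini"

    # New Ollama settings; 11434 is Ollama's default port, no API key needed locally
    ollama_base_url: str = "http://localhost:11434"
    ollama_model: str = "llama3"
```

Keeping the OpenAI fields untouched and adding the Ollama ones behind a backend switch is one way to preserve backward compatibility while exposing the new option.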
Why is this feature important?
- Allows users to run local/self-hosted models via Ollama
- Provides an open-source alternative to proprietary APIs
- Could reduce API costs for some use cases
- Expands the project's compatibility with different LLM options
Additional Context
The existing OpenAI implementation can serve as a reference. Ollama's REST API is similar in concept but differs in a few areas (a rough comparison sketch follows this list):
- Authentication (if any)
- Request/response formats
- Model naming conventions
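To make those differences concrete, here is a non-authoritative sketch of the two chat calls side by side (Python, using requests). It assumes the standard endpoints: OpenAI's /v1/chat/completions with a Bearer token, and Ollama's /api/chat on the default http://localhost:11434 with no authentication; the function names and parameters are illustrative only.

```python
import requests

def chat_openai(api_key: str, model: str, messages: list) -> str:
    # OpenAI: Bearer-token auth, reply lives under choices[0].message.content
    resp = requests.post(
        "https://api.openai.com/v1/chat/completions",
        headers={"Authorization": f"Bearer {api_key}"},
        json={"model": model, "messages": messages},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

def chat_ollama(base_url: str, model: str, messages: list) -> str:
    # Ollama: no auth header by default, "stream": False returns one JSON object,
    # and the reply lives under message.content
    resp = requests.post(
        f"{base_url}/api/chat",
        json={"model": model, "messages": messages, "stream": False},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["message"]["content"]

# Example (assumes a local Ollama server with the "llama3" model pulled):
# print(chat_ollama("http://localhost:11434", "llama3",
#                   [{"role": "user", "content": "Hello"}]))
```

Model naming also differs: Ollama uses local tags such as llama3 or llama3:8b, while OpenAI expects its hosted model IDs, so the model setting would likely need to be backend-specific.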