Feature Request: Add Ollama API Support #58

@zhouhaoGG

Description

Currently the project supports only the OpenAI API for model interactions. I'd like to request adding support for the Ollama API as an alternative/local LLM option. This would give users more flexibility in choosing their preferred language model backend.

Expected Behavior

  1. Add configuration option for Ollama API endpoint (similar to existing OpenAI config)
  2. Implement equivalent API calls to Ollama
  3. Maintain backward compatibility with existing OpenAI implementation
  4. Add documentation for Ollama setup/usage
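Points 1 and 3 above could be sketched as a backend-switch in the configuration. This is only an illustrative shape, not the project's actual settings object; all field names here are hypothetical, and `http://localhost:11434` is Ollama's default local port.

```python
from dataclasses import dataclass

@dataclass
class LLMConfig:
    # Hypothetical config -- field names are illustrative, not the
    # project's real settings. "openai" stays the default so existing
    # setups keep working unchanged (backward compatibility).
    backend: str = "openai"  # "openai" or "ollama"
    openai_base_url: str = "https://api.openai.com/v1"
    ollama_base_url: str = "http://localhost:11434"  # Ollama's default port
    model: str = "gpt-4o-mini"

    def base_url(self) -> str:
        # Route requests to the selected backend; unknown values fail fast.
        if self.backend == "openai":
            return self.openai_base_url
        if self.backend == "ollama":
            return self.ollama_base_url
        raise ValueError(f"unknown backend: {self.backend}")
```

Usage would be as simple as `LLMConfig(backend="ollama", model="llama3")`, with everything else defaulting to the current OpenAI behavior.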

Why is this feature important?

  • Allows users to run local/self-hosted models via Ollama
  • Provides an open-source alternative to proprietary APIs
  • Could reduce API costs for some use cases
  • Expands the project's compatibility with different LLM options

Additional Context

The existing OpenAI implementation can serve as a reference. Ollama's REST API is quite similar in concept but differs in:

  • Authentication (if any)
  • Request/response formats
  • Model naming conventions
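To make the differences concrete, here is a rough sketch assuming Ollama's `/api/chat` endpoint with streaming disabled. The request reuses the OpenAI-style `messages` list, but a local Ollama server needs no `Authorization` header, and the reply content lives at a different path in the response JSON. The helper names are hypothetical, and the shapes should be checked against the project's actual client code.

```python
def build_ollama_chat_request(messages, model="llama3"):
    # Ollama accepts an OpenAI-style messages list, but streams by
    # default, so a non-streaming client must disable it explicitly.
    # No API key / Authorization header is needed for a local server.
    return {"model": model, "messages": messages, "stream": False}

def extract_reply(response_json, backend):
    # Response shapes differ:
    #   OpenAI: {"choices": [{"message": {"content": ...}}]}
    #   Ollama: {"message": {"role": "assistant", "content": ...}}
    if backend == "openai":
        return response_json["choices"][0]["message"]["content"]
    return response_json["message"]["content"]
```

Model naming also differs: Ollama uses local tags like `llama3` or `mistral:7b` rather than OpenAI's hosted model IDs, so the configured model name must be passed through untranslated.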

Metadata


Labels

enhancement (New feature or request)
