@Dreichi Dreichi commented Sep 13, 2025

Add Ollama Local LLM Support

🚀 Features

  • Local LLM Integration: Added comprehensive Ollama support for local AI inference
  • Optional API Key: Ollama can run without an API key for local deployments
  • Unified JSON Forcing: Created llmClient.ts for consistent JSON formatting across providers
  • Multi-Provider Support: Seamless switching between Gemini, OpenAI, and Ollama

🔧 Technical Implementation

  • New Services:
    • OllamaService: Full-featured service with all AI operations
    • llmClient.ts: Unified client with automatic JSON forcing (format: "json" for Ollama)
    • unifiedAIService.ts: Factory pattern for provider abstraction
  • UI Enhancement: Updated ConfigDialog with Ollama configuration options
  • Type Safety: Extended config store to support 'ollama' provider type
  • Smart Validation: Optional API key validation (required for Gemini/OpenAI, optional for Ollama)
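The "unified JSON forcing" idea above can be sketched as follows. This is an illustrative sketch, not the PR's actual `llmClient.ts` code; the function name `buildJsonOptions` and the `Provider` type are assumptions made for this example. What is real is that each provider spells "respond in JSON" differently, and a unified client maps one intent onto the provider-specific request option:

```typescript
// Assumed names for illustration; not the PR's actual exports.
type Provider = "gemini" | "openai" | "ollama";

// Translate a single "force JSON" intent into each provider's request option.
function buildJsonOptions(provider: Provider): Record<string, unknown> {
  switch (provider) {
    case "ollama":
      // Ollama's /api/generate and /api/chat accept a top-level format field.
      return { format: "json" };
    case "openai":
      // OpenAI chat completions accept response_format.
      return { response_format: { type: "json_object" } };
    case "gemini":
      // Gemini accepts responseMimeType inside generationConfig.
      return { generationConfig: { responseMimeType: "application/json" } };
  }
}
```

Centralizing this mapping in one module means callers never branch on the provider when they need structured output.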

🌍 Internationalization

  • Added translation keys for Ollama-specific UI elements
  • Support for both English and Chinese locales
  • Consistent labeling across all providers

🎉 Benefits for Users

  • 💰 Cost Reduction: Run AI locally without API costs
  • 🔒 Privacy: No data sent to external services
  • ⚡ Performance: Direct local inference without network latency
  • 🔧 Flexibility: Support for any Ollama-compatible model (llama2, mistral, etc.)
  • 🌐 Accessibility: Works offline once models are downloaded

🔄 Backward Compatibility

  • All existing Gemini and OpenAI integrations remain unchanged
  • No breaking changes to existing user configurations
  • Seamless migration path for current users

- Add OllamaService for local AI inference with llmClient integration
- Create unified llmClient.ts with automatic JSON forcing (format: 'json')
- Update ConfigDialog with Ollama provider option and optional API key
- Extend config store to support 'ollama' provider type
- Add unifiedAIService factory for seamless provider switching
- Update App.tsx to handle optional API key validation for Ollama
- Add translation keys for Ollama-specific UI elements (EN/ZH)
- Maintain backward compatibility with existing Gemini/OpenAI providers
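The factory pattern and optional API key validation described above can be sketched roughly as follows. The interface and function names (`AIService`, `validateConfig`, `createAIService`) are assumptions made for this example, not the PR's actual `unifiedAIService.ts` exports:

```typescript
// Illustrative sketch; names are assumptions, not the PR's actual code.
type Provider = "gemini" | "openai" | "ollama";

interface AIService {
  complete(prompt: string): Promise<string>;
}

interface ProviderConfig {
  provider: Provider;
  apiKey?: string; // optional: only local Ollama may omit it
}

// Hosted providers (Gemini/OpenAI) require a key; local Ollama does not.
function validateConfig(cfg: ProviderConfig): string | null {
  if (cfg.provider !== "ollama" && !cfg.apiKey) {
    return `API key is required for ${cfg.provider}`;
  }
  return null;
}

// Factory: one entry point, provider-specific construction behind it.
function createAIService(cfg: ProviderConfig): AIService {
  const error = validateConfig(cfg);
  if (error) throw new Error(error);
  return {
    // A real implementation would dispatch to OllamaService,
    // geminiService, etc.; stubbed here for illustration.
    complete: async (prompt) => `[${cfg.provider}] ${prompt}`,
  };
}
```

With this shape, existing Gemini/OpenAI call sites are untouched: the factory only relaxes validation for the new `'ollama'` branch.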

Closes #[issue-number]
@SSShooter (Owner) left a comment:

Most of the code in src/services/ollamaService.ts duplicates src/services/geminiService.ts. It should only be necessary to extend the functionality of geminiService.
