- Add services/ollamaAdapter.ts implementing AIAdapter using the Ollama REST API
- Add provider selection via AI_PROVIDER env var (gemini/ollama)
- Add OLLAMA_BASE_URL and OLLAMA_MODEL env vars to vite.config.ts
- Update components to use the unified aiService export

Co-authored-by: avila2026 <265126500+avila2026@users.noreply.github.com>
Agent-Logs-Url: https://github.com/avila2026/site-MATERNIDADE/sessions/52d09ff9-5722-43af-ad0c-ef42a0274f89
Copilot AI changed the title from "[WIP] Add Ollama installation for API usage" to "feat: add Ollama as alternative AI provider" on Mar 20, 2026.
Integrates Ollama's local LLM API as a swappable backend alongside the existing Gemini provider. Zero new dependencies: uses the native `fetch` API against Ollama's REST API.

Changes

- `services/ollamaAdapter.ts`: new adapter implementing `AIAdapter` via `POST /api/chat`, with the same caching, system-instruction, and error-handling patterns as the Gemini adapter. Image generation returns `null` gracefully (unsupported by Ollama).
- `services/aiAdapter.ts`: provider selection at build time via the `AI_PROVIDER` env var. A new unified `aiService` export replaces direct `GeminiAdapter` usage.
- `vite.config.ts`: defines the `AI_PROVIDER`, `OLLAMA_BASE_URL`, and `OLLAMA_MODEL` build-time env vars with sensible defaults.
- `components/Assistant.tsx`, `components/ImageGenerator.tsx`: import `aiService` instead of `GeminiAdapter`.

Usage
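The adapter described above can be sketched roughly as follows. This is a minimal sketch, not the PR's actual code: the shape of the `AIAdapter` interface, the default base URL and model, and the `buildChatBody` helper are all assumptions for illustration (only the file name, `POST /api/chat`, and the graceful `null` for images come from the PR description).

```typescript
// Sketch of services/ollamaAdapter.ts (interface shape and defaults are assumed).
interface AIAdapter {
  chat(prompt: string): Promise<string>;
  generateImage(prompt: string): Promise<string | null>;
}

// In the real PR these come from build-time env vars (OLLAMA_BASE_URL, OLLAMA_MODEL);
// hardcoded here to keep the sketch self-contained.
const OLLAMA_BASE_URL = "http://localhost:11434";
const OLLAMA_MODEL = "llama3";

// Pure helper (hypothetical name): builds the POST /api/chat payload,
// carrying the system instruction the same way the Gemini adapter does.
export function buildChatBody(prompt: string, system: string) {
  return {
    model: OLLAMA_MODEL,
    stream: false,
    messages: [
      { role: "system", content: system },
      { role: "user", content: prompt },
    ],
  };
}

export class OllamaAdapter implements AIAdapter {
  async chat(prompt: string): Promise<string> {
    const res = await fetch(`${OLLAMA_BASE_URL}/api/chat`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(buildChatBody(prompt, "You are a helpful assistant.")),
    });
    if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
    const data = await res.json();
    return data.message.content; // Ollama's non-streaming chat response shape
  }

  // Ollama's chat API does not generate images; degrade gracefully as the PR states.
  async generateImage(_prompt: string): Promise<string | null> {
    return null;
  }
}
```

Keeping the request-body construction in a pure helper makes the adapter testable without a running Ollama server.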
Default behavior is unchanged: without `AI_PROVIDER=ollama`, Gemini is used as before.
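The build-time switch could look something like the sketch below. This is an assumption-laden illustration, not the PR's code: the `selectProvider` helper is hypothetical, and the `define` fragment shown in comments only illustrates one common way Vite exposes env vars as compile-time constants.

```typescript
// Sketch of the provider switch in services/aiAdapter.ts (helper name is hypothetical).
// vite.config.ts might expose the env vars via `define`, e.g.:
//
//   define: {
//     "process.env.AI_PROVIDER": JSON.stringify(env.AI_PROVIDER ?? "gemini"),
//     "process.env.OLLAMA_BASE_URL": JSON.stringify(env.OLLAMA_BASE_URL ?? "http://localhost:11434"),
//     "process.env.OLLAMA_MODEL": JSON.stringify(env.OLLAMA_MODEL ?? "llama3"),
//   }

// Resolve the provider once; anything other than "ollama" falls back to Gemini,
// which preserves the PR's "default behavior unchanged" guarantee.
export function selectProvider(provider: string | undefined): "gemini" | "ollama" {
  return provider === "ollama" ? "ollama" : "gemini";
}

// The unified export would then be chosen at module load, e.g.:
// export const aiService: AIAdapter =
//   selectProvider(process.env.AI_PROVIDER) === "ollama"
//     ? new OllamaAdapter()
//     : new GeminiAdapter();
```

Components import `aiService` and never see which backend is active, so swapping providers requires no component changes.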