Pull Request Title: Add Ollama Service
Description
Adds a Tailscale sidecar configuration for Ollama, a tool for running large language models (LLMs) locally. This lets users access their local models securely from any device on their Tailnet — including phones and remote machines — without exposing the Ollama API to the public internet.
Includes:
- `compose.yaml` following the ScaleTail sidecar pattern (`network_mode: service:tailscale`, health checks, `depends_on` with health condition)
- `.env` template with `SERVICE`, `IMAGE_URL`, `TS_AUTHKEY`, and an optional `OLLAMA_API_KEY`
- `README.md` covering prerequisites, volumes, MagicDNS/HTTPS setup, optional LAN port exposure, and first-run model pull instructions
- `yourNetwork` external network on the Tailscale container, enabling other containers (e.g. Open WebUI) to reach Ollama via inter-container networking

Related Issues
N/A
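For reviewers, here is a minimal sketch of the sidecar pattern described in the Includes list. This is illustrative, not the exact contents of the PR: image tags, volume paths, and healthcheck timings are assumptions.

```yaml
services:
  tailscale:
    image: tailscale/tailscale:latest   # sidecar that owns the network identity
    hostname: ${SERVICE}                # becomes the MagicDNS name on the Tailnet
    environment:
      - TS_AUTHKEY=${TS_AUTHKEY}
      - TS_STATE_DIR=/var/lib/tailscale
    volumes:
      - ./ts/state:/var/lib/tailscale   # persist node state across restarts
    healthcheck:
      test: ["CMD", "tailscale", "status"]
      interval: 10s
      retries: 5
    networks:
      - yourNetwork                     # optional: inter-container access

  ollama:
    image: ${IMAGE_URL}
    network_mode: service:tailscale     # share the sidecar's network namespace
    volumes:
      - ./ollama-data:/root/.ollama     # model storage
    depends_on:
      tailscale:
        condition: service_healthy      # wait until the sidecar is on the Tailnet

networks:
  yourNetwork:
    external: true
```

Because Ollama shares the sidecar's network namespace, its port 11434 is reachable only via the Tailscale node (and, optionally, the external Docker network), never the public internet.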
Type of Change
How Has This Been Tested?
- Ran `docker compose config`: no errors or missing variable warnings.
- Created the required directories (`config`, `ts/state`, `ollama-data`) and started the stack with `docker compose up -d`.
- Confirmed `healthy` status via `docker compose ps`.
- Verified Tailscale connectivity with `docker exec tailscale-ollama tailscale status`.
- Pulled `tinyllama` and sent a test generation request via `curl` to the Tailnet IP on port 11434; received a valid JSON response.
- Repeated the `curl` test from a second device on the same Tailnet to confirm remote access works end-to-end.
- Confirmed `tailscale-ollama` appears in `docker network inspect yourNetwork`.

Checklist
- [x] Documentation updated (`README.md`)

Screenshots (if applicable)
N/A — no visual UI changes.
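The testing steps above can be reproduced with commands along these lines. The container name `tailscale-ollama`, the model `tinyllama`, and port 11434 come from this PR; the Ollama service container name and the Tailnet IP (`100.x.y.z`) are placeholders you should substitute from your own setup.

```shell
# Validate interpolation of .env into the compose file before starting anything
docker compose config

# Start the stack and check container health
docker compose up -d
docker compose ps

# Confirm the sidecar has joined the Tailnet
docker exec tailscale-ollama tailscale status

# Pull a small model over the API, then send a test generation request
# (replace 100.x.y.z with the Tailnet IP shown by `tailscale status`)
curl http://100.x.y.z:11434/api/pull -d '{"name": "tinyllama"}'
curl http://100.x.y.z:11434/api/generate \
  -d '{"model": "tinyllama", "prompt": "Hello", "stream": false}'
```

Running the last `curl` from a second Tailnet device is the end-to-end check: it proves the API is reachable remotely without any public port exposure.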
Additional Notes
- The `yourNetwork` network is optional. Users who don't need inter-container communication can remove the `networks:` sections from `compose.yaml` entirely.
- `OLLAMA_KEEP_ALIVE` is set to `24h` by default to keep models warm; users can adjust or remove this to suit their hardware.
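For context, the `.env` template might look roughly like this. The variable names match those listed in the Includes section; every value shown is a placeholder, not the template's actual defaults.

```shell
# .env: all values are placeholders
SERVICE=ollama                        # service name, used as the Tailnet hostname
IMAGE_URL=ollama/ollama:latest        # Ollama image to run
TS_AUTHKEY=tskey-auth-xxxxxxxxxxxx    # Tailscale auth key from the admin console
OLLAMA_API_KEY=                       # optional; leave empty to disable API-key auth
OLLAMA_KEEP_ALIVE=24h                 # keep models loaded; lower this to free RAM sooner
```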