
Adds Ollama service #250

Open

RychidM wants to merge 1 commit into tailscale-dev:main from RychidM:add-ollama-service

Conversation


@RychidM RychidM commented Apr 7, 2026

Pull Request Title: Add Ollama Service

Description

Adds a Tailscale sidecar configuration for Ollama, a tool for running large language models (LLMs) locally. This lets users access their local models securely from any device on their Tailnet — including phones and remote machines — without exposing the Ollama API to the public internet.

Includes:

  • compose.yaml following the ScaleTail sidecar pattern (network_mode: service:tailscale, health checks, depends_on with health condition)
  • .env template with SERVICE, IMAGE_URL, TS_AUTHKEY, and an optional OLLAMA_API_KEY
  • README.md covering prerequisites, volumes, MagicDNS/HTTPS setup, optional LAN port exposure, and first-run model pull instructions
  • Optional yourNetwork external network on the Tailscale container, enabling other containers (e.g. Open WebUI) to reach Ollama via inter-container networking
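The sidecar pattern listed above can be sketched roughly as follows. This is an illustrative outline, not the actual file from this PR: image tags, volume paths, and healthcheck details are assumptions based on the description, and the real compose.yaml may differ.

```yaml
# Sketch of the ScaleTail sidecar pattern: the Tailscale container owns the
# network stack, and Ollama joins it via network_mode. Values are illustrative.
services:
  tailscale:
    image: tailscale/tailscale:latest
    hostname: ${SERVICE}
    environment:
      - TS_AUTHKEY=${TS_AUTHKEY}
      - TS_STATE_DIR=/var/lib/tailscale
    volumes:
      - ./ts/state:/var/lib/tailscale
    cap_add:
      - NET_ADMIN
    healthcheck:
      test: ["CMD", "tailscale", "status"]
      interval: 10s
      retries: 5
    restart: unless-stopped

  ollama:
    image: ${IMAGE_URL}
    # Share the Tailscale container's network stack, so Ollama's port 11434
    # is reachable only via the Tailnet (or an optional LAN port mapping).
    network_mode: service:tailscale
    volumes:
      - ./ollama-data:/root/.ollama
    depends_on:
      tailscale:
        condition: service_healthy
    restart: unless-stopped
```

Because of `network_mode: service:tailscale`, any `ports:` or `networks:` entries (such as the optional yourNetwork) belong on the tailscale service, not on ollama.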

Related Issues

N/A

Type of Change

  • Bug fix
  • New feature
  • Documentation update
  • Refactoring

How Has This Been Tested?

  1. Ran docker compose config — no errors or missing variable warnings.
  2. Pre-created bind-mount directories (config, ts/state, ollama-data) and started the stack with docker compose up -d.
  3. Confirmed both containers reached healthy status via docker compose ps.
  4. Verified Tailscale node registration and Tailnet IP with docker exec tailscale-ollama tailscale status.
  5. Pulled tinyllama and sent a test generation request via curl to the Tailnet IP on port 11434 — received a valid JSON response.
  6. Repeated the curl test from a second device on the same Tailnet to confirm remote access works end-to-end.
  7. Verified tailscale-ollama appears in docker network inspect yourNetwork.
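Steps 5 and 6 above boil down to a generate request against the Tailnet IP. A hedged sketch of that request is below; the IP is a placeholder for your node's address (find the real one with `tailscale status`), and the curl line is left commented since it needs a running stack.

```shell
# Placeholder Tailnet address; replace with the IP reported by
# `docker exec tailscale-ollama tailscale status`.
TAILNET_IP="100.101.102.103"

# Request body for Ollama's /api/generate endpoint; tinyllama must have been
# pulled first (e.g. `docker exec -it <ollama-container> ollama pull tinyllama`).
REQUEST_BODY='{"model":"tinyllama","prompt":"Why is the sky blue?","stream":false}'

# Uncomment to run against a live stack; a healthy deployment returns JSON
# containing a "response" field.
# curl "http://${TAILNET_IP}:11434/api/generate" -d "${REQUEST_BODY}"
```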

Checklist

  • I have performed a self-review of my code
  • I have added tests that prove my fix or feature works
  • I have updated necessary documentation (e.g. frontpage README.md)
  • Any dependent changes have been merged and published in downstream modules

Screenshots (if applicable)

N/A — no visual UI changes.

Additional Notes

  • The yourNetwork network is optional. Users who don't need inter-container communication can remove the networks: sections from compose.yaml entirely.
  • Ollama models can be large (several GB each). The README notes this under the Volumes section so users are aware before first pull.
  • OLLAMA_KEEP_ALIVE is set to 24h by default to keep models warm; users can adjust or remove this to suit their hardware.
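Putting the variables from the notes and the .env bullet together, the template looks roughly like this. These values are hypothetical defaults inferred from the description; the actual template in the PR may differ.

```shell
# Hypothetical .env values matching the variables named in this PR.
SERVICE=ollama
IMAGE_URL=ollama/ollama:latest
TS_AUTHKEY=tskey-auth-REPLACE-ME   # generate an auth key in the Tailscale admin console
OLLAMA_API_KEY=                    # optional; leave empty to disable
OLLAMA_KEEP_ALIVE=24h              # how long models stay loaded; lower this on constrained hardware
```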

@RychidM RychidM changed the title Adds Ollama service configuration with Docker Compose Adds Ollama service Apr 9, 2026
