Open WebUI + LiteLLM Stack – Production-Ready* with Observability (LGTM), Qdrant, Traefik #18562
Great project @mhajder, I've been building something similar using Nix. Thanks for sharing!
Nice stack: the isolated network architecture (frontend vs. internal backend) is the right approach. One area I'd suggest adding to the security layer is LLM-level input/output guardrails. The stack handles infrastructure security well (TLS, auth, network isolation), but there's a gap at the application layer: what's actually being sent to and received from the LLMs. In production multi-user environments, you want to catch things like prompt-injection attempts and sensitive data leaking into responses.
Since you're already running LiteLLM as a proxy, you could add a guardrail layer there. LiteLLM has built-in guardrail hooks, and for more structured scanning, ClawMoat can plug in as middleware to validate inputs/outputs against configurable policies. The observability piece (Grafana dashboards for token usage/latency) is great; it would be useful to also log guardrail violations so you can track attempted misuse alongside performance metrics. Thanks for sharing this, bookmarking it as a reference architecture.
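To make the suggestion concrete, here is a minimal sketch of the kind of pre-call check a guardrail layer performs. This is illustrative only: the pattern list and function names are invented, not the LiteLLM or ClawMoat API, and real guardrails use classifiers and policy files rather than hardcoded regexes.

```python
import re

# Hypothetical deny-list policy (illustrative; real deployments would load
# configurable policies instead of hardcoding patterns).
BLOCKED_PATTERNS = [
    re.compile(r"ignore (all )?previous instructions", re.I),  # prompt injection
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),                      # SSN-like PII
]

def violates_policy(text: str) -> bool:
    """Return True if the text matches any blocked pattern."""
    return any(p.search(text) for p in BLOCKED_PATTERNS)

def guard_prompt(prompt: str) -> str:
    """Reject a prompt before it ever reaches the model."""
    if violates_policy(prompt):
        raise ValueError("guardrail violation: blocked input")
    return prompt
```

Hooking logic like this into the proxy's pre-call path is also where you'd emit a counter per violation, so the Grafana dashboards can show attempted misuse next to token usage and latency.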
Hey everyone!
I wanted to share a project I've been working on to make deploying Open WebUI + LiteLLM easier, more secure, and fully observable for production or serious homelab environments. I'm also using it as a demo for testing AI pods.
I created openwebui-stack, a robust Docker Compose setup that orchestrates Open WebUI alongside a full suite of enterprise-grade dependencies.
What is it?
Unlike simple "all-in-one" deployments, this stack breaks services down for scalability and performance. It includes a pre-configured LGTM stack (Loki, Grafana, Tempo, Mimir) for full observability, Traefik for secure reverse proxying, and shared databases for persistent storage.
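As a rough illustration of the frontend/backend split this kind of stack uses, a Compose file along these lines would do it (service names and images here are generic placeholders, not necessarily what the repo actually ships):

```yaml
networks:
  frontend:            # Traefik-facing edge network
  backend:
    internal: true     # isolated; no direct host or internet exposure

services:
  traefik:
    image: traefik:v3
    networks: [frontend]
  openwebui:
    image: ghcr.io/open-webui/open-webui:main
    networks: [frontend, backend]   # reachable via Traefik, talks to backend services
  litellm:
    image: ghcr.io/berriai/litellm:main
    networks: [frontend, backend]   # needs egress to upstream model APIs
  postgres:
    image: postgres:16
    networks: [backend]             # internal only
```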
Technical Breakdown
The stack automatically configures and connects the following services:
Core & AI
Infrastructure & Data
Observability (LGTM Stack)
Key Features
How to use it
Clone the repo:

```shell
git clone https://github.com/mhajder/openwebui-stack.git
cd openwebui-stack
```

Run the setup script (generates .env secrets):

```shell
chmod +x scripts/*.sh
./scripts/setup.sh
```

Spin up the stack:
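Assuming a standard Compose v2 setup (the repo may wrap this in a script, so check its README), the last step would typically be:

```shell
docker compose up -d
```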
You can check out the repository here: https://github.com/mhajder/openwebui-stack
I would love to get your feedback on the architecture, especially regarding the OTel integration! PRs are welcome as always!
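On the OTel side, for anyone wiring a service's traces into Tempo, the standard OTLP environment variables look like this. The endpoint hostname and service name below are assumptions for illustration; the repo's actual configuration may differ:

```shell
export OTEL_SERVICE_NAME=litellm-proxy
export OTEL_EXPORTER_OTLP_ENDPOINT=http://tempo:4317   # OTLP gRPC default port
export OTEL_EXPORTER_OTLP_PROTOCOL=grpc
```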