Llama Stack Agentic AI Workflow: An intelligent multi-agent chat application powered by LangGraph, Llama Stack, and MCP
This repository contains the source code for lamass123, an AI-powered chat application created from a Red Hat Developer Hub Software Template.
The application features:
| Feature | Description |
|---|---|
| 🤖 Multi-Agent Orchestration | Specialized agents for Legal, HR, Sales, Procurement, and Tech Support |
| 📚 RAG-Powered Responses | Context-aware answers using FAISS vector stores |
| ☸️ Kubernetes Integration | Real-time cluster introspection via MCP tools |
| 🛡️ Content Safety | Guardrails powered by Llama Guard |
| 🐙 GitHub Integration | Automatic issue creation for support tickets |
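To make the multi-agent orchestration idea concrete, here is a minimal sketch of a supervisor that routes a user message to a specialized agent. The real application wires this up with LangGraph; the `route` function, the `AGENTS` mapping, and the keyword lists below are illustrative placeholders, not the app's actual code.

```python
# Hypothetical routing sketch: a supervisor inspects the user message
# and dispatches it to one of the specialized agents. The keyword
# lists are illustrative; the real app routes via LangGraph.

AGENTS = {
    "legal": ["contract", "compliance", "nda"],
    "hr": ["vacation", "benefits", "onboarding"],
    "sales": ["pricing", "quote", "discount"],
    "procurement": ["purchase order", "vendor", "invoice"],
    "tech_support": ["error", "crash", "login"],
}

def route(message: str) -> str:
    """Pick the first agent whose keywords match; default to tech support."""
    text = message.lower()
    for agent, keywords in AGENTS.items():
        if any(kw in text for kw in keywords):
            return agent
    return "tech_support"
```

In the deployed application, each agent would then answer using its own RAG context, with Llama Guard screening inputs and outputs.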
For documentation about using and extending this application, see the `docs` directory, which covers:
- Architecture overview
- Environment variables configuration
- Source code structure
- Deployment information
Related resources:
| Resource | Link |
|---|---|
| 📦 GitOps Repository | https://github.com/jrichter-rhtap/llamatest3111-gitops |
| 🎨 Streamlit UI | Deployed via OpenShift Route |
| 📊 ArgoCD | Check your ArgoCD dashboard for deployment status |
This application is deployed automatically via GitOps. To make changes:
1. Modify the code in this repository.
2. Open a Pull Request to trigger the CI pipeline.
3. Merge the PR; ArgoCD deploys the change automatically.
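On the command line, the workflow above might look like the following sketch. The branch name and commit message are placeholders, and `gh` is the optional GitHub CLI; opening the PR in the GitHub UI works just as well.

```shell
# Create a feature branch, commit the change, and open a PR.
# Merging the PR triggers the CI pipeline and the ArgoCD sync.
git checkout -b my-feature
git commit -am "Describe the change"
git push -u origin my-feature
gh pr create --fill   # or open the PR in the GitHub UI
```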
The application is built with:
- Frontend: Streamlit
- AI Framework: Llama Stack + LangGraph
- Vector Store: FAISS
- Inference: vLLM / OpenAI
- Safety: Ollama (Llama Guard)
- GitOps: ArgoCD
- CI/CD: Tekton Pipelines
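The RAG step in this stack boils down to nearest-neighbor search over embedding vectors. The toy sketch below shows that idea with hand-rolled three-dimensional vectors and cosine similarity; in the actual application, FAISS performs this search at scale over real model embeddings, so the document texts and vectors here are purely illustrative.

```python
import math

# Toy illustration of RAG retrieval: compare a query embedding against
# document embeddings and return the closest document. The vectors
# below are made up for illustration; real embeddings come from a model
# and are indexed with FAISS.

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def retrieve(query_vec, docs):
    """Return the document whose embedding is most similar to the query."""
    return max(docs, key=lambda d: cosine(query_vec, d["embedding"]))

docs = [
    {"text": "Vacation policy: 25 days per year", "embedding": [1.0, 0.1, 0.0]},
    {"text": "Standard NDA template", "embedding": [0.0, 1.0, 0.2]},
]
```

The retrieved document's text is then inserted into the agent's prompt so the model can ground its answer in it.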
This application was generated from the Llama Stack Agentic AI Workflow Software Template.
Want to improve the template, report issues, or contribute new features? Visit the upstream repository:
👉 redhat-ai-dev/llama-stack-agentic-sample
Built with ❤️ using Red Hat Developer Hub