Description
OpenRAG Version
0.3.2
Deployment Method
Docker (docker compose up -d)
Operating System
macOS 15 (Apple Silicon)
Python Version
3.13.12
Affected Area
Docker deployment, Onboarding, Chat
Bug Description
Following the Docker deployment docs exactly on a fresh clone produces a broken deployment. Onboarding completes (LLM + embedding provider initialized successfully), but the onboarding chat ("What is OpenRAG?") and all subsequent chat attempts fail with "Network error. Please check your connection and try again."
Root cause: The four Langflow flows (agent, ingestion, URL ingestion, nudges) are never imported into Langflow's database on fresh deployment. The latest Docker image (langflowai/openrag-backend:latest, built 2026-03-16) does not contain the ensure_flows_exist() function that exists in the current main branch source code. As a result:
- Langflow starts with an empty flow database
- The backend references flow IDs from `.env` (e.g., `LANGFLOW_CHAT_FLOW_ID=1098eea1-...`), but these flows don't exist in Langflow
- All flow operations return HTTP 404: `Flow identifier 5488df7c-b93f-4f87-a446-b67028bc0813 not found`
- Sample document ingestion fails silently
- Chat streaming fails with `UnboundLocalError: cannot access local variable 'assistant_message'` in `agent.py:768`
- Frontend shows "Network error"
Steps to Reproduce
1. Clone the repo: `git clone https://github.com/langflow-ai/openrag.git && cd openrag`
2. Run `uv sync`
3. `cp .env.example .env` and configure the required variables (`OPENSEARCH_PASSWORD`, API keys, LLM/embedding provider)
4. Start Docling: `uv run python scripts/docling_ctl.py start --port 5001`
5. `docker compose up -d`
6. Open `http://localhost:3000` and complete onboarding
7. Chat returns "Network error"
Expected Behavior
After completing onboarding, the chat should work. Flows should be automatically imported into Langflow on first deployment.
Actual Behavior
All four flows return 404 from Langflow. Backend logs show:
[ERROR] [flows_service.py:877] Error updating nudges flow: Failed to get flow from Langflow: HTTP 404
[ERROR] [flows_service.py:877] Error updating retrieval flow: Failed to get flow from Langflow: HTTP 404
[ERROR] [flows_service.py:877] Error updating ingest flow: Failed to get flow from Langflow: HTTP 404
[ERROR] [flows_service.py:877] Error updating url_ingest flow: Failed to get flow from Langflow: HTTP 404
Ingestion fails:
[ERROR] [langflow_file_service.py:199] [LF] Run failed
- body: {"detail":"Flow identifier 5488df7c-b93f-4f87-a446-b67028bc0813 not found"}
Chat fails:
[ERROR] [agent.py:789] Error in langflow chat stream: cannot access local variable 'assistant_message'
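The `UnboundLocalError` is a secondary symptom of the 404: when the upstream Langflow call fails before `assistant_message` is ever assigned, the error path reads the name anyway. A minimal sketch of the pattern and its fix (hypothetical function names; the actual `agent.py` code may differ):

```python
def stream_chat_buggy(upstream_ok: bool) -> str:
    """assistant_message is only bound on the success path."""
    try:
        if not upstream_ok:
            raise RuntimeError("HTTP 404: flow not found")
        assistant_message = "streamed reply"
    except Exception:
        # Bug: on the 404 path this local was never assigned,
        # so reading it raises UnboundLocalError.
        return f"error, partial reply: {assistant_message}"
    return assistant_message


def stream_chat_fixed(upstream_ok: bool) -> str:
    assistant_message = None  # bind before the try so every path can read it
    try:
        if not upstream_ok:
            raise RuntimeError("HTTP 404: flow not found")
        assistant_message = "streamed reply"
    except Exception:
        return f"error, partial reply: {assistant_message}"
    return assistant_message
```

Even with the flow-import bug fixed, guarding the error path this way would turn the confusing `UnboundLocalError` into a proper error message.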
Workaround
Manually import the flows into Langflow via API, then restart the backend:
# Get Langflow auth token
TOKEN=$(curl -s -X POST http://localhost:7860/api/v1/login \
-d "username=admin&password=YOUR_PASSWORD" | python3 -c "import sys,json; print(json.load(sys.stdin)['access_token'])")
# Import all four flows
for flow in flows/openrag_agent.json flows/ingestion_flow.json flows/openrag_nudges.json flows/openrag_url_mcp.json; do
FLOW_ID=$(python3 -c "import json; print(json.load(open('$flow'))['id'])")
curl -s -X PUT "http://localhost:7860/api/v1/flows/$FLOW_ID" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d @"$flow"
done
# Restart backend to pick up flows
docker compose restart openrag-backend

After this, you may also need to manually set API keys in the Langflow UI (Settings > Global Variables) if the chat flow's LLM component wasn't auto-configured.
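To confirm the import worked, the flow list can be checked against the expected IDs. A small sketch, assuming Langflow exposes `GET /api/v1/flows/` returning a JSON array of objects with an `id` field (consistent with the `PUT /api/v1/flows/{id}` endpoint used above, but not verified against this Langflow version):

```python
import json
import urllib.request


def missing_flows(flows_json: str, expected: set[str]) -> set[str]:
    """Return the expected flow IDs absent from a flow-list JSON response."""
    present = {flow["id"] for flow in json.loads(flows_json)}
    return expected - present


def fetch_flows(base_url: str, token: str) -> str:
    """Fetch the flow list from Langflow (network call; run against your instance)."""
    req = urllib.request.Request(
        f"{base_url}/api/v1/flows/",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode()
```

Usage would be `missing_flows(fetch_flows("http://localhost:7860", TOKEN), expected_ids)`, where `expected_ids` comes from `flows/*.json` or `.env`; an empty result means all four flows are present.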
Root Cause Analysis
The main branch source code (src/services/flows_service.py) contains ensure_flows_exist() which creates flows from JSON files on startup if they're missing. However, this function does not exist in the latest Docker image (langflowai/openrag-backend:latest built 2026-03-16). The Docker image appears to predate this fix.
Additionally, even in the source code, ensure_flows_exist() runs at the end of startup_tasks() (line ~1209), but onboarding-triggered ingestion can fire before startup_tasks completes, creating a race condition where ingestion attempts to use flows before they're created.
Relevant Logs
See above.
Screenshots
No response
Additional Context
- The flow JSON files exist in the repo (`flows/*.json`) and are mounted into the Langflow container at `/app/flows/`
- The flow IDs in the JSON files match the IDs in `.env.example`
- The old `LANGFLOW_LOAD_FLOWS_PATH` mechanism (which auto-loaded flows) appears to have been removed before `ensure_flows_exist` was added to the Docker image
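The ID-match claim above can be checked mechanically. A sketch, assuming each `flows/*.json` carries a top-level `"id"` field (the same assumption the workaround script makes) and that the env file names these variables with a `_FLOW_ID` suffix:

```python
import json
import re
from pathlib import Path


def flow_ids_from_json(flows_dir: str) -> set[str]:
    """Collect the top-level "id" from every flow JSON file."""
    return {json.loads(p.read_text())["id"] for p in Path(flows_dir).glob("*.json")}


def flow_ids_from_env(env_text: str) -> set[str]:
    # Match UUID values of *_FLOW_ID variables, e.g. LANGFLOW_CHAT_FLOW_ID=<uuid>
    return set(re.findall(r"^\w*_FLOW_ID=([0-9a-f-]{36})$", env_text, re.M))
```

Comparing `flow_ids_from_json("flows")` with `flow_ids_from_env(Path(".env.example").read_text())` should yield equal sets on a healthy checkout.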
Checklist
- I have searched existing issues to ensure this bug hasn't been reported before.
- I have provided all the requested information.