[Bug]: Fresh Docker deployment fails — Langflow flows never imported, chat returns 'Network error' #1229

@tesfandiari1

Description

OpenRAG Version

0.3.2

Deployment Method

Docker (docker compose up -d)

Operating System

macOS 15 (Apple Silicon)

Python Version

3.13.12

Affected Area

Docker deployment, Onboarding, Chat

Bug Description

Following the Docker deployment docs exactly on a fresh clone produces a broken deployment. Onboarding completes (LLM + embedding provider initialized successfully), but the onboarding chat ("What is OpenRAG?") and all subsequent chat attempts fail with "Network error. Please check your connection and try again."

Root cause: The four Langflow flows (agent, ingestion, URL ingestion, nudges) are never imported into Langflow's database on fresh deployment. The latest Docker image (langflowai/openrag-backend:latest, built 2026-03-16) does not contain the ensure_flows_exist() function that exists in the current main branch source code. As a result:

  1. Langflow starts with an empty flow database
  2. The backend references flow IDs from .env (e.g., LANGFLOW_CHAT_FLOW_ID=1098eea1-...) but these flows don't exist in Langflow
  3. All flow operations return HTTP 404: Flow identifier 5488df7c-b93f-4f87-a446-b67028bc0813 not found
  4. Sample document ingestion fails silently
  5. Chat streaming fails with UnboundLocalError: cannot access local variable 'assistant_message' in agent.py:768
  6. Frontend shows "Network error"
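
The `UnboundLocalError` in step 5 is a classic secondary failure: the original 404 gets masked because an error handler references a variable that was never assigned. A minimal sketch of the pattern (this is a hypothetical reconstruction, not the actual code in agent.py; `chat_stream`, `run_flow`, and the response shape are assumptions):

```python
def chat_stream(run_flow):
    try:
        response = run_flow()  # raises when the flow ID is a 404 in Langflow
        assistant_message = response["message"]
        return assistant_message
    except Exception:
        # assistant_message was never bound, so referencing it here
        # raises UnboundLocalError, masking the original 404
        return f"partial: {assistant_message}"

def failing_flow():
    raise RuntimeError("Flow identifier not found: HTTP 404")

try:
    chat_stream(failing_flow)
except UnboundLocalError as exc:
    print(exc)  # cannot access local variable 'assistant_message' ...
```

This matches the log line at agent.py:789: the chat handler surfaces the `UnboundLocalError` rather than the underlying missing-flow 404, which is why the frontend only sees a generic network error.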

Steps to Reproduce

  1. Clone the repo: git clone https://github.com/langflow-ai/openrag.git && cd openrag
  2. uv sync
  3. cp .env.example .env and configure required variables (OPENSEARCH_PASSWORD, API keys, LLM/embedding provider)
  4. Start Docling: uv run python scripts/docling_ctl.py start --port 5001
  5. docker compose up -d
  6. Open http://localhost:3000, complete onboarding
  7. Chat returns "Network error"

Expected Behavior

After completing onboarding, the chat should work. Flows should be automatically imported into Langflow on first deployment.

Actual Behavior

All four flows return 404 from Langflow. Backend logs show:

[ERROR] [flows_service.py:877] Error updating nudges flow: Failed to get flow from Langflow: HTTP 404
[ERROR] [flows_service.py:877] Error updating retrieval flow: Failed to get flow from Langflow: HTTP 404
[ERROR] [flows_service.py:877] Error updating ingest flow: Failed to get flow from Langflow: HTTP 404
[ERROR] [flows_service.py:877] Error updating url_ingest flow: Failed to get flow from Langflow: HTTP 404

Ingestion fails:

[ERROR] [langflow_file_service.py:199] [LF] Run failed
  - body: {"detail":"Flow identifier 5488df7c-b93f-4f87-a446-b67028bc0813 not found"}

Chat fails:

[ERROR] [agent.py:789] Error in langflow chat stream: cannot access local variable 'assistant_message'

Workaround

Manually import the flows into Langflow via API, then restart the backend:

# Get Langflow auth token
TOKEN=$(curl -s -X POST http://localhost:7860/api/v1/login \
  -d "username=admin&password=YOUR_PASSWORD" | python3 -c "import sys,json; print(json.load(sys.stdin)['access_token'])")

# Import all four flows
for flow in flows/openrag_agent.json flows/ingestion_flow.json flows/openrag_nudges.json flows/openrag_url_mcp.json; do
  FLOW_ID=$(python3 -c "import json; print(json.load(open('$flow'))['id'])")
  curl -s -X PUT "http://localhost:7860/api/v1/flows/$FLOW_ID" \
    -H "Authorization: Bearer $TOKEN" \
    -H "Content-Type: application/json" \
    -d @"$flow"
done

# Restart backend to pick up flows
docker compose restart openrag-backend

After this, you may also need to manually set API keys in the Langflow UI (Settings > Global Variables) if the chat flow's LLM component wasn't auto-configured.

Root Cause Analysis

The main branch source code (src/services/flows_service.py) contains ensure_flows_exist() which creates flows from JSON files on startup if they're missing. However, this function does not exist in the latest Docker image (langflowai/openrag-backend:latest built 2026-03-16). The Docker image appears to predate this fix.

Additionally, even in the source code, ensure_flows_exist() runs at the end of startup_tasks() (line ~1209), but onboarding-triggered ingestion can fire before startup_tasks completes, creating a race condition where ingestion attempts to use flows before they're created.
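
For context, the core of an idempotent flow-bootstrap step is just "diff the flow JSON files on disk against the IDs Langflow already has, then import the missing ones." A minimal sketch of that diffing logic (an illustration only, not the actual `ensure_flows_exist()` from flows_service.py; the function name and return shape are assumptions):

```python
import json
from pathlib import Path

def missing_flow_ids(flow_dir, existing_ids):
    """Return (flow_id, path) pairs for flow JSON files whose 'id'
    is not already present in Langflow's flow database."""
    missing = []
    for path in sorted(Path(flow_dir).glob("*.json")):
        flow_id = json.loads(path.read_text())["id"]
        if flow_id not in existing_ids:
            missing.append((flow_id, path))
    return missing

# On startup the backend would fetch existing IDs from Langflow,
# import each missing file, and only then accept ingestion/chat
# traffic -- which is also where the race condition above bites:
# onboarding can trigger ingestion before this step has run.
```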

Relevant Logs

See above.

Screenshots

No response

Additional Context

  • The flow JSON files exist in the repo (flows/*.json) and are mounted into the Langflow container at /app/flows/
  • The flow IDs in the JSON files match the IDs in .env.example
  • The old LANGFLOW_LOAD_FLOWS_PATH mechanism (which auto-loaded flows) appears to have been removed before ensure_flows_exist() made it into a published Docker image, leaving fresh deployments with no flow-import path at all
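
To confirm the second bullet locally, a small checker can diff the `*_FLOW_ID` entries in .env against the `id` fields in flows/*.json (a diagnostic sketch; the env-var naming convention is inferred from LANGFLOW_CHAT_FLOW_ID in the report):

```python
import json
import re
from pathlib import Path

def check_flow_ids(env_path, flow_dir):
    """Return *_FLOW_ID entries in .env that have no matching
    flow JSON file; an empty dict means all IDs line up."""
    env_ids = dict(
        re.match(r"([A-Z_]*FLOW_ID)=(\S+)", line).groups()
        for line in Path(env_path).read_text().splitlines()
        if re.match(r"[A-Z_]*FLOW_ID=", line)
    )
    json_ids = {
        json.loads(p.read_text())["id"]
        for p in Path(flow_dir).glob("*.json")
    }
    return {k: v for k, v in env_ids.items() if v not in json_ids}
```

On an affected deployment this should return an empty dict (the IDs do match); the bug is that matching IDs never reach Langflow's database, not that they disagree.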

Checklist

  • I have searched existing issues to ensure this bug hasn't been reported before.
  • I have provided all the requested information.
