61 commits
ac95486
feat(benchmarks): cold-start measurement harness
ogabrielluiz Apr 20, 2026
19eebfa
feat(components): component index correctness + build caching
ogabrielluiz Apr 20, 2026
620fbfe
[autofix.ci] apply automated fixes
autofix-ci[bot] Apr 20, 2026
f52b6c5
feat(imports): defer heavy imports off the Graph hot path
ogabrielluiz Apr 20, 2026
68826bc
[autofix.ci] apply automated fixes (attempt 2/3)
autofix-ci[bot] Apr 20, 2026
ec926ed
refactor(imports): rename tests + strip internal planning vocabulary
ogabrielluiz Apr 20, 2026
e18a461
[autofix.ci] apply automated fixes
autofix-ci[bot] Apr 20, 2026
236fd9f
fix(custom): lazy exec_globals in validate.prepare_global_scope
ogabrielluiz Apr 20, 2026
7cba56b
feat(services): service init restructuring + container fork safety
ogabrielluiz Apr 20, 2026
c21cdb5
[autofix.ci] apply automated fixes
autofix-ci[bot] Apr 20, 2026
01828a0
refactor(services): rename tests + strip internal planning vocabulary
ogabrielluiz Apr 20, 2026
03c73a5
refactor(services): finish rename + strip remaining internal vocab
ogabrielluiz Apr 20, 2026
ab35692
docs(deployment): cold-start optimization guide + release notes
ogabrielluiz Apr 20, 2026
43feb33
refactor(docs): strip internal planning vocabulary
ogabrielluiz Apr 20, 2026
cb9ecca
fix(benchmarks): use TCP probe for langflow_run_http_ready readiness
ogabrielluiz Apr 20, 2026
7ad386c
[autofix.ci] apply automated fixes
autofix-ci[bot] Apr 20, 2026
1f894f7
fix(benchmarks): guard langflow_run supervisor against port squatters
ogabrielluiz Apr 20, 2026
bbe507a
fix(benchmarks): anchor langflow_run_http_ready baseline, gate regres…
ogabrielluiz Apr 22, 2026
54452bb
feat(benchmarks): skip verify for unanchored baselines (runs=0)
ogabrielluiz Apr 22, 2026
ba7b5aa
Update comments
jordanrfrazier May 4, 2026
92036db
[autofix.ci] apply automated fixes
autofix-ci[bot] May 4, 2026
1ee15a4
fix(lfx): harden lazy langchain imports for cold start and preload
jordanrfrazier May 4, 2026
406b8a0
fix(components): address review findings on cache index loader
jordanrfrazier May 4, 2026
08c086e
test(components): trim correctness tests to one per fix
jordanrfrazier May 4, 2026
5dd306a
test(components): merge correctness tests into test_component_index.py
jordanrfrazier May 4, 2026
99e453b
readd pandas roundtrip json
jordanrfrazier May 5, 2026
36a0d44
fix(custom): address review of lazy validate.prepare_global_scope
jordanrfrazier May 5, 2026
786eef8
ref: field type splits - organization only (#12983)
jordanrfrazier May 6, 2026
e9c9ef4
Merge release-1.10.0 into cold-start/01-measurement-foundation
ogabrielluiz May 6, 2026
c1a78e1
Merge cold-start/01-measurement-foundation into cold-start/02-compone…
ogabrielluiz May 6, 2026
abe6b7a
[autofix.ci] apply automated fixes
autofix-ci[bot] May 6, 2026
d71394f
[autofix.ci] apply automated fixes
autofix-ci[bot] May 6, 2026
fac7d16
Merge cold-start/02-component-index into cold-start/03-import-deferrals
ogabrielluiz May 6, 2026
537b12c
Merge cold-start/03-import-deferrals into cold-start/04-imp-11-lazy-v…
ogabrielluiz May 6, 2026
f99425c
[autofix.ci] apply automated fixes
autofix-ci[bot] May 6, 2026
3a048fb
Merge cold-start/04-imp-11-lazy-validate into cold-start/05-service-i…
ogabrielluiz May 6, 2026
3b99758
Merge cold-start/05-service-init-container into cold-start/06-docs-pu…
ogabrielluiz May 6, 2026
9be830a
Merge cold-start/05-service-init-container into cold-start/fix-http-r…
ogabrielluiz May 6, 2026
9fa3367
[autofix.ci] apply automated fixes
autofix-ci[bot] May 6, 2026
698494e
[autofix.ci] apply automated fixes
autofix-ci[bot] May 6, 2026
fcef8a1
[autofix.ci] apply automated fixes
autofix-ci[bot] May 6, 2026
ccce26f
Synthesize gate-based worker init with parallelized master preload
ogabrielluiz May 6, 2026
a742ee2
Merge cold-start/05-service-init-container (synthesis) into cold-star…
ogabrielluiz May 6, 2026
57a63a7
Synthesize gate-based worker init with parallelized master preload
ogabrielluiz May 6, 2026
0192b5f
Fix tests
erichare May 6, 2026
5747b92
[autofix.ci] apply automated fixes
autofix-ci[bot] May 6, 2026
e9af6a1
fix: Include the post fork function
erichare May 6, 2026
71c0c95
Merge branch 'cold-start/05-service-init-container' into cold-start/f…
erichare May 6, 2026
baab559
[autofix.ci] apply automated fixes
autofix-ci[bot] May 6, 2026
9d1ccf6
fix: implicit import in test
erichare May 6, 2026
8921cd5
Merge cold-start/02-component-index
ogabrielluiz May 6, 2026
94d2d07
Merge cold-start/03-import-deferrals
ogabrielluiz May 6, 2026
2ece3a3
Merge cold-start/04-imp-11-lazy-validate
ogabrielluiz May 6, 2026
8fb3a15
Merge cold-start/05-service-init-container
ogabrielluiz May 6, 2026
44b22c9
Merge cold-start/06-docs-publication
ogabrielluiz May 6, 2026
0f2b207
Merge cold-start/fix-http-ready-tcp-probe
ogabrielluiz May 6, 2026
2704117
Fix cold-start docs link to use relative .mdx path
ogabrielluiz May 6, 2026
034eccb
Drop public cold-start docs page
ogabrielluiz May 6, 2026
629c8ab
test: delete stale test_main_superuser_init
jordanrfrazier May 8, 2026
c6adefd
Revert "test: delete stale test_main_superuser_init"
jordanrfrazier May 8, 2026
11c4f2c
Reapply "test: delete stale test_main_superuser_init"
jordanrfrazier May 8, 2026
452 changes: 452 additions & 0 deletions .github/workflows/cold-start-benchmark.yml

Large diffs are not rendered by default.

638 changes: 379 additions & 259 deletions .secrets.baseline

Large diffs are not rendered by default.

43 changes: 42 additions & 1 deletion Makefile
@@ -1,4 +1,4 @@
.PHONY: all init format_backend format lint build run_backend dev help tests coverage clean_python_cache clean_npm_cache clean_frontend_build clean_all run_clic load_test_setup load_test_setup_basic load_test_list_flows load_test_run load_test_langflow_quick load_test_stress load_test_example load_test_clean load_test_remote_setup load_test_remote_run load_test_help docs docs_build docs_install api_examples_local api_examples_local_syntax
.PHONY: all init format_backend format lint build run_backend dev help tests coverage clean_python_cache clean_npm_cache clean_frontend_build clean_all run_clic load_test_setup load_test_setup_basic load_test_list_flows load_test_run load_test_langflow_quick load_test_stress load_test_example load_test_clean load_test_remote_setup load_test_remote_run load_test_help docs docs_build docs_install api_examples_local api_examples_local_syntax bench-local bench-docker bench-snapshot bench-verify-synthetic

# Configurations
VERSION=$(shell grep "^version" pyproject.toml | sed 's/.*\"\(.*\)\"$$/\1/')
@@ -41,6 +41,47 @@ check_tools:
@command -v npm >/dev/null 2>&1 || { echo >&2 "$(RED)NPM is not installed. Aborting.$(NC)"; exit 1; }
@echo "$(GREEN)All required tools are installed.$(NC)"

######################
# BENCHMARKS (Phase 1: cold-start measurement)
######################

# See src/backend/tests/benchmarks/README.md for rationale and usage.
#
# - bench-local: fast iteration against dev venv; NOT comparable to CI or baseline.
# - bench-docker: authoritative; runs inside python:3.13-slim with fresh container per measurement.
# - bench-snapshot: one-shot; captures a baseline and overwrites src/backend/tests/benchmarks/thresholds.json.
#
# All targets delegate to src/backend/tests/benchmarks/driver.py.

bench-local: ## run cold-start harness against dev venv (fast iteration; NOT authoritative)
@echo "$(YELLOW)bench-local:$(NC) running harness against dev venv (no cold-cache prep)"
@echo "$(YELLOW) NOTE:$(NC) these numbers are NOT comparable to CI or the committed baseline."
@uv run python -m src.backend.tests.benchmarks.driver --mode local

bench-docker: ## run cold-start harness inside python:3.13-slim (authoritative)
@echo "$(GREEN)bench-docker:$(NC) building measurement image and running harness"
@command -v $(DOCKER) >/dev/null 2>&1 || { echo >&2 "$(RED)$(DOCKER) is not installed. Aborting.$(NC)"; exit 1; }
@uv run python -m src.backend.tests.benchmarks.driver --mode docker

bench-snapshot: ## one-shot: capture authoritative baseline and overwrite thresholds.json
@echo "$(GREEN)bench-snapshot:$(NC) capturing authoritative baseline + writing thresholds.json"
@echo "$(YELLOW) NOTE:$(NC) authoritative numbers require a Linux GHA runner; local captures are for iteration only."
@uv run python -m src.backend.tests.benchmarks.snapshot

bench-verify-synthetic: ## MEAS-08 prove the CI gate trips on a synthetic regression
@echo "$(YELLOW)bench-verify-synthetic:$(NC) injecting synthetic regression into lfx/_bench.py"
@cp src/lfx/src/lfx/_bench.py src/lfx/src/lfx/_bench.py.orig.bak
@trap 'mv src/lfx/src/lfx/_bench.py.orig.bak src/lfx/src/lfx/_bench.py 2>/dev/null || true' EXIT HUP INT TERM ; \
awk 'BEGIN{done=0} {print} /^from __future__ / && !done { print "import time as _bench_synth_time"; print "_bench_synth_time.sleep(13.0)"; done=1 }' src/lfx/src/lfx/_bench.py.orig.bak > src/lfx/src/lfx/_bench.py ; \
$(DOCKER) build --build-arg BENCH_VARIANT=lean -t benchmarks-lean -f src/backend/tests/benchmarks/Dockerfile . >/dev/null ; \
$(DOCKER) rmi -f benchmarks-lean-uncompiled >/dev/null 2>&1 || true ; \
if uv run python -m src.backend.tests.benchmarks.driver --mode docker --verify --scenarios lfx_bare --output-dir /tmp/bench_synth ; then \
echo "$(RED)FAIL:$(NC) driver exited 0 despite synthetic regression. Gate is NOT wired correctly." ; \
exit 1 ; \
else \
echo "$(GREEN)PASS:$(NC) driver exited non-zero on synthetic regression. Gate is wired correctly." ; \
fi

help: ## show basic help message with common commands
@echo ''
@echo "$(GREEN)═══════════════════════════════════════════════════════════════════$(NC)"
13 changes: 13 additions & 0 deletions docs/docs/Support/release-notes.mdx
@@ -130,6 +130,19 @@ For all changes, see the [Changelog](https://github.com/langflow-ai/langflow/rel
The **Policies** component uses [ToolGuard](https://github.com/AgentToolkit/toolguard) to generate guard code from natural-language business policies and apply it to agent tools.
For more information, see [Policies (Beta)](../Components/policies.mdx).

- Cold-start performance improvements for `lfx run` and `langflow run`

Cold-start latency is reduced on the `lfx run` execution path and in `langflow run` restart scenarios. The gains come from deferring heavy imports off the Graph hot path, an atomic, version-stamped component index, parallelized lifespan tasks, event-driven MCP startup, and a persisted component-index cache that skips the full package walk when the installed lfx version matches. The reference `langflowai/lfx` image also sets `UV_COMPILE_BYTECODE=1` at build time and uses a multi-stage layer layout so `.pyc` files are baked into the image layers.

Cold-start numbers (Linux CI, Python 3.13, `ubuntu-latest`):

| Scenario | Before (ms) | After (ms) | Delta |
|----------|-------------|------------|-------|
| `lfx run` bare boot (`lfx_bare`) | 17820 | 10550 | -7270ms (-40.8%) |
| `lfx run <flow>` uncompiled (`lfx_with_flow`) | 18920 | 16013 | -2907ms (-15.4%) |
| `lfx run <flow>` prebaked (`lfx_with_flow_prebaked`) | 9520 | 8425 | -1095ms (-11.5%) |
| `langflow run --backend-only` no-change restart | new scenario | 11325 | new in this release |

## 1.8.x

Highlights of this release include the following changes.
9 changes: 9 additions & 0 deletions pyproject.toml
@@ -60,6 +60,15 @@ dev = [
"mcp-server-fetch>=2025.1.17",
"onnxruntime>=1.20,<1.24" # >=1.24 does not support Python 3.10; <1.24 allows 1.23.x for agent-lifecycle-toolkit
]
benchmarks = [
# Cold-start measurement tooling. Installed only via `uv sync --group benchmarks`.
# Never a runtime dep of lfx or langflow (preserves the "no runtime deps in lfx" rule).
"pyinstrument>=5.1.2",
"importtime-convert>=1.1.0",
# importtime-waterfall latest on PyPI is 1.0.0; RESEARCH.md §Standard Stack lists
# 2.0.0 but that release is not available. Floor at 1.0.0 so the group resolves.
"importtime-waterfall>=1.0.0",
]

[[tool.uv.index]]
name = "pytorch-cpu"
2 changes: 0 additions & 2 deletions src/backend/base/langflow/field_typing/constants.py
@@ -5,7 +5,6 @@

from lfx.field_typing.constants import (
CUSTOM_COMPONENT_SUPPORTED_TYPES,
DEFAULT_IMPORT_STRING,
LANGCHAIN_BASE_TYPES,
# Import all the langchain types that may be needed
AgentExecutor,
@@ -57,7 +56,6 @@

__all__ = [
"CUSTOM_COMPONENT_SUPPORTED_TYPES",
"DEFAULT_IMPORT_STRING",
"LANGCHAIN_BASE_TYPES",
# Langchain types
"AgentExecutor",
19 changes: 1 addition & 18 deletions src/backend/base/langflow/initial_setup/setup.py
@@ -1272,12 +1272,6 @@ async def get_or_create_default_folder(session: AsyncSession, user_id: UUID) ->

This implementation avoids an external distributed lock and works with both SQLite and PostgreSQL.

The function only creates a new default folder on first initialization (when the user has no
folders at all). If the user has already been through initial setup and has at least one folder
— even if they renamed the default or only kept other folders — the existing folder is returned
instead of creating a new "Starter Project". This prevents a phantom default folder from being
forced back into the UI every time the user logs in or the server restarts.

Args:
session (AsyncSession): The active database session.
user_id (UUID): The ID of the user who owns the folder.
@@ -1319,18 +1313,7 @@ async def get_or_create_default_folder(session: AsyncSession, user_id: UUID) ->
await session.rollback()
break

# Respect prior user intent: if the user already has folders (e.g. they renamed the
# default folder to something like "My Flows"), do not force a new "Starter Project" back
# into their UI on every login/server restart. Return any existing folder instead.
any_folder_stmt = (
select(Folder).where(Folder.user_id == user_id).order_by(Folder.id).limit(1) # type: ignore[arg-type]
)
any_folder = (await session.exec(any_folder_stmt)).first()
if any_folder:
return FolderRead.model_validate(any_folder, from_attributes=True)

# No existing folder found for this user — this is the first-time setup path.
# Create the default folder.
# If no existing folder found, create a new one
try:
folder_obj = Folder(user_id=user_id, name=DEFAULT_FOLDER_NAME, description=DEFAULT_FOLDER_DESCRIPTION)
session.add(folder_obj)
161 changes: 161 additions & 0 deletions src/backend/base/langflow/initial_setup/starter_project_hash.py
@@ -0,0 +1,161 @@
"""starter-projects hash gate helpers.

Computes a content hash over the starter-project JSON files plus the installed
``lfx`` package version, and persists it as plaintext under
``${LANGFLOW_CONFIG_DIR}/starter_projects.hash``. ``main.py`` uses the hash to
short-circuit the full starter-project re-sync on restarts where nothing
changed.

Failure modes: missing, unreadable, or corrupt hash files all fall
through to a full re-sync. ``LANGFLOW_FORCE_STARTER_RESYNC=1`` bypasses the
comparison. Write failures (read-only root filesystem, e.g. container
deployments) log at debug level and never raise -- mirroring the
``update_project_file`` pattern at ``setup.py:690-701``.
"""

from __future__ import annotations

import hashlib
import os
from importlib.metadata import PackageNotFoundError, version
from typing import TYPE_CHECKING, Any

import aiofiles
from lfx.log.logger import logger

if TYPE_CHECKING:
from collections.abc import Awaitable, Callable
from pathlib import Path

import anyio

HASH_FILENAME = "starter_projects.hash"
_HEX_LEN = 64


async def compute_starter_projects_hash(starter_folder: anyio.Path) -> str:
"""Return a SHA256 hex digest over the starter folder contents + ``lfx`` version.

The digest updates with ``filename || NUL || file_bytes || NUL`` for each
``*.json`` file in ``starter_folder`` sorted by filename, followed by the
installed ``lfx`` package version string. Sorting is load-bearing: glob
order is not stable across filesystems.

If ``importlib.metadata.version("lfx")`` raises ``PackageNotFoundError``
(source-only checkout without ``pip install -e .``), the sentinel
``"unknown"`` is substituted (Pattern F / Pitfall 5). The hash remains
stable within such an environment but will invalidate whenever the
fallback fires in a fresh environment -- an acceptable trade-off.
"""
try:
pkg_version = version("lfx")
except PackageNotFoundError:
pkg_version = "unknown"
hasher = hashlib.sha256()
paths = sorted(
[p async for p in starter_folder.glob("*.json")],
key=lambda p: p.name,
)
for path in paths:
hasher.update(path.name.encode("utf-8"))
hasher.update(b"\x00")
hasher.update(await path.read_bytes())
hasher.update(b"\x00")
hasher.update(pkg_version.encode("utf-8"))
return hasher.hexdigest()


async def read_hash_file_safe(hash_path: Path) -> str | None:
"""Return the stored SHA hex string, or ``None`` on any failure.

Skips comment lines (starting with ``#``) and returns the first line that
is exactly 64 lowercase hex characters. Returns ``None`` for:

- Missing file (``FileNotFoundError``)
- Unreadable file (``OSError`` / ``PermissionError``)
- Empty content
- Corrupt content (first non-comment line is not 64 hex chars)

The caller is expected to treat ``None`` as a cache miss and fall through
to a full re-sync.
"""
try:
async with aiofiles.open(str(hash_path), encoding="utf-8") as f:
content = await f.read()
except OSError:  # FileNotFoundError is a subclass of OSError
return None
for raw_line in content.splitlines():
line = raw_line.strip()
if not line or line.startswith("#"):
continue
if len(line) == _HEX_LEN and all(c in "0123456789abcdef" for c in line):
return line
return None


async def write_hash_file_safe(hash_path: Path, sha_hex: str, version_string: str) -> None:
"""Write ``sha_hex`` + a ``# version:`` comment line to ``hash_path``.

Ensures the parent directory exists (``mkdir(parents=True, exist_ok=True)``
on the parent). Swallows ``OSError`` (Pattern E) so that a read-only
filesystem -- common in containerized deployments with
``readOnlyRootFilesystem: true`` -- does not crash lifespan startup; the
hash gate simply falls through to a full re-sync on every restart in that
environment.
"""
content = f"{sha_hex}\n# version: {version_string}\n"
try:
hash_path.parent.mkdir(parents=True, exist_ok=True)
async with aiofiles.open(str(hash_path), "w", encoding="utf-8") as f:
await f.write(content)
except OSError as e:
await logger.adebug(
f"Could not write starter-projects hash file (read-only filesystem): {e}. "
"Hash gate will fall through to full re-sync on each restart."
)


def is_force_resync_requested() -> bool:
"""Return ``True`` if ``LANGFLOW_FORCE_STARTER_RESYNC`` is set to 1/true/yes.

Comparison is case-insensitive and whitespace-stripped. Any other value
(empty string, unset, "no", "0") returns ``False`` so the hash comparison
path runs normally.
"""
return os.getenv("LANGFLOW_FORCE_STARTER_RESYNC", "").strip().lower() in ("1", "true", "yes")


async def run_starter_projects_hash_gate(
*,
starter_folder: anyio.Path,
hash_path: Path,
sync_fn: Callable[[], Awaitable[Any]],
) -> bool:
"""Execute the hash-gated starter-project sync.

This helper encapsulates the hash compare / sync / write sequence so both
``main.py`` (inside its ``FileLock``) and the parity tests invoke
the exact same code path. ``sync_fn`` is a zero-arg coroutine factory the
caller uses to pass in ``create_or_update_starter_projects(all_types_dict)``
with ``all_types_dict`` already bound.

Returns ``True`` when the full re-sync ran (cache miss or force-resync),
``False`` when the hash matched and the sync was skipped.

The caller is responsible for wrapping this in its own ``FileLock`` /
error-handling context (TOCTOU safety per Pitfall 2). The gate itself
does not manage locking.
"""
expected = await compute_starter_projects_hash(starter_folder)
actual = await read_hash_file_safe(hash_path)
if is_force_resync_requested() or actual != expected:
await sync_fn()
try:
pkg_v = version("lfx")
except PackageNotFoundError:
pkg_v = "unknown"
await write_hash_file_safe(hash_path, expected, pkg_v)
return True
await logger.adebug("Starter projects hash matches; skipped re-sync")
return False
62 changes: 36 additions & 26 deletions src/backend/base/langflow/preload.py
@@ -224,20 +224,28 @@ async def _run_master_preload() -> None:
# would leave an open connection pool in the master process, and that pool
# would be inherited (fork-unsafe) by every worker.
try:
await logger.adebug("[preload] copying profile pictures")
await _best_effort(
PreloadStep.PROFILE_PICTURES,
"copy_profile_pictures failed",
copy_profile_pictures(),
# Wave 1: profile-picture copy and bundle download/extract are independent
# (different filesystem subtrees, no shared state). Run them concurrently
# so the master pays max(profile, bundles) instead of profile + bundles.
async def _do_bundles() -> None:
temp_dirs, bundles_components_paths = await load_bundles_with_error_handling()
_STATE.temp_dirs = list(temp_dirs)
_STATE.bundles_components_paths = list(bundles_components_paths)
settings_service.settings.components_path.extend(bundles_components_paths)
mark_step_complete(PreloadStep.BUNDLES)

await logger.ainfo("[preload] copying profile pictures + loading bundles (parallel)")
await asyncio.gather(
_best_effort(
PreloadStep.PROFILE_PICTURES,
"copy_profile_pictures failed",
copy_profile_pictures(),
),
_do_bundles(),
)

await logger.ainfo("[preload] loading bundles")
temp_dirs, bundles_components_paths = await load_bundles_with_error_handling()
_STATE.temp_dirs = list(temp_dirs)
_STATE.bundles_components_paths = list(bundles_components_paths)
settings_service.settings.components_path.extend(bundles_components_paths)
mark_step_complete(PreloadStep.BUNDLES)

# Types cache scans settings.components_path, which the bundle step just
# extended. Must run sequentially after Wave 1.
await logger.ainfo("[preload] building component types cache")
await get_and_cache_all_types_dict(settings_service, get_telemetry_service())
mark_step_complete(PreloadStep.TYPES_CACHED)
@@ -257,28 +265,30 @@ async def _run_master_preload() -> None:
initialize_agentic_global_variables,
)

await logger.adebug("[preload] initializing agentic global variables")

# Globals and MCP config are explicitly independent (per the prerequisite
# DAG: MCP config can succeed even when globals failed). Each gets its
# own session_scope and its own _best_effort wrapper, so a failure in
# one does not abort the other inside the gather.
async def _run_agentic_globals() -> None:
async with session_scope() as session:
await initialize_agentic_global_variables(session)

await _best_effort(
PreloadStep.AGENTIC_GLOBALS,
"initialize agentic global variables failed",
_run_agentic_globals(),
)

await logger.adebug("[preload] auto-configuring agentic MCP server")

async def _run_agentic_mcp() -> None:
async with session_scope() as session:
await auto_configure_agentic_mcp_server(session)

await _best_effort(
PreloadStep.AGENTIC_MCP,
"auto-configure agentic MCP server failed",
_run_agentic_mcp(),
await logger.adebug("[preload] initializing agentic globals + MCP server (parallel)")
await asyncio.gather(
_best_effort(
PreloadStep.AGENTIC_GLOBALS,
"initialize agentic global variables failed",
_run_agentic_globals(),
),
_best_effort(
PreloadStep.AGENTIC_MCP,
"auto-configure agentic MCP server failed",
_run_agentic_mcp(),
),
)

await logger.adebug("[preload] loading flows from directory")
Expand Down
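
The wave pattern in this diff amounts to running independent coroutines under `asyncio.gather` so startup pays the slowest step instead of the sum of all steps. A minimal timing sketch of that idea (stand-in step names; this is not the Langflow preload code):

```python
import asyncio
import time


async def _step(name: str, seconds: float) -> str:
    # Stand-in for an independent preload step (profile pictures, bundles, ...).
    await asyncio.sleep(seconds)
    return name


async def _wave() -> float:
    start = time.perf_counter()
    # Independent steps run concurrently: total wall time is roughly
    # max(0.2, 0.3), not 0.2 + 0.3.
    await asyncio.gather(_step("profile_pictures", 0.2), _step("bundles", 0.3))
    return time.perf_counter() - start


elapsed = asyncio.run(_wave())
```

This only pays off when the steps are genuinely independent — the diff is careful to keep the types-cache build sequential because it reads `components_path`, which the bundle step mutates.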