A Pipeline-Native Computing Architecture: From Software-as-Application to Software-as-Pipeline
Every computing model rests on a definition of what software is.
For decades, that definition has been: Software = Application — a monolithic unit that receives input, hides its internal state, and produces output. Users interact through predefined interfaces. Programmers mediate between human intent and machine execution.
Holonomy proposes a different definition:
Software = Pipeline — a sequence of explicit, contract-validated state transitions in which every step is inspectable, every failure is recoverable, and execution correctness is guaranteed by the runtime, not assumed by the programmer.
This shift is not incremental. It changes where trust lives, where state lives, and what the runtime is responsible for.
Two forces are converging to make this shift both necessary and possible.
LLMs can now translate human intent into execution plans. The gap between "what a user wants" and "a machine-executable specification" is closing. For the first time, software can be driven by intent rather than by programmer-written code.
But intent-driven execution is not trustworthy by default. LLM planners are non-deterministic. They hallucinate. They produce plans that look correct but corrupt state when executed. Current agent frameworks (LangGraph, AutoGPT, CrewAI) execute these plans without any formal validation boundary — and when something goes wrong, there is no principled recovery.
The missing piece is a runtime that guarantees execution, independent of what the planner produces.
That is Holonomy.
Holonomy is a full runtime architecture. Its computing model:
User Intent
↓
AI Orchestrator (Intent → Pipeline)
↓
Holonomy Runtime (Pipeline → Verified Execution)
↓
External Services (APIs, DBs, file systems)
The Holonomy Runtime consists of seven layers:
Holonomy Runtime
│
├── Pipeline Graph Engine DAG representation, dependency resolution, execution ordering
├── Contract Layer Per-node input/output contracts, fail-stop on violation
├── State Manager Explicit state at every step, checkpoint, recovery
├── Scheduler Dependency-aware scheduling, human-in-the-loop support
├── AI Orchestrator Intent → Pipeline generation (LLM-driven)
├── Connector Layer External services as typed pipeline nodes
└── Security & Sandbox Capability-based access, node isolation, information flow
The fundamental guarantee:
When a user expresses intent, the Holonomy runtime ensures that what gets executed is safe, consistent, and recoverable — regardless of what the planner produces.
This repository contains the formal kernel of the Holonomy Runtime — the minimal trusted core that makes the entire architecture verifiable.
Full Holonomy Architecture
│
├── AI Orchestrator (future)
├── Connector Ecosystem (future)
├── Security & Sandbox (partial)
│
└── Holonomy Core Runtime ◄── THIS REPO
│
├── Pipeline Graph Engine
├── Contract Layer
├── State Manager
└── Scheduler (core)
The Core Runtime is the trust boundary: the layer that separates what the planner proposes from what the system executes. It is formally specified, formally proved, and provided as an executable prototype.
The paradigm claim — Software = Pipeline — is only credible if the execution layer can actually be proved correct.
The Holonomy Core Runtime demonstrates this. It shows that a contract-validated, fail-stop, checkpoint-recovering pipeline execution engine can be formally specified and formally proved. The vision is not a slogan. The kernel works.
State: a partial map S from keys to values
Node: a 5-tuple (I_n, O_n, C_n^in, f_n, C_n^out)
| Symbol | Meaning |
|---|---|
| I_n | Required input keys |
| O_n | Produced output keys |
| C_n^in | Input contract |
| f_n | State transformation |
| C_n^out | Output contract |
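Concretely, the 5-tuple can be sketched as a frozen dataclass. This is a sketch only: the field names and type aliases here are assumptions for illustration, not the repo's actual `Node` in `runtime/core.py`.

```python
from dataclasses import dataclass
from typing import Any, Callable, Dict, FrozenSet

# Illustrative aliases matching the formal model
State = Dict[str, Any]                 # partial map: key -> value
Predicate = Callable[[State], bool]    # a contract over states
Transform = Callable[[State], State]   # a state transformation

@dataclass(frozen=True)
class Node:
    inputs: FrozenSet[str]    # I_n: required input keys
    outputs: FrozenSet[str]   # O_n: produced output keys
    pre: Predicate            # C_n^in: input contract
    fn: Transform             # f_n: state transformation
    post: Predicate           # C_n^out: output contract
```

Freezing the dataclass mirrors the model's intent: a node is a value, not a mutable object; all mutation happens to state, under contract.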
Pipeline: a finite DAG
Execution Rule — six steps, all must pass before commit:
1. Input key check I_n ⊆ dom(S)
2. Input contract C_n^in(S) = 1
3. Compute S* = f_n(S)
4. Output key check O_n ⊆ dom(S*)
5. Output contract C_n^out(S*) = 1
6. Commit S ← S*
Fail-stop: any failure → halt, S unchanged
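The six-step rule can be sketched as a single function. This is illustrative, not the repo's implementation — `PipelineRuntime` in `runtime/core.py` is the authoritative version, and the names below are assumed.

```python
from typing import Any, Callable, Dict, FrozenSet

State = Dict[str, Any]
Predicate = Callable[[State], bool]
Transform = Callable[[State], State]

class ContractViolation(Exception):
    """Any failed check raises; the caller's state is never mutated."""

def execute_node(state: State,
                 inputs: FrozenSet[str], pre: Predicate,
                 fn: Transform,
                 outputs: FrozenSet[str], post: Predicate) -> State:
    # 1. Input key check: I_n must be a subset of dom(S)
    if not inputs <= state.keys():
        raise ContractViolation("missing input keys")
    # 2. Input contract: C_n^in(S) must hold
    if not pre(state):
        raise ContractViolation("input contract failed")
    # 3. Compute S* = f_n(S) on a copy, so a later failure cannot corrupt S
    candidate = fn(dict(state))
    # 4. Output key check: O_n must be a subset of dom(S*)
    if not outputs <= candidate.keys():
        raise ContractViolation("missing output keys")
    # 5. Output contract: C_n^out(S*) must hold
    if not post(candidate):
        raise ContractViolation("output contract failed")
    # 6. Commit: S <- S* (the caller replaces its state with the return value)
    return candidate
```

Because step 3 operates on a copy and the commit is the return value, every failure path leaves the caller's state bitwise unchanged — the fail-stop property falls out of the structure rather than requiring rollback logic.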
| # | Theorem | Guarantee |
|---|---|---|
| 1 | Termination | Finite DAG → execution halts in finite steps |
| 2 | Contract Safety | Every committed state satisfies its node's output contract |
| 3 | Fail-Stop Integrity | No partial state mutation on failure |
| 4 | Checkpoint Recovery | Resume from checkpoint yields identical final state |
| 5 | Conditional Determinism | Conflict-free, deterministic nodes → unique final state |
| 6 | Path Dependence | Non-commutative transforms → order-dependent results |
| 7 | Scheduler Complexity | Dependency-aware scheduling runs in time linear in the size of the DAG |
| 8 | Suspension Safety | WaitNodes preserve DAG structure; resume is correct |
| 9 | Capability Locality | Nodes cannot access resources outside their granted set |
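Theorem 4 has a compact executable witness: running a linear pipeline from a mid-way checkpoint must reproduce the uncheckpointed final state. Here is a sketch using plain functions as stand-in nodes; the repo's actual mechanism uses `ResumeToken`, and none of the names below are from the repo.

```python
from typing import Any, Callable, Dict, List

State = Dict[str, Any]

def run(nodes: List[Callable[[State], State]], state: State, start: int = 0) -> State:
    """Run nodes in order from index `start`; each step commits a fresh copy."""
    for fn in nodes[start:]:
        state = fn(dict(state))
    return state

nodes = [lambda s: {**s, "a": 1},
         lambda s: {**s, "b": s["a"] + 1},
         lambda s: {**s, "c": s["b"] * 2}]

full = run(nodes, {})                 # uninterrupted run
ckpt = run(nodes[:1], {})             # checkpoint taken after node 0
resumed = run(nodes, ckpt, start=1)   # resume from the checkpoint
assert resumed == full == {"a": 1, "b": 2, "c": 4}
```

The equality holds because the checkpointed state is the complete state — nothing is hidden in node-local variables, so resuming is just running the suffix.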
Formal proofs: proof.md — Paper: paper.md
| System | What it does | What it lacks |
|---|---|---|
| Airflow / Prefect | DAG scheduling | No per-node contracts, no fail-stop |
| LangGraph / CrewAI | Agent graph execution | No formal semantics, no recovery |
| Temporal.io | Durable execution | No node-level contracts, engineering guarantees only |
| ExoFlow (OSDI 2023) | Exactly-once DAG | No typed node contracts |
| Dagster Asset Checks | Runtime data checks | No formal model, no proofs |
The gap: no existing system formally specifies and proves an execution kernel under the assumption that the planning layer is untrustworthy.
Holonomy is a concept from differential geometry: when a vector is parallel-transported along a closed loop on a curved surface, it returns rotated — the result depends on the path taken, not just the endpoints.
The same principle applies here. In a pipeline, the same start and end states can be reached by different execution paths with different intermediate effects. Path dependence is a first-class property of the system (Theorem 6), not an edge case to be eliminated.
The name reflects the architecture's core insight: execution is not just about where you start and end. It is about every step along the way.
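Path dependence (Theorem 6) can be witnessed in a few lines with two non-commuting transforms. This example is illustrative only, not taken from the repo's test suite.

```python
add10 = lambda s: {**s, "x": s["x"] + 10}   # f: x -> x + 10
double = lambda s: {**s, "x": s["x"] * 2}   # g: x -> 2x

s0 = {"x": 1}
path_fg = double(add10(s0))   # add then double: (1 + 10) * 2 = 22
path_gf = add10(double(s0))   # double then add: (1 * 2) + 10 = 12
assert path_fg["x"] == 22 and path_gf["x"] == 12   # same endpoints, different paths
```

Both executions start from the same state and run the same two nodes, yet the results differ — the order of travel, not just the set of steps, determines the outcome.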
This repository proves the properties above within the formal model. It does not claim:
- AI planner correctness (above the trust boundary)
- Full sandbox security (no process/container isolation yet)
- Global determinism under write conflicts
- Recovery with non-idempotent side effects
- Production connector reliability
holonomy/
├── runtime/
│ ├── types.py State, Predicate, Transform, exceptions
│ ├── core.py Node, ExecRecord, ResumeToken, RunResult, PipelineRuntime
│ └── helpers.py Contract and node construction utilities
├── tests/
│ ├── test_theorems.py Theorem 1–9 executable witnesses
│ └── test_edge_cases.py Boundary conditions, counterexamples, stress tests
├── examples/
│ ├── demo_basic.py Basic pipeline execution
│ └── demo_wait_resume.py Human-in-the-loop suspend/resume
├── proof.md Formal model and theorem proofs
├── paper.md Workshop position paper draft
└── main.py Run all checks
Requirements: Python >= 3.8, no third-party dependencies.
Run `python main.py`. All theorems pass. All edge cases pass. Demos complete.
| Phase | Goal | Status |
|---|---|---|
| 0 | Core Runtime — formal kernel, proofs, executable witnesses | Done |
| 1 | Freeze Layer 0 API — Node, RunResult, execution rule | Next |
| 2 | Static validation — DAG well-formedness, conflict detection | Planned |
| 3 | Recovery hardening — effect log, failure injection | Planned |
| 4 | Connector Layer — typed node wrappers for external services | Planned |
| 5 | AI Orchestrator — intent → pipeline generation | Planned |
| 6 | Security & Sandbox — process/WASM isolation | Planned |
| 7 | Ecosystem — IDE, connector marketplace, pipeline libraries | Vision |
© 2026 smkgenesis. All Rights Reserved.
This repository is made publicly visible for academic reference and priority establishment only.
Permitted: Reading, citing, and referencing this work in academic publications.
Not permitted without explicit written permission:
- Reproduction or redistribution of any part of this repository
- Modification or creation of derivative works
- Commercial use of any kind
- Integration into other software systems or products
For licensing inquiries, contact via GitHub.