
[backend] Pipelines with artifact inputs compile but fail on direct invocation #12555

@lucharo

Description


Environment

  • How did you deploy Kubeflow Pipelines (KFP)? Vertex AI Pipelines (+ MRE local)
  • KFP version: 2.13.0
  • KFP SDK version: kfp 2.13.0

Steps to reproduce

#!/usr/bin/env -S uv run --script
# /// script
# requires-python = ">=3.9"
# dependencies = ["kfp==2.13.0"]
# ///
"""MRE: KFP pipelines with artifact inputs compile but can't be invoked directly."""
from kfp import compiler, dsl, local
from kfp.dsl import Dataset

@dsl.component
def process(data: dsl.Input[Dataset]) -> str:
    return "done"

@dsl.pipeline
def my_pipeline(input_data: Dataset) -> str:
    return process(data=input_data).output

if __name__ == "__main__":
    # Compilation works
    compiler.Compiler().compile(my_pipeline, "/tmp/pipeline.yaml")
    print("Compilation: SUCCESS")

    # Direct invocation fails
    local.init(runner=local.SubprocessRunner(use_venv=False))
    try:
        my_pipeline(input_data=Dataset(uri="gs://bucket/data"))
        print("Invocation: SUCCESS")
    except ValueError as e:
        print(f"Invocation: FAILED\n{e}")

Output:

Compilation: SUCCESS
Invocation: FAILED
Argument input_data must be a parameter type, such as str, int, float, bool, dict, and list. Artifact inputs are not supported.

Expected result

Pipelines with artifact inputs should be invocable directly, just as components are. As it stands, users must either duplicate the pipeline logic outside a pipeline or fall back to dsl.importer with plain string URIs, losing the type safety of the Dataset annotation.

Impacted by this bug? Give it a 👍.
