Environment
- KFP version:
  - backend: v2.3.0
  - frontend: v2.15.0
- KFP SDK version: 1.8.22
Steps to reproduce
1. Create a pipeline component that outputs scalar metrics following the [v1 SDK output viewer documentation](https://www.kubeflow.org/docs/components/pipelines/legacy-v1/sdk/output-viewer/#v1-sdk-writing-out-metadata-for-the-output-viewers), writing an mlpipeline-metrics artifact in the v1 format.
2. Run the pipeline successfully; scalar metrics are recorded as expected.
3. Navigate to "Compare Runs" in the Kubeflow Pipelines UI and select multiple runs to compare.
4. Open the Metrics tab to compare scalar metrics across the selected runs.
5. Observe in the browser developer console that the UI makes a request to /api/metrics (or a similar endpoint), which returns 405 Method Not Allowed.
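For reference, step 1 can be sketched roughly as follows. This is a minimal illustration of the v1 scalar-metrics format from the linked documentation, not the exact component used to hit the bug; the function name, metric names, and values are placeholders.

```python
import json

def produce_metrics(metrics_path: str = "/mlpipeline-metrics.json") -> None:
    """Write scalar metrics in the KFP v1 mlpipeline-metrics format.

    In a real v1 component this file is picked up as the
    mlpipeline-metrics output artifact; the metric names and
    values here are illustrative placeholders.
    """
    metrics = {
        "metrics": [
            {
                "name": "accuracy-score",      # metric name shown in the UI
                "numberValue": 0.92,           # scalar value
                "format": "PERCENTAGE",        # or "RAW"
            },
            {
                "name": "loss",
                "numberValue": 0.08,
                "format": "RAW",
            },
        ]
    }
    with open(metrics_path, "w") as f:
        json.dump(metrics, f)
```

A run whose component writes this artifact shows its scalars on the run's own Metrics tab, but comparing such runs triggers the failing request described in step 5.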
Expected result
The Compare Runs Metrics tab should display and compare scalar metrics (e.g., accuracy, loss) across selected runs, as it did in KFP v1.
Materials and reference
Labels
Impacted by this bug? Give it a 👍.