Conversation

Contributor

@manaswinidas manaswinidas commented Dec 30, 2025

Description

  • Hide latency columns in the hardware configuration table according to the applied filters
  • Validated model card content changes according to the performance filters toggle
  • Some mock data changes to test the latency filter changes
Screen.Recording.2025-12-30.at.7.38.43.PM.mov

How Has This Been Tested?

  • Switch on the toggle using the temp feature flag on the console. Switch the toggle on/off to see the content changes in the validated model cards
  • Go to the performance insights tab, try changing the latency metric and percentile values, and check whether the other columns are hidden when certain filters are applied

Merge criteria:

  • All the commits have been signed-off (To pass the DCO check)
  • The commits have meaningful messages
  • Automated tests are provided as part of the PR for major new functionalities; testing instructions have been added in the PR body (for PRs involving changes that are not immediately obvious).
  • The developer has manually tested the changes and verified that the changes work.
  • Code changes follow the kubeflow contribution guidelines.
  • For first time contributors: Please reach out to the Reviewers to ensure all tests are being run, ensuring the label ok-to-test has been added to the PR.

If you have UI changes

  • The developer has added tests or explained why testing cannot be added.
  • Included any necessary screenshots or gifs if it was a UI change.
  • Verify that UI/UX changes conform to the UX guidelines for Kubeflow.

@google-oss-prow google-oss-prow bot added size/XL and removed size/L labels Dec 31, 2025
Contributor

@YuliaKrimerman YuliaKrimerman left a comment


Great job here, 2 small things:

},
filterData,
filterOptions,
isValidated, // Only fetch if validated
Contributor


Now that you use useCatalogPerformanceArtifacts with performanceViewEnabled, I think you should add isValidated && performanceViewEnabled, // Only fetch if validated AND toggle is ON here. This will ensure the requirement that "The /artifacts fetch per-card can be disabled when the toggle is off" and we will not make an unnecessary API call.
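A minimal sketch of the guard being proposed in this suggestion (it is reconsidered later in the thread); the function name and boolean parameters here are assumptions based on the comment, not the actual hook signature:

```typescript
// Hypothetical guard from this suggestion; the real hook signature may differ.
// The idea: skip the per-card /artifacts fetch unless both conditions hold.
function shouldFetchArtifacts(isValidated: boolean, performanceViewEnabled: boolean): boolean {
  // Only fetch if validated AND the performance toggle is ON
  return isValidated && performanceViewEnabled;
}

console.log(shouldFetchArtifacts(true, true));  // true
console.log(shouldFetchArtifacts(true, false)); // false
```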

Contributor Author


But we need the "View x benchmarks" hyperlink leading to performance insights tab of catalog details page even when the toggle is off for validated models

Contributor


Aha, that's a good point. I guess I was wrong to specify that we can avoid this fetch when the toggle is off ☹️

Contributor

@mturley mturley left a comment


Small question and small nit inline.

@manaswinidas we'll need to make sure we follow up after the filters are implemented on the landing page to make these cards change which latency property is being shown based on the user's latency filter selection. Technically you already could do that here if you want since that filter can be set on the details page and is available in context when rendering the cards, but it may make sense to make it part of the scope of implementing the filters. Just don't want to lose track of that detail since it was scoped as part of this story.

},
"ttft_mean": {
MetadataDoubleValue: &openapi.MetadataDoubleValue{
DoubleValue: 35.48818160947744,
Contributor


Why change these values? This is what we get from the API (I believe), and we have logic in the frontend to round it to 2 decimal places; we want to make sure that still works.
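For reference, the 2-decimal rounding mentioned here, sketched with standard JavaScript number formatting (the actual frontend helper is not shown in this thread):

```typescript
// Raw value as returned by the API (taken from the mock data above)
const raw = 35.48818160947744;

// Round to 2 decimal places for display
const rounded = Number(raw.toFixed(2));

console.log(rounded); // 35.49
```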

Comment on lines 42 to 44
const percentileSuffix = activeLatencyField.split('_').pop();
// Build the matching TPS field name (e.g., 'tps_p90', 'tps_mean')
const matchingTpsField = `${tpsPrefix}_${percentileSuffix}`;
Contributor


You should be able to use the parseLatencyFieldName and getLatencyFieldName utils here instead of splitting and joining with _ directly.
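A sketch of what the suggested utils might look like; parseLatencyFieldName and getLatencyFieldName are named in the comment, but their signatures and implementations here are assumptions, not the codebase's actual versions:

```typescript
// Hypothetical shapes for the utils named above; the real ones may differ.
type LatencyParts = { metric: string; percentile: string };

// e.g. 'ttft_p90' -> { metric: 'ttft', percentile: 'p90' }
function parseLatencyFieldName(field: string): LatencyParts {
  const idx = field.lastIndexOf('_');
  return { metric: field.slice(0, idx), percentile: field.slice(idx + 1) };
}

// e.g. ('tps', 'p90') -> 'tps_p90'
function getLatencyFieldName(metric: string, percentile: string): string {
  return `${metric}_${percentile}`;
}

// Build the matching TPS field without splitting/joining on '_' directly
const { percentile } = parseLatencyFieldName('ttft_p90');
console.log(getLatencyFieldName('tps', percentile)); // tps_p90
```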

Contributor Author

manaswinidas commented Jan 7, 2026

@manaswinidas we'll need to make sure we follow up after the filters are implemented on the landing page to make these cards change which latency property is being shown based on the user's latency filter selection. Technically you already could do that here if you want since that filter can be set on the details page and is available in context when rendering the cards, but it may make sense to make it part of the scope of implementing the filters. Just don't want to lose track of that detail since it was scoped as part of this story.

Yes, I have that on my radar. I'm addressing it in the PR that adds the filters on the landing page.

…FieldName and getLatencyFieldName util

Signed-off-by: manaswinidas <[email protected]>
Contributor

mturley commented Jan 7, 2026

/lgtm
/approve

@google-oss-prow google-oss-prow bot added the lgtm label Jan 7, 2026
@google-oss-prow

[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: mturley

The full list of commands accepted by this bot can be found here.

The pull request process is described here

Details: Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@google-oss-prow google-oss-prow bot merged commit ed7a5f3 into kubeflow:main Jan 7, 2026
25 checks passed