
Fix displaying eval errors in jobset eval view #1470

Merged
Ericson2314 merged 1 commit into NixOS:master from qowoz:eval-view
Apr 11, 2025

Conversation

@zowoq
Contributor

@zowoq zowoq commented Apr 7, 2025

Closes #1453

Closes #1363 (was merged in #1462)

@Ericson2314
Member

How does this fix it, out of curiosity?

@zowoq
Contributor Author

zowoq commented Apr 10, 2025

> How does this fix it, out of curiosity?

Did you read the commit message?

Quickfix for something that annoyed me once too often.

Specifically, I'm talking about `/eval/1#tabs-errors`.

To not fetch long errors on each request, this is only done on-demand.
I.e., when the tab is opened, an iframe is requested with the errors.
This iframe uses a template for both the jobset view and the jobset-eval
view. It is differentiated by checking if `jobset` or `eval` is defined.

However, the jobset-eval view also has a `jobset` variable in its stash
which means that in both cases the `if` path was used. Since
`jobset.fetcherrormsg` isn't defined in the eval case though, you always
got an empty error.

The band-aid fix is relatively simple: swap if and else: the `eval`
variable is not defined in the stash of the jobset view, so now this is
a useful condition to decide which view we're in.

(cherry picked from commit https://git.lix.systems/lix-project/hydra/commit/70c3d75f739b184b36908a2c898332444482d1a1)
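The if/else swap described above can be sketched as a Template Toolkit fragment. This is a hypothetical illustration, not the exact Hydra template; `jobset.fetcherrormsg` comes from the commit message, while the `eval` error accessor and surrounding markup are assumed:

```tt2
[%# Before: the eval view's stash also contains `jobset`, so this
    branch was always taken, and jobset.fetcherrormsg was undefined
    when rendering an eval's errors. %]
[% IF jobset %]
  [% jobset.fetcherrormsg | html %]
[% ELSE %]
  [%# eval errors rendered here %]
[% END %]

[%# After: `eval` is only defined in the jobset-eval view's stash,
    so it reliably discriminates between the two views. %]
[% IF eval %]
  [%# eval errors rendered here %]
[% ELSE %]
  [% jobset.fetcherrormsg | html %]
[% END %]
```

The fix works because the condition now tests the variable that is unique to one view, rather than one that happens to be present in both stashes.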
@Ericson2314
Member

Oh no I missed that, sorry. Reading now.

@Ericson2314 Ericson2314 added this pull request to the merge queue Apr 11, 2025
Merged via the queue into NixOS:master with commit 5f6b075 Apr 11, 2025
1 check passed
@zowoq zowoq deleted the eval-view branch April 11, 2025 23:20


Development

Successfully merging this pull request may close these issues.

"Evaluation Errors" tab is broken for individual evals
