
[MAINTENANCE] Remove test_expectations_v3_api.py #11098


Merged: 13 commits into develop from m/gx-646/remove-v3-expect-tests, May 6, 2025

Conversation

billdirks (Contributor)

  • Description of PR changes above includes a link to an existing GitHub issue
  • PR title is prefixed with one of: [BUGFIX], [FEATURE], [DOCS], [MAINTENANCE], [CONTRIB], [MINORBUMP]
  • Code is linted - run invoke lint (uses ruff format + ruff check)
  • Appropriate tests and docs have been updated

For more information about contributing, visit our community resources.

After you submit your PR, keep the page open and monitor the statuses of the various checks made by our continuous integration process at the bottom of the page. Please fix any issues that come up and reach out on Slack if you need help. Thanks for contributing!


netlify bot commented Apr 14, 2025

Deploy Preview for niobium-lead-7998 canceled.

Latest commit: c64f4b1
Latest deploy log: https://app.netlify.com/sites/niobium-lead-7998/deploys/6818fa6aba777c000899d141


codecov bot commented Apr 14, 2025

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 80.31%. Comparing base (c2d9a48) to head (c64f4b1).
Report is 4 commits behind head on develop.

✅ All tests successful. No failed tests found.

Additional details and impacted files
@@             Coverage Diff             @@
##           develop   #11098      +/-   ##
===========================================
- Coverage    81.20%   80.31%   -0.89%     
===========================================
  Files          502      502              
  Lines        41454    41443      -11     
===========================================
- Hits         33661    33284     -377     
- Misses        7793     8159     +366     
Flag Coverage Δ
3.10 70.17% <20.00%> (-0.29%) ⬇️
3.10 athena or openpyxl or pyarrow or project or sqlite or aws_creds ?
3.10 aws_deps ?
3.10 big ?
3.10 clickhouse ?
3.10 filesystem ?
3.10 mssql ?
3.10 mysql ?
3.10 postgresql ?
3.10 spark_connect ?
3.11 70.19% <20.00%> (-0.28%) ⬇️
3.11 athena or openpyxl or pyarrow or project or sqlite or aws_creds ?
3.11 aws_deps ?
3.11 big ?
3.11 clickhouse ?
3.11 filesystem ?
3.11 mssql ?
3.11 mysql ?
3.11 postgresql ?
3.11 spark_connect ?
3.12 70.19% <20.00%> (-0.30%) ⬇️
3.12 athena or openpyxl or pyarrow or project or sqlite or aws_creds 56.94% <20.00%> (-0.21%) ⬇️
3.12 aws_deps 46.11% <20.00%> (-0.75%) ⬇️
3.12 big 54.86% <20.00%> (-0.33%) ⬇️
3.12 bigquery 49.10% <20.00%> (-0.51%) ⬇️
3.12 databricks 50.82% <20.00%> (-0.49%) ⬇️
3.12 filesystem 62.87% <20.00%> (-0.42%) ⬇️
3.12 gx-redshift 49.04% <20.00%> (-0.51%) ⬇️
3.12 mssql 49.32% <20.00%> (-2.79%) ⬇️
3.12 mysql 49.58% <20.00%> (-2.79%) ⬇️
3.12 postgresql 52.63% <100.00%> (-2.56%) ⬇️
3.12 redshift 48.99% <20.00%> (-0.52%) ⬇️
3.12 snowflake 51.60% <20.00%> (-0.49%) ⬇️
3.12 spark 54.52% <20.00%> (-3.79%) ⬇️
3.12 spark_connect 46.60% <20.00%> (-0.58%) ⬇️
3.12 trino 48.15% <20.00%> (-4.62%) ⬇️
3.9 70.21% <20.00%> (-0.29%) ⬇️
3.9 athena or openpyxl or pyarrow or project or sqlite or aws_creds 56.93% <20.00%> (-0.21%) ⬇️
3.9 aws_deps 46.13% <20.00%> (-0.75%) ⬇️
3.9 big 54.87% <20.00%> (-0.33%) ⬇️
3.9 bigquery 49.09% <20.00%> (-0.51%) ⬇️
3.9 clickhouse 41.83% <20.00%> (-1.98%) ⬇️
3.9 databricks 50.82% <20.00%> (-0.49%) ⬇️
3.9 filesystem 62.87% <20.00%> (-0.42%) ⬇️
3.9 gx-redshift 49.03% <20.00%> (-0.50%) ⬇️
3.9 mssql 49.32% <20.00%> (-2.78%) ⬇️
3.9 mysql 49.57% <20.00%> (-2.77%) ⬇️
3.9 postgresql 52.63% <100.00%> (-2.55%) ⬇️
3.9 redshift 49.00% <20.00%> (-0.50%) ⬇️
3.9 snowflake 51.60% <20.00%> (-0.49%) ⬇️
3.9 spark 54.51% <20.00%> (-3.77%) ⬇️
3.9 spark_connect 46.61% <20.00%> (-0.58%) ⬇️
3.9 trino 48.17% <20.00%> (-4.59%) ⬇️
cloud 0.00% <0.00%> (ø)
docs-basic 54.34% <20.00%> (+0.01%) ⬆️
docs-creds-needed 53.21% <20.00%> (+0.01%) ⬆️
docs-spark 52.78% <20.00%> (+0.01%) ⬆️

Flags with carried forward coverage won't be shown.


@billdirks billdirks marked this pull request as ready for review April 15, 2025 22:30
@billdirks billdirks force-pushed the m/gx-646/remove-v3-expect-tests branch from 7f572a0 to b5e7f62 Compare April 29, 2025 23:04
@@ -79,6 +79,10 @@ def test_event_identifiers(analytics_config):
assert "organization_id" not in properties


@pytest.mark.xfail(
billdirks (Contributor, Author) commented Apr 30, 2025:

This test was only passing because other tests ran first, setting up the context; it always fails when run in isolation. I will write a ticket to follow up and fix this.
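The stopgap described can be sketched with pytest's xfail marker; the test name, reason string, and body below are illustrative, not the PR's actual code:

```python
import pytest


# Illustrative sketch: a test that implicitly depends on state created
# by earlier tests in the suite. Marking it xfail documents the known
# failure in isolation without deleting the test while a proper
# fixture-based fix is ticketed.
@pytest.mark.xfail(
    reason="Relies on context set up by earlier tests; fails when run in isolation.",
    strict=False,
)
def test_event_identifiers_in_isolation():
    raise AssertionError("no context available when run alone")
```

With `strict=False`, the test reports XFAIL when it fails and XPASS when the suite's ordering happens to provide the context, so the suite stays green either way.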

@@ -0,0 +1,92 @@
{
billdirks (Contributor, Author) commented Apr 30, 2025:

I deleted all the JSON files but this one. It turns out some postgresql tests depend on this table being created; it was previously created in the delete tests' setup. I've updated this file to be postgresql-only, deleted all the test info, kept just the data, and moved the file next to the test module that uses it.

@@ -30,6 +42,30 @@
VALUES_ON_MY_FAVORITE_DAY = [8]


@pytest.fixture(scope="module", autouse=True)
def setup_module():
# setup table
billdirks (Contributor, Author) commented:

These tests rely on a table that used to be created and populated in test_expectations_v3_api.py. They would only pass if run after those tests and would fail in isolation. I've moved the table creation logic into this module, along with an edited version of the one JSON file that defined this table.

"w",
) as f:
json.dump(test_results, f, indent=2)
def test_expectations_using_expectation_definitions():
billdirks (Contributor, Author) commented:

I believe the purpose of this test was to verify a specific render for all expectations that were defined using test_expectations_v3_api.py. Since that testing strategy was hard to maintain and didn't cover all the expectations, it isn't ideal. For the purposes of this PR, I extracted all the expectations under test and moved them into BulletListContentBlock.json. This let me remove all the filtering, and I can now iterate through them and run the asserts in this test. This should be equivalent to what was being run before.
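The refactor described, loading every definition from one consolidated JSON file and iterating with no filtering, might look roughly like this; the JSON structure and the check performed are illustrative guesses, not the PR's actual code:

```python
import json
from pathlib import Path


def load_expectation_definitions(path):
    # One consolidated file (e.g. BulletListContentBlock.json in the PR)
    # replaces the old per-file filtering logic entirely.
    return json.loads(Path(path).read_text())


def check_definitions(path):
    # Run the same check for every definition instead of a filtered
    # subset; the real test asserts on rendered output.
    results = []
    for definition in load_expectation_definitions(path):
        results.append(bool(definition.get("expectation_type")))
    return results
```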

dctalbot (Contributor) commented May 2, 2025:

How do we ensure tests/render/BulletListContentBlock.json is always up to date?

(dctalbot also flagged a typo in the comment above: "not" should be "now".)

billdirks (Contributor, Author) replied:

It's already out of date, since we've moved away from this JSON testing framework. I think we should eventually delete this method and fold this test into our new expectation testing framework; I'll file a follow-up ticket. Fixing it is outside the scope of this work, since the file is already falling out of date and this PR doesn't make it worse, but it's a good callout.


)

# TODO: accommodate case where multiple datasets exist within one expectation test definition
with open(
billdirks (Contributor, Author) commented:

I've removed this since it doesn't seem to be testing anything. Maybe it tests that the results are JSON-serializable? But every result should be JSON-serializable because we called to_json_dict above, and unit tests for that behavior belong there. Also, we don't need to actually write a file to verify this.
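The last point, that serializability can be checked without writing a file, comes down to a round-trip through the json module; the result dict below is a made-up stand-in for the output of to_json_dict:

```python
import json


def assert_json_serializable(obj):
    # json.dumps raises TypeError for non-serializable values, so a
    # successful round-trip is the whole check; no temp file needed.
    assert json.loads(json.dumps(obj)) == obj


# Illustrative stand-in for a result produced by to_json_dict()
result = {"success": True, "observed_value": 8}
assert_json_serializable(result)
```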

dctalbot (Contributor) left a review comment:

I like the refactor to test_expectations_using_expectation_definitions, but I'm curious if we have a way to generate tests/render/BulletListContentBlock.json and ensure it is always current.

billdirks (Contributor, Author) commented:

I've added https://greatexpectations.atlassian.net/browse/GX-850 to track the block render test work. Adding this to the merge queue.

@billdirks billdirks added this pull request to the merge queue May 5, 2025
@github-merge-queue github-merge-queue bot removed this pull request from the merge queue due to failed status checks May 5, 2025
@billdirks billdirks added this pull request to the merge queue May 5, 2025
Merged via the queue into develop with commit 5aae981 May 6, 2025
71 of 74 checks passed
@billdirks billdirks deleted the m/gx-646/remove-v3-expect-tests branch May 6, 2025 00:04