-
Friday 2024-10-18
Note that before we get to testing a release candidate for Unified, we'll need to get v3 test data onto Perlmutter and Compy too. On Chrysalis:
# Set up branch
cd ez/zppy
git status
# Check no file changes will persist when we switch branches.
git fetch upstream main
git checkout -b test_zppy_weekly_20241018 upstream/main
git log
# Last commit, from 10/15: Improve carryover dependency handling (#623)
# Set up diags to use
cd ../e3sm_diags
git checkout main
git fetch upstream
git reset --hard upstream/main
git log
# Last commit, from 10/15: make --closedcontourcmd resolution specific (#824)
mamba clean --all
# Press `y` twice
mamba env create -f conda-env/dev.yml -n e3sm_diags_20241018
conda activate e3sm_diags_20241018
pip install .
# Set up testing files
cd ../zppy
# Edit tests/integration/utils.py:
# UNIQUE_ID = "test_zppy_weekly_20241018"
# For get_chyrsalis_expansions: "diags_environment_commands": "source /home/ac.forsyth2/miniconda3/etc/profile.d/conda.sh; conda activate e3sm_diags_20241018",
# Keep this as is: generate_cfgs(unified_testing=False, dry_run=False)
# Set up zppy environment
mamba clean --all
mamba env create -f conda/dev.yml -n zppy_dev_weekly_20241018
conda activate zppy_dev_weekly_20241018
pip install .
# Run unit tests
python -u -m unittest tests/test_*.py
# All 22 tests pass
# Run zppy to produce actual results to compare in the integration tests
python tests/integration/utils.py
zppy -c tests/integration/generated/test_weekly_comprehensive_v3_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_comprehensive_v2_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_bundles_chrysalis.cfg # Runs 1st part of bundles cfg
# Once those all finish:
zppy -c tests/integration/generated/test_weekly_bundles_chrysalis.cfg # Runs 2nd part of bundles cfg
# Once that finishes:
# Check output
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v3_output/test_zppy_weekly_20241018/v3.LR.historical_0051/post/scripts/
grep -v "OK" *status
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v2_output/test_zppy_weekly_20241018/v2.LR.historical_0201/post/scripts
grep -v "OK" *status
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_bundles_output/test_zppy_weekly_20241018/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
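# Possible consolidation (a sketch, not part of the current workflow): loop over the three
# scripts directories above and report any non-OK status files in one pass.
for d in \
  /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v3_output/test_zppy_weekly_20241018/v3.LR.historical_0051/post/scripts \
  /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v2_output/test_zppy_weekly_20241018/v2.LR.historical_0201/post/scripts \
  /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_bundles_output/test_zppy_weekly_20241018/v3.LR.historical_0051/post/scripts
do
  echo "=== ${d}"
  grep -v "OK" "${d}"/*status
done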
# Run integration tests
python -u -m unittest tests/integration/test_*.py
This gets us these results: We can see the diff is from the formatting change introduced in #627, so this is expected. Let's update the expected results. That gives: ✅ This week's testing of
-
Friday 2024-10-25
Commits since 2024-10-18:
# Set up branch
cd ~/ez/zppy
git status
# Check no file changes will persist when we switch branches.
git fetch upstream main
git checkout -b test_zppy_weekly_20241025 upstream/main
git log
# Last commit, from 10/15: Improve carryover dependency handling (#623)
# Set up diags to use
cd ../e3sm_diags
git checkout main
git status
# Check no file changes will persist when we switch branches.
git fetch upstream
git reset --hard upstream/main
git log
# Last commit, from 10/23: Fixing African easterly wave density plots in TC analysis (#851)
mamba clean --all --y
mamba env create -f conda-env/dev.yml -n e3sm_diags_20241025
conda activate e3sm_diags_20241025
pip install .
# Set up testing files
cd ../zppy
# NOTE: CURRENTLY MANUAL STEP -- could potentially pass in arguments to utils.py? (See the sed sketch below.)
# Edit tests/integration/utils.py:
# UNIQUE_ID = "test_zppy_weekly_20241025"
# For get_chyrsalis_expansions: "diags_environment_commands": "source /home/ac.forsyth2/miniconda3/etc/profile.d/conda.sh; conda activate e3sm_diags_20241025",
# Keep this as is: generate_cfgs(unified_testing=False, dry_run=False)
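# Possible semi-automation of the manual edit above (a sketch, not an existing step; it assumes
# UNIQUE_ID is assigned on a single top-level line of utils.py, as implied by the edit above):
sed -i 's/^UNIQUE_ID = .*/UNIQUE_ID = "test_zppy_weekly_20241025"/' tests/integration/utils.py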
# Set up zppy environment
mamba clean --all --y
mamba env create -f conda/dev.yml -n zppy_dev_weekly_20241025 # This command took a while
conda activate zppy_dev_weekly_20241025
pip install .
# Run unit tests
python -u -m unittest tests/test_*.py
# All 22 tests pass
# Run zppy to produce actual results to compare in the integration tests
python tests/integration/utils.py
# That prints:
# CFG FILES HAVE BEEN GENERATED FROM TEMPLATES WITH THESE SETTINGS:
# UNIQUE_ID=test_zppy_weekly_20241025
# unified_testing=False
# diags_environment_commands=source /home/ac.forsyth2/miniconda3/etc/profile.d/conda.sh; conda activate e3sm_diags_20241025
# e3sm_to_cmip_environment_commands=
# environment_commands=
zppy -c tests/integration/generated/test_weekly_comprehensive_v3_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_comprehensive_v2_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_bundles_chrysalis.cfg # Runs 1st part of bundles cfg
# NOTE: CURRENTLY MANUAL STEP -- check that those 3 zppy runs have finished. Try the bash `wait` command? Just keep running `squeue` every hour until there are no jobs left?
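# One way the wait could be automated (a sketch, not part of the current workflow):
# poll squeue for this user's remaining jobs and return once the queue is empty.
while [ "$(squeue -u "${USER}" -h | wc -l)" -gt 0 ]; do
  sleep 600  # check every 10 minutes
done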
# Once those all finish: (took about 1h15m)
zppy -c tests/integration/generated/test_weekly_bundles_chrysalis.cfg # Runs 2nd part of bundles cfg
# NOTE: CURRENTLY MANUAL STEP -- check that that zppy run has finished. Try the bash `wait` command? Just keep running `squeue` every hour until there are no jobs left?
# Once that finishes: (took about 10m)
# Check output
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v3_output/test_zppy_weekly_20241025/v3.LR.historical_0051/post/scripts/
grep -v "OK" *status
# e3sm_diags_lnd_monthly_mvm_lnd_model_vs_model_1987-1988_vs_1985-1986.status:ERROR (1)
cat e3sm_diags_lnd_monthly_mvm_lnd_model_vs_model_1987-1988_vs_1985-1986.o614397
# cp: cannot stat '/lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v3_output/test_zppy_weekly_20241025/v3.LR.historical_0051/post/lnd/180x360_aave/clim/2yr/v3.LR.historical_0051_*_1987??_1988??_climo.nc': No such file or directory
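# A possible follow-up check (sketch): list what the land climo directory actually contains,
# to see whether the 2yr climatology files were generated at all.
ls /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v3_output/test_zppy_weekly_20241025/v3.LR.historical_0051/post/lnd/180x360_aave/clim/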
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v2_output/test_zppy_weekly_20241025/v2.LR.historical_0201/post/scripts
grep -v "OK" *status
# e3sm_diags_lnd_monthly_mvm_lnd_model_vs_model_1982-1983_vs_1980-1981.status:ERROR (1)
cat e3sm_diags_lnd_monthly_mvm_lnd_model_vs_model_1982-1983_vs_1980-1981.o614424
# cp: cannot stat '/lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v2_output/test_zppy_weekly_20241025/v2.LR.historical_0201/post/lnd/180x360_aave/clim/2yr/v2.LR.historical_0201_*_1982??_1983??_climo.nc': No such file or directory
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_bundles_output/test_zppy_weekly_20241025/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# Run integration tests
cd ~/ez/zppy
python -u -m unittest tests/integration/test_*.py
That gives a few failures. Let's look at the image check failures. Let's try the directory paths: clearly, image check failures aren't being routed to the right place.
-
Friday 2024-11-01
Commits since 2024-10-25:
# Set up branch
cd ~/ez/zppy
git status
# Check no file changes will persist when we switch branches.
git fetch upstream main
git checkout -b test_zppy_weekly_20241101 upstream/main
git log
# Last commit, from 10/30: Updates to diags handling (#633)
# Set up diags to use
cd ../e3sm_diags
git checkout main
git status
# Check no file changes will persist when we switch branches.
git fetch upstream
git reset --hard upstream/main
git log
# Last commit, from 10/29: add_TCO_60S60N to default lat-lon (#879)
mamba clean --all --y
mamba env create -f conda-env/dev.yml -n e3sm_diags_20241101
conda activate e3sm_diags_20241101
pip install .
# Set up testing files
cd ../zppy
# NOTE: CURRENTLY MANUAL STEP -- could potentially pass in arguments to utils.py?
# Edit tests/integration/utils.py:
# UNIQUE_ID = "test_zppy_weekly_20241101"
# For get_chyrsalis_expansions: "diags_environment_commands": "source /home/ac.forsyth2/miniconda3/etc/profile.d/conda.sh; conda activate e3sm_diags_20241101",
# Keep this as is: generate_cfgs(unified_testing=False, dry_run=False)
# Set up zppy environment
mamba clean --all --y
mamba env create -f conda/dev.yml -n zppy_dev_weekly_20241101 # This command took a while
conda activate zppy_dev_weekly_20241101
pip install .
# Run unit tests
python -u -m unittest tests/test_*.py
# All 23 tests pass (increased from 22)
# Run zppy to produce actual results to compare in the integration tests
python tests/integration/utils.py
# That prints:
# CFG FILES HAVE BEEN GENERATED FROM TEMPLATES WITH THESE SETTINGS:
# UNIQUE_ID=test_zppy_weekly_20241101
# unified_testing=False
# diags_environment_commands=source /home/ac.forsyth2/miniconda3/etc/profile.d/conda.sh; conda activate e3sm_diags_20241101
# e3sm_to_cmip_environment_commands=
# environment_commands=
zppy -c tests/integration/generated/test_weekly_comprehensive_v3_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_comprehensive_v2_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_bundles_chrysalis.cfg # Runs 1st part of bundles cfg
# NOTE: CURRENTLY MANUAL STEP -- check that those 3 zppy runs have finished. Try the bash `wait` command? Just keep running `squeue` every hour until there are no jobs left?
# Once those all finish: (took about 1h15m)
zppy -c tests/integration/generated/test_weekly_bundles_chrysalis.cfg # Runs 2nd part of bundles cfg
# NOTE: CURRENTLY MANUAL STEP -- check that that zppy run has finished. Try the bash `wait` command? Just keep running `squeue` every hour until there are no jobs left?
# Once that finishes: (took about 10m)
# Check output
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v3_output/test_zppy_weekly_20241101/v3.LR.historical_0051/post/scripts/
grep -v "OK" *status
# No errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v2_output/test_zppy_weekly_20241101/v2.LR.historical_0201/post/scripts
grep -v "OK" *status
# No errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_bundles_output/test_zppy_weekly_20241101/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# No errors
# Run integration tests
cd ~/ez/zppy
python -u -m unittest tests/integration/test_*.py
That gives: The diff is from the template file change in https://github.com/E3SM-Project/zppy/pull/633/files#diff-fbd1002ca5f9034c321e896884792a69fc7e4a9fc14e826a3954dc81a3d6d76d, so this is expected. Let's update the expected results. That test passes now. As for the image check failures (attempted to format this hierarchically, but only semi-successfully):
So, all image check failures are accounted for. Let's update the expected results. After about 20 minutes, that gives: All expected results have been updated. ✅ This week's testing of
-
Friday 2024-11-22
This is the first Weekly test since the merging of the major refactor splitting zppy into zppy itself (workflow coordination) and zppy-interfaces ("last-mile" code improving E3SM-specific usability of external packages).
Check commit history
Commits since 2024-11-01 (last Weekly test of zppy):
However, I'm going to consciously test with the e3sm_diags code from the last round of testing (see the note in the e3sm_diags setup below).
Set up branch
# Set up branch
cd ~/ez/zppy
git status
# Check no file changes will persist when we switch branches.
git fetch upstream main
git checkout -b test_zppy_weekly_20241122 upstream/main
git log
# Last commit, from 11/22: zppy-interfaces refactor (#642)
Set up zppy-interfaces env
cd ~/ez/zppy-interfaces
git status
# Check no file changes will persist when we switch branches.
git fetch upstream main
git checkout -b test_zi_weekly_20241122 upstream/main
git log
# Last commit, from 11/22: Initial implementation (#1)
conda clean --all --y
conda env create -f conda/dev.yml -n zi_dev_weekly_20241122
conda activate zi_dev_weekly_20241122
pip install .
Set up e3sm_diags env
cd ~/ez/e3sm_diags
git status
# Check no file changes will persist when we switch branches.
git fetch upstream
git checkout main
git reset --hard upstream/main
git log
# Last commit, from 11/20: Release 2.12.1 (#893)
# NOTE: As mentioned earlier, let's use the Diags code that was used for the last testing
git reset --hard 3f5b03627aef703b4f7d8463cd5faed3c0806566
git log
# Last commit, from 10/29: add_TCO_60S60N to default lat-lon (#879)
conda clean --all --y
conda env create -f conda-env/dev.yml -n e3sm_diags_from_20241029
conda activate e3sm_diags_from_20241029
pip install .
Set up zppy env
cd ~/ez/zppy
conda clean --all --y
conda env create -f conda/dev.yml -n zppy_dev_weekly_20241122
conda activate zppy_dev_weekly_20241122
pip install .
Unit tests for zppy-interfaces
cd ~/ez/zppy-interfaces
conda activate zi_dev_weekly_20241122
pytest tests/global_time_series/test_*.py
# 7 passed in 25.86s
Unit tests for zppy
cd ~/ez/zppy
conda activate zppy_dev_weekly_20241122
pytest tests/test_*.py
# 23 passed in 0.63s
Integration testing for zppy
# NOTE: CURRENTLY MANUAL STEP -- could potentially pass in arguments to utils.py?
# Edit tests/integration/utils.py:
# UNIQUE_ID = "test_zppy_weekly_20241122"
# For get_chyrsalis_expansions:
# "diags_environment_commands": "source /gpfs/fs1/home/ac.forsyth2/miniforge3/etc/profile.d/conda.sh; conda activate e3sm_diags_from_20241029",
# "global_time_series_environment_commands": "source /gpfs/fs1/home/ac.forsyth2/miniforge3/etc/profile.d/conda.sh; conda activate zi_dev_weekly_20241122",
# Keep this as is: generate_cfgs(unified_testing=False, dry_run=False)
# Run zppy to produce actual results to compare in the integration tests
python tests/integration/utils.py
# That prints:
# CFG FILES HAVE BEEN GENERATED FROM TEMPLATES WITH THESE SETTINGS:
# UNIQUE_ID=test_zppy_weekly_20241122
# unified_testing=False
# diags_environment_commands=source /gpfs/fs1/home/ac.forsyth2/miniforge3/etc/profile.d/conda.sh; conda activate e3sm_diags_from_20241029
# e3sm_to_cmip_environment_commands=
# environment_commands=
# Reminder: `e3sm_to_cmip_environment_commands=''` => the environment of the `ts` task will be used
# Reminder: `environment_commands=''` => the latest E3SM Unified environment will be used
# NOTE: Created https://github.com/E3SM-Project/zppy/issues/643 to also print out `global_time_series_environment_commands`
zppy -c tests/integration/generated/test_weekly_comprehensive_v3_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_comprehensive_v2_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_bundles_chrysalis.cfg # Runs 1st part of bundles cfg
# NOTE: CURRENTLY MANUAL STEP -- check that those 3 zppy runs have finished. Try the bash `wait` command? Just keep running `squeue` every hour until there are no jobs left?
# Once those all finish:
zppy -c tests/integration/generated/test_weekly_bundles_chrysalis.cfg # Runs 2nd part of bundles cfg
# NOTE: CURRENTLY MANUAL STEP -- check that that zppy run has finished. Try the bash `wait` command? Just keep running `squeue` every hour until there are no jobs left?
# Once that finishes:
# Check output
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v3_output/test_zppy_weekly_20241122/v3.LR.historical_0051/post/scripts/
grep -v "OK" *status
# No errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v2_output/test_zppy_weekly_20241122/v2.LR.historical_0201/post/scripts
grep -v "OK" *status
# No errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_bundles_output/test_zppy_weekly_20241122/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# No errors
# Run integration tests
cd ~/ez/zppy
python -u -m unittest tests/integration/test_*.py
# This needs to be converted to pytest still -- see https://github.com/E3SM-Project/zppy/issues/644
That gives:
Errors
ls tests/integration/test_*.py
# tests/integration/test_bash_generation.py tests/integration/test_defaults.py tests/integration/test_weekly.py
# tests/integration/test_campaign.py tests/integration/test_last_year.py
python -u -m unittest tests/integration/test_bash_generation.py
# FAILED (failures=1)
# The filled-out global-time-series templates have of course changed
# `inclusions/` has been added to relevant paths
# We just need to update the expected results
python -u -m unittest tests/integration/test_campaign.py
# FAILED (failures=5)
# `inclusions/` has been added to relevant paths
# We just need to update the expected results
python -u -m unittest tests/integration/test_defaults.py
# FAILED (failures=1)
# The filled-out global-time-series templates have of course changed
# `inclusions/` has been added to relevant paths
# We just need to update the expected results
python -u -m unittest tests/integration/test_last_year.py
# OK
For Weekly tests, let's look at the image check failures
These issues should be simple to fix. Next week, I'll address them and then re-run the Weekly test suite. Specifically:
-
Note that #520 is to automate the
-
NOTE: The weekly tests' expected results were updated as part of the merging of #650 today (2024-12-18). All tests pass.
-
2025-01-14 testing for v3.0.0rc1
# Set up branch
cd ~/ez/zppy
git status
# Check no file changes will persist when we switch branches.
git fetch upstream main
git checkout -b test_zppy_weekly_20250114 upstream/main
git log
# Last commit, from 1/14: Update PR template (#658)
# Set up and test zppy-interfaces env
cd ~/ez/zppy-interfaces
git status
# Check no file changes will persist when we switch branches.
git fetch upstream main
git checkout -b test_zi_weekly_20250114 upstream/main
git log
# Last commit, from 1/14: Update PR template (#12)
conda clean --all --y
conda env create -f conda/dev.yml -n zi_dev_weekly_20250114
conda activate zi_dev_weekly_20250114
pip install .
pytest tests/unit/global_time_series/test_*.py
# 10 passed in 20.28s
# Set up e3sm_diags env
cd ~/ez/e3sm_diags
git status
# Check no file changes will persist when we switch branches.
git fetch upstream
# https://github.com/E3SM-Project/zppy/pull/651 (zppy using CDAT-migrated Diags) hasn't merged yet
# Because https://github.com/E3SM-Project/e3sm_diags/pull/907 (fixes for CDAT-migrated Diags) hasn't merged yet.
# So, we can't use the latest E3SM Diags.
# https://github.com/E3SM-Project/e3sm_diags/pull/902 (CDAT-migrated Diags) has 42 commits
# The first is:
# CDAT Migration Phase 2: Refactor core utilities and lat_lon set (#677) c090d34ae96dcf43159174ead5c7688385cc70fb
# From 2024-12-04
# The last commit before that was:
# Release 2.12.1 (#893) ca41b0e5d913610c88410928951f1ed11c75663f
# From 2024-11-20
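# (One way to confirm this (a sketch): ask git for the commit immediately preceding the
# first CDAT-migration commit; it should show ca41b0e "Release 2.12.1 (#893)".)
git log --oneline -1 c090d34ae96dcf43159174ead5c7688385cc70fb~1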
git checkout -b diags_pre_cdat_migration ca41b0e5d913610c88410928951f1ed11c75663f
git log
# Confirmed last commit, from 11/20: Release 2.12.1 (#893)
conda clean --all --y
conda env create -f conda-env/dev.yml -n e3sm_diags_pre_cdat_migration
conda activate e3sm_diags_pre_cdat_migration
pip install .
# Set up zppy env
cd ~/ez/zppy
conda clean --all --y
conda env create -f conda/dev.yml -n zppy_dev_weekly_20250114
conda activate zppy_dev_weekly_20250114
pip install .
# Unit tests for zppy
pytest tests/test_*.py
# 23 passed, 1 warning in 0.40s
# (DeprecationWarning: invalid escape sequence \.)
# Integration testing for zppy
# Edit tests/integration/utils.py:
# UNIQUE_ID = "test_zppy_weekly_20250114"
# "diags_environment_commands": "source /gpfs/fs1/home/ac.forsyth2/miniforge3/etc/profile.d/conda.sh; conda activate e3sm_diags_pre_cdat_migration",
# "global_time_series_environment_commands": "source /gpfs/fs1/home/ac.forsyth2/miniforge3/etc/profile.d/conda.sh; conda activate zi_dev_weekly_20250114",
# Keep this as-is: generate_cfgs(unified_testing=False, dry_run=False)
# Run zppy to produce actual results to compare in the integration tests
python tests/integration/utils.py
# That prints:
# CFG FILES HAVE BEEN GENERATED FROM TEMPLATES WITH THESE SETTINGS:
# UNIQUE_ID=test_zppy_weekly_20250114
# unified_testing=False
# diags_environment_commands=source /gpfs/fs1/home/ac.forsyth2/miniforge3/etc/profile.d/conda.sh; conda activate e3sm_diags_pre_cdat_migration
# global_time_series_environment_commands=source /gpfs/fs1/home/ac.forsyth2/miniforge3/etc/profile.d/conda.sh; conda activate zi_dev_weekly_20250114
# environment_commands=
# Reminder: `environment_commands=''` => the latest E3SM Unified environment will be used
zppy -c tests/integration/generated/test_weekly_comprehensive_v3_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_comprehensive_v2_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_bundles_chrysalis.cfg # Runs 1st part of bundles cfg
# Wait to finish
zppy -c tests/integration/generated/test_weekly_bundles_chrysalis.cfg # Runs 2nd part of bundles cfg
# Wait to finish
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v3_output/test_zppy_weekly_20250114/v3.LR.historical_0051/post/scripts/
grep -v "OK" *status
# No errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v2_output/test_zppy_weekly_20250114/v2.LR.historical_0201/post/scripts
grep -v "OK" *status
# No errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_bundles_output/test_zppy_weekly_20250114/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# No errors
cd ~/ez/zppy
ls tests/integration/test_*.py
pytest tests/integration/test_bash_generation.py
# 1 failed in 1.30s
# Diffs are expected
pytest tests/integration/test_campaign.py
# 3 failed, 3 passed in 2.33s
# Diffs are expected
pytest tests/integration/test_defaults.py
# 1 failed in 1.25s
# Diffs are expected
pytest tests/integration/test_last_year.py
# 1 passed in 0.27s
pytest tests/integration/test_weekly.py
# FAILED tests/integration/test_weekly.py::test_comprehensive_v2_images - AssertionError
# FAILED tests/integration/test_weekly.py::test_comprehensive_v3_images - AssertionError
# 2 failed, 3 passed in 1290.08s (0:21:30)
# test_comprehensive_v2_images
# Total number of images checked: 4376
# Missing images: 2
# e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/qbo/QBO-model-vs-model/qbo_diags.png
# e3sm_diags/atm_monthly_180x360_aave/model_vs_obs_1982-1983/qbo/QBO-ERA-Interim/qbo_diags.png
# test_comprehensive_v3_images
# Total number of images checked: 5216
# Missing images: 2
# e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/qbo/QBO-model-vs-model/qbo_diags.png
# e3sm_diags/atm_monthly_180x360_aave/model_vs_obs_1987-1988/qbo/QBO-ERA-Interim/qbo_diags.png
# Mismatched images: 14
# global_time_series/global_time_series_1985-1995_results/v3.LR.historical_0051_s_lnd_1.png
# global_time_series/global_time_series_1985-1995_results/v3.LR.historical_0051_glb_lnd_3.png
# global_time_series/global_time_series_1985-1995_results/v3.LR.historical_0051_s_lnd_3.png
# global_time_series/global_time_series_1985-1995_results/v3.LR.historical_0051_glb_lnd_0.png
# global_time_series/global_time_series_1985-1995_results/v3.LR.historical_0051_glb_lnd_2.png
# global_time_series/global_time_series_1985-1995_results/v3.LR.historical_0051_n_lnd_1.png
# global_time_series/global_time_series_1985-1995_results/v3.LR.historical_0051_s_original.png
# global_time_series/global_time_series_1985-1995_results/v3.LR.historical_0051_s_lnd_0.png
# global_time_series/global_time_series_1985-1995_results/v3.LR.historical_0051_n_lnd_0.png
# global_time_series/global_time_series_1985-1995_results/v3.LR.historical_0051_s_lnd_2.png
# global_time_series/global_time_series_1985-1995_results/v3.LR.historical_0051_glb_lnd_1.png
# global_time_series/global_time_series_1985-1995_results/v3.LR.historical_0051_n_original.png
# global_time_series/global_time_series_1985-1995_results/v3.LR.historical_0051_n_lnd_2.png
# global_time_series/global_time_series_1985-1995_results/v3.LR.historical_0051_n_lnd_3.png
# Let's look at the image diffs
# https://web.lcrc.anl.gov/public/e3sm/diagnostic_output/ac.forsyth2/zppy_weekly_comprehensive_v2_www/test_zppy_weekly_20250114/v2.LR.historical_0201/image_check_failures_comprehensive_v2/ is empty
# https://web.lcrc.anl.gov/public/e3sm/diagnostic_output/ac.forsyth2/zppy_weekly_comprehensive_v3_www/test_zppy_weekly_20250114/v3.LR.historical_0051/image_check_failures_comprehensive_v3/global_time_series/ has diffs
# The diffs appear to be due to the unit changes introduced by zppy_interfaces/global_time_series/zppy_land_fields.csv
# What about the missing E3SM images though?
# QBO isn't showing up on:
# https://web.lcrc.anl.gov/public/e3sm/diagnostic_output/ac.forsyth2/zppy_weekly_comprehensive_v2_www/test_zppy_weekly_20250114/v2.LR.historical_0201/e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/viewer/
# https://web.lcrc.anl.gov/public/e3sm/diagnostic_output/ac.forsyth2/zppy_weekly_comprehensive_v2_www/test_zppy_weekly_20250114/v2.LR.historical_0201/e3sm_diags/atm_monthly_180x360_aave/model_vs_obs_1982-1983/viewer/
# https://web.lcrc.anl.gov/public/e3sm/diagnostic_output/ac.forsyth2/zppy_weekly_comprehensive_v3_www/test_zppy_weekly_20250114/v3.LR.historical_0051/e3sm_diags/atm_monthly_180x360_aave/model_vs_obs_1987-1988/viewer/
# https://web.lcrc.anl.gov/public/e3sm/diagnostic_output/ac.forsyth2/zppy_weekly_comprehensive_v3_www/test_zppy_weekly_20250114/v3.LR.historical_0051/e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/viewer/
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v3_output/test_zppy_weekly_20250114/v3.LR.historical_0051/post/scripts
grep -in qbo e3sm_diags_atm_monthly_180x360_aave_model_vs_obs_1987-1988.o665758
# 1642:2025-01-14 15:38:56,458 [ERROR]: core_parameter.py(_run_diag:269) >> Error in e3sm_diags.driver.qbo_driver
# 1646: File "/gpfs/fs1/home/ac.forsyth2/miniforge3/envs/e3sm_diags_pre_cdat_migration/lib/python3.10/site-packages/e3sm_diags/driver/qbo_driver.py", line 261, in run_diag
# 1648: File "/gpfs/fs1/home/ac.forsyth2/miniforge3/envs/e3sm_diags_pre_cdat_migration/lib/python3.10/site-packages/e3sm_diags/driver/qbo_driver.py", line 170, in get_psd_from_wavelet
# Traceback (most recent call last):
# File "/gpfs/fs1/home/ac.forsyth2/miniforge3/envs/e3sm_diags_pre_cdat_migration/lib/python3.10/site-packages/e3sm_diags/parameter/core_parameter.py", line 266, in _run_diag
# single_result = module.run_diag(self)
# File "/gpfs/fs1/home/ac.forsyth2/miniforge3/envs/e3sm_diags_pre_cdat_migration/lib/python3.10/site-packages/e3sm_diags/driver/qbo_driver.py", line 261, in run_diag
# test["wave_period"], test_wavelet = get_psd_from_wavelet(test_detrended_data)
# File "/gpfs/fs1/home/ac.forsyth2/miniforge3/envs/e3sm_diags_pre_cdat_migration/lib/python3.10/site-packages/e3sm_diags/driver/qbo_driver.py", line 170, in get_psd_from_wavelet
# cwtmatr = scipy.signal.cwt(data, scipy.signal.morlet2, widths=widths, w=deg)
# AttributeError: module 'scipy.signal' has no attribute 'cwt'
grep -in cwt e3sm_diags*.o*
# e3sm_diags_atm_monthly_180x360_aave_model_vs_obs_1987-1988.o665758:1649: cwtmatr = scipy.signal.cwt(data, scipy.signal.morlet2, widths=widths, w=deg)
# e3sm_diags_atm_monthly_180x360_aave_model_vs_obs_1987-1988.o665758:1650:AttributeError: module 'scipy.signal' has no attribute 'cwt'
# e3sm_diags_atm_monthly_180x360_aave_mvm_model_vs_model_1987-1988_vs_1985-1986.o665759:2297: cwtmatr = scipy.signal.cwt(data, scipy.signal.morlet2, widths=widths, w=deg)
# e3sm_diags_atm_monthly_180x360_aave_mvm_model_vs_model_1987-1988_vs_1985-1986.o665759:2298:AttributeError: module 'scipy.signal' has no attribute 'cwt'
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v2_output/test_zppy_weekly_20250114/v2.LR.historical_0201/post/scripts
grep -in cwt e3sm_diags*.o*
# e3sm_diags_atm_monthly_180x360_aave_model_vs_obs_1982-1983.o665789:1790: cwtmatr = scipy.signal.cwt(data, scipy.signal.morlet2, widths=widths, w=deg)
# e3sm_diags_atm_monthly_180x360_aave_model_vs_obs_1982-1983.o665789:1791:AttributeError: module 'scipy.signal' has no attribute 'cwt'
# e3sm_diags_atm_monthly_180x360_aave_mvm_model_vs_model_1980-1981_vs_1980-1981.o665790:3483: cwtmatr = scipy.signal.cwt(data, scipy.signal.morlet2, widths=widths, w=deg)
# e3sm_diags_atm_monthly_180x360_aave_mvm_model_vs_model_1980-1981_vs_1980-1981.o665790:3484:AttributeError: module 'scipy.signal' has no attribute 'cwt'
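# A quick sanity check (sketch): scipy.signal.cwt was removed in newer SciPy releases, so
# confirming the SciPy version installed in the diags environment would pin this down.
conda activate e3sm_diags_pre_cdat_migration
python -c "import scipy; print(scipy.__version__)"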
-
Thread for E3SM Unified 1.11.0 testing
Where is testing data currently?
2025-01-21
source /lcrc/soft/climate/e3sm-unified/test_e3sm_unified_1.11.0rc2_chrysalis.sh
conda list
conda list | grep -E "zppy-interfaces|zppy|zstash"
# zppy 3.0.0rc1 pyh51c0ceb_0 conda-forge/label/zppy_dev
# zppy-interfaces 0.1.1 pyhd8ed1ab_0 conda-forge
# zstash 1.4.4rc1 pyh9caca29_0 conda-forge/label/zstash_dev
To do:
-
Testing zppy using Unified 1.11.0rc13 on Chrysalis
2025-03-13
I ran the full integration test suite on Chrysalis. (At the moment, I still have 12 jobs waiting on Perlmutter, and while Compy has finished running jobs, I haven't had a chance to run the tests there.) All unit tests pass. Excluding the weekly tests, the remaining integration-test failures are expected diffs. To see all the steps I did to test on Chrysalis, expand this section:
Testing steps
# 2025-03-13
source /lcrc/soft/climate/e3sm-unified/test_e3sm_unified_1.11.0rc13_chrysalis.sh
conda list | grep -E "e3sm_diags|zppy-interfaces|zppy"
# e3sm_diags 3.0.0rc4 pyh2ed6286_0 conda-forge/label/e3sm_diags_dev
# zppy 3.0.0rc5 pyh51c0ceb_0 conda-forge/label/zppy_dev
# zppy-interfaces 0.1.1 pyhd8ed1ab_0 conda-forge
# Unified rc10->rc13 ===> e3sm_diags: rc3->rc4, zppy: rc3->rc5, zppy-interfaces: no change
# The last important commit for...
# https://github.com/E3SM-Project/e3sm_diags/commits/main -- 0 important merges since 3.0.0rc4
# https://github.com/E3SM-Project/zppy-interfaces/commits/main -- 0 important merges since 0.1.1
# https://github.com/E3SM-Project/zppy/commits/main -- 0 important merges since 3.0.0rc5
cd ~/ez/zppy
git status
# Make sure no changes will persist
git fetch upstream main
git checkout -b test_unified_rc13_20250313 upstream/main
git log
# Latest commit, from 3/11: Bump to 3.0.0rc5 (#691)
# Unit tests for zppy
pytest tests/test_*.py
# 25 passed, 1 warning in 0.81s
# /gpfs/fs1/home/ac.forsyth2/ez/zppy/zppy/utils.py:191: DeprecationWarning: invalid escape sequence '\.'
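# (Aside, an assumption about the fix rather than something from this run: the DeprecationWarning
# above typically comes from a regex written as a plain string containing "\."; making it a raw
# string, e.g. r"\.", in zppy/utils.py would silence it.)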
# Edit tests/integration/utils.py
# UNIQUE_ID = "test_unified_rc13_20250313"
# "diags_environment_commands": "",
# "environment_commands_test": "source /lcrc/soft/climate/e3sm-unified/test_e3sm_unified_1.11.0rc13_chrysalis.sh",
# "global_time_series_environment_commands": "",
# Set: generate_cfgs(unified_testing=True, dry_run=False)
# Keep:
# "user_input_v2": "/lcrc/group/e3sm/ac.forsyth2/",
# "user_input_v3": "/lcrc/group/e3sm2/ac.wlin/",
python tests/integration/utils.py
git add -A # To make it clear if there are later changes
zppy -c tests/integration/generated/test_weekly_comprehensive_v3_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_comprehensive_v2_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_bundles_chrysalis.cfg # Runs 1st part of bundles cfg
zppy -c tests/integration/generated/test_weekly_bundles_chrysalis.cfg # Runs 2nd part of bundles cfg
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v3_output/test_unified_rc13_20250313/v3.LR.historical_0051/post/scripts/
grep -v "OK" *status
# No errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v2_output/test_unified_rc13_20250313/v2.LR.historical_0201/post/scripts/
grep -v "OK" *status
# No errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_bundles_output/test_unified_rc13_20250313/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# No errors
cd ~/ez/zppy
ls tests/integration/test_*.py
pytest tests/integration/test_bash_generation.py
# 0/1 passed (diffs are expected)
pytest tests/integration/test_campaign.py
# 1/6 passed (diffs are expected)
pytest tests/integration/test_defaults.py
# 0/1 passed (diffs are expected)
pytest tests/integration/test_last_year.py
# 1/1 passed
pytest tests/integration/test_weekly.py
# 0/5 passed
cd /home/ac.forsyth2/ez/zppy/tests/integration
# Copy over image_diff_grid.py from https://github.com/E3SM-Project/zppy/pull/685
# Make some edits for use here
time python image_diff_grid.py
# real 13m42.580s
# Needed to make some more edits to get all PDFs to actually work
# Each PDF needs to be created as a distinct python call
# Otherwise, only the first generates
# run `python image_diff_grid.py` for each set of diffs individually
The rest of this message will concern the image check failures.
Image counts, table form:
Challenges to easily displaying test output
Please read this section if you are curious about the difficulties in compiling test results and displaying them in any sort of human-friendly manner.
Challenges compiling and displaying results
First and foremost, we usually do not have so many diffs to work through; I think we do this time because the CDAT migration touched every part of e3sm_diags. In any case, the sheer number of diffs has made combing through test results quite time consuming. I have tried to automate some of this, such as #685 for creating high-level overviews of diffs. More automation is possible, but there is always the tradeoff between just getting something done for this particular round of testing vs. producing something more robust that can be used many times in the future.
e3sm_diags
@chengzhuzhang @tomvothecoder Please review this section. If the diffs are expected, I can go ahead and update the expected results for e3sm_diags.
Missing images
test_comprehensive_v2_e3sm_diags_images: 81 missing
test_comprehensive_v3_e3sm_diags_images: 102 missing
Mismatched images
I did a very tedious visual inspection of the diff PDFs to try to make it as clear as possible what the diffs are. See tables below. These use the image diff grid with 5 rows per page.
Explanation of column titles (columns 5, 7, 8, 9 are probably most relevant for you two; columns 2/3/4/6 were really for checking my arithmetic):
test_comprehensive_v2_e3sm_diags_images: 979 mismatched images -- image diff grid
Notice the cumulative diff above, 980, is 1 more than the expected number of mismatched images, 979.
test_comprehensive_v3_e3sm_diags_images: 1069 mismatched images -- image diff grid
Notice the cumulative diff above, 1070, is 1 more than the expected number of mismatched images, 1069.
test_bundles_e3sm_diags_images: 492 mismatched images
Notice the cumulative diff above, 495, is 3 more than the expected number of mismatched images, 492.
mpas_analysis
@xylar Please review this section. If the diffs are expected, I can go ahead and update the expected results for mpas_analysis. I know you already reviewed the mpas_analysis results earlier.
Missing images
test_comprehensive_v2_mpas_analysis_images: 722 missing (compare to 708 in the earlier results)
test_comprehensive_v3_mpas_analysis_images: 740 missing (compare to 726 in the earlier results)
In both cases, it looks like 14 images went missing since your earlier run.
Mismatched images
test_comprehensive_v2_mpas_analysis_images: 41 mismatched images -- image diff grid
test_comprehensive_v3_mpas_analysis_images: 41 mismatched images -- image diff grid
ilamb
@chengzhuzhang Please review this section. If the diffs are expected, I can go ahead and update the expected results for ilamb.
Missing images
No missing images
Mismatched images
test_comprehensive_v2_ilamb_images: 20 mismatched images -- image_diff_grid
test_comprehensive_v3_ilamb_images: 24 mismatched images -- image_diff_grid
test_bundles_ilamb_images: 16 mismatched images -- image_diff_grid
-
Chrysalis test 2025-04-01: latest zppy, latest zppy-interfaces, e3sm_diags v3.0.0, Unified 1.11.0rc13 elsewhere
| test name | missing | mismatched | correct |
|---|---|---|---|
| test_comprehensive_v2_e3sm_diags_images | 0 | 46 | 2875 |
| test_comprehensive_v2_global_time_series_images | 0 | 5 | 1 |
| test_comprehensive_v3_e3sm_diags_images | 0 | 54 | 3696 |
| test_comprehensive_v3_global_time_series_images | 0 | 14 | 1 |
| test_bundles_global_time_series_images | 0 | 2 | 1 |
Most of the image mismatches seem ok to me. There are a couple of concerns, which might not be big deals:
- test_comprehensive_v3_e3sm_diags_images: The OMEGA diffs are decently noticeable. Perhaps expected?
- test_comprehensive_v3_global_time_series_images: lnd plots are plotted in the PDF in the order they appear in the lnd csv, rather than the order they are listed in the cfg.
Details of Chrysalis mismatched images
test_comprehensive_v2_e3sm_diags_images mismatched -- none of the diffs appear to be important. Must be a result of e3sm_diags/driver/utils/regrid.py updates in https://github.com/E3SM-Project/e3sm_diags/pull/958/files or https://github.com/E3SM-Project/e3sm_diags/pull/959/files. Note the stats do change by small amounts for some of the diffs.
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-T-850-DJF-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-SST-JJA-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-U-850-DJF-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-T-850-JJA-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-TREFHT-SON-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-TREFHT-ANN-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-SST-MAM-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-TREFMXAV-SON-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-U-850-JJA-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-U-850-ANN-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-TREFHT-JJA-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-U-850-MAM-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-U-850-SON-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-SST-ANN-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-TREF_range-DJF-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-T-850-SON-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-TREFMXAV-MAM-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-TREFMXAV-ANN-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-TREF_range-SON-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-TREF_range-MAM-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-TREFHT-DJF-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-OMEGA-850-JJA-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-TAUXY-JJA-ocean.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-OMEGA-850-MAM-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-TREFHT-MAM-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-T-850-MAM-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-TREF_range-ANN-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-SST-DJF-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-TREFMXAV-DJF-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-TREFMNAV-DJF-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-TREFMNAV-ANN-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-OMEGA-850-ANN-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-SST-SON-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-OMEGA-850-DJF-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-TAUXY-ANN-ocean.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-T-850-ANN-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-TREFMXAV-JJA-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-TREFMNAV-MAM-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-TREF_range-JJA-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-OMEGA-850-SON-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-TREFMNAV-SON-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-TREFMNAV-JJA-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1980-1981/lat_lon/model_vs_model/v2.LR.historical_0201-TAUXY-SON-ocean.png
e3sm_diags/atm_monthly_180x360_aave/model_vs_obs_1982-1983/lat_lon/CRU_IPCC/CRU-TREFHT-DJF-land_60S90N.png
e3sm_diags/atm_monthly_180x360_aave/model_vs_obs_1982-1983/lat_lon/CRU_IPCC/CRU-TREFHT-JJA-land_60S90N.png
e3sm_diags/atm_monthly_180x360_aave/model_vs_obs_1982-1983/lat_lon/CRU_IPCC/CRU-TREFHT-ANN-land_60S90N.png
test_comprehensive_v2_global_time_series_images mismatched -- looks like the plots just expanded/shrunk horizontally a barely noticeable amount.
global_time_series/global_time_series_1980-1990_results/v2.LR.historical_0201_n_original.png
global_time_series/global_time_series_1980-1990_results/v2.LR.historical_0201_s_lnd.png
global_time_series/global_time_series_1980-1990_results/v2.LR.historical_0201_n_lnd.png
global_time_series/global_time_series_1980-1990_results/v2.LR.historical_0201_glb_lnd.png
global_time_series/global_time_series_1980-1990_results/v2.LR.historical_0201_s_original.png
test_comprehensive_v3_e3sm_diags_images mismatched -- Most of the diffs appear to be unimportant. Must be a result of e3sm_diags/driver/utils/regrid.py updates in https://github.com/E3SM-Project/e3sm_diags/pull/958/files or https://github.com/E3SM-Project/e3sm_diags/pull/959/files. Note the stats do change by small amounts for some of the diffs. The OMEGA diffs are noticeable though.
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/viewer/taylor-diagram-data/JJA_metrics_taylor_diag_amip.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/viewer/taylor-diagram-data/JJA_metrics_taylor_diag_historical.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/viewer/taylor-diagram-data/SON_metrics_taylor_diag_historical.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/viewer/taylor-diagram-data/ANN_metrics_taylor_diag_amip.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/viewer/taylor-diagram-data/ANN_metrics_taylor_diag_historical.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/viewer/taylor-diagram-data/SON_metrics_taylor_diag_amip.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-TREF_range-JJA-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-U-850-SON-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-TREFMNAV-ANN-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-SST-DJF-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-TREFMXAV-MAM-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-U-850-JJA-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-TREFMXAV-DJF-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-TREF_range-MAM-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-TREFMNAV-SON-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-TREFMNAV-MAM-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-SST-SON-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-U-850-DJF-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-TREF_range-ANN-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-TREFMNAV-JJA-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-U-850-MAM-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-TAUXY-SON-ocean.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-OMEGA-850-SON-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-OMEGA-850-DJF-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-SST-JJA-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-OMEGA-850-ANN-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-TAUXY-JJA-ocean.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-TREFHT-DJF-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-TAUXY-ANN-ocean.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-TREFMNAV-DJF-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-T-850-ANN-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-T-850-SON-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-TREFHT-JJA-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-TAUXY-MAM-ocean.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-TREFHT-MAM-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-U-850-ANN-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-SST-MAM-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-OMEGA-850-MAM-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-TAUXY-DJF-ocean.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-TREFHT-SON-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-TREF_range-DJF-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-SST-ANN-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-TREFMXAV-SON-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-TREFMXAV-ANN-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-OMEGA-850-JJA-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-T-850-JJA-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-T-850-MAM-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-TREFMXAV-JJA-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-TREF_range-SON-land.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-T-850-DJF-global.png
e3sm_diags/atm_monthly_180x360_aave_mvm/model_vs_model_1987-1988/lat_lon/model_vs_model/v3.LR.historical_0051-TREFHT-ANN-land.png
e3sm_diags/atm_monthly_180x360_aave/model_vs_obs_1987-1988/lat_lon/CRU_IPCC/CRU-TREFHT-DJF-land_60S90N.png
e3sm_diags/atm_monthly_180x360_aave/model_vs_obs_1987-1988/lat_lon/CRU_IPCC/CRU-TREFHT-JJA-land_60S90N.png
e3sm_diags/atm_monthly_180x360_aave/model_vs_obs_1987-1988/lat_lon/CRU_IPCC/CRU-TREFHT-ANN-land_60S90N.png
test_comprehensive_v3_global_time_series_images mismatched -- the plots are printing in a different order, so it is hard to compare any diff. It appears the order now matches the order they appear in the lnd csv. Compare the PDF with the CSV. I suspect this is because of the ("lnd", list(map(lambda v: v.variable_name, requested_variables.vars_land))), change in https://github.com/E3SM-Project/zppy-interfaces/pull/16/files.
global_time_series/global_time_series_1985-1995_results/v3.LR.historical_0051_s_lnd_1.png
global_time_series/global_time_series_1985-1995_results/v3.LR.historical_0051_glb_lnd_3.png
global_time_series/global_time_series_1985-1995_results/v3.LR.historical_0051_s_lnd_3.png
global_time_series/global_time_series_1985-1995_results/v3.LR.historical_0051_glb_lnd_0.png
global_time_series/global_time_series_1985-1995_results/v3.LR.historical_0051_glb_lnd_2.png
global_time_series/global_time_series_1985-1995_results/v3.LR.historical_0051_n_lnd_1.png
global_time_series/global_time_series_1985-1995_results/v3.LR.historical_0051_s_original.png
global_time_series/global_time_series_1985-1995_results/v3.LR.historical_0051_s_lnd_0.png
global_time_series/global_time_series_1985-1995_results/v3.LR.historical_0051_n_lnd_0.png
global_time_series/global_time_series_1985-1995_results/v3.LR.historical_0051_s_lnd_2.png
global_time_series/global_time_series_1985-1995_results/v3.LR.historical_0051_glb_lnd_1.png
global_time_series/global_time_series_1985-1995_results/v3.LR.historical_0051_n_original.png
global_time_series/global_time_series_1985-1995_results/v3.LR.historical_0051_n_lnd_2.png
global_time_series/global_time_series_1985-1995_results/v3.LR.historical_0051_n_lnd_3.png
(Also, for reference, note that n_original is an example of displaying blank plots in the multi-plot PDF, because regionally averaged data isn't available.)
test_bundles_global_time_series_images mismatched -- looks like the plots just expanded/shrunk horizontally a barely noticeable amount.
global_time_series/global_time_series_1985-1995_results/v3.LR.historical_0051_s_original.png
global_time_series/global_time_series_1985-1995_results/v3.LR.historical_0051_n_original.png
-
Unified 1.11.1 official results
These machines have "official" test results documented for E3SM Unified 1.11.1:
@chengzhuzhang Did HadISST data on Compy get updated recently? As mentioned in today's meeting, I updated the Chrysalis expected results more recently than the Compy/Perlmutter results. When I ran the zppy tests using Unified 1.11.1, the tests pass on Chrysalis. On Perlmutter, the diffs match up exactly with what Chrysalis had before I updated the expected results for Chrysalis (meaning I'm comfortable updating Perlmutter's expected results as well). Compy, however, has had 122 diffs added -- all of which are related to model-vs-obs HadISST.
List of diffs
These v2 e3sm_diags diffs were found in Compy run4 but not in Compy run2:
These v3 e3sm_diags diffs were found in Compy run4 but not in Compy run2:
These bundles e3sm_diags diffs were found in Compy run4 but not in Compy run2:
Where to find the actual diffs (subdirectories have actual, diff, expected images):
Code changes (again, these code changes affected neither Chrysalis's results nor Perlmutter's results -- only Compy's results):
Comparison scriptfrom typing import Dict, List, Tuple
class Results(object):
def __init__(self, machine:str, run_name: str):
self.machine: str = machine
self.run_name: str = run_name
self.base_path: str = f"/home/ac.forsyth2/ez/unified_1.11.1_results_comparison/{self.run_name}/{self.machine}"
self.file_dict: Dict[str, Dict[str, List[str]]] = {}
def read_files(self, test: str, task: str, file_type: str) -> List[str]:
mismatched_files: List[str] = []
with open(f"{self.base_path}/{test}_{task}_{file_type}.txt", 'r') as file:
for line in file:
line = line.strip()
# For some reason, the list of mismatched files wasn't made for some runs.
# The files can still be listed via `find . -type f -name '*_diff.png'` in the proper directory.
if file_type == "found_diffs":
line = f"{task}/" + line[2:-9] # Remove leading "./" and trailing "_diff.png"
mismatched_files.append(line)
return sorted(mismatched_files)
def filter_list(list1: List[str], list2: List[str]) -> Tuple[List[str], List[str], List[str]]:
left_only: List[str] = []
right_only: List[str] = []
in_both: List[str] = []
for item in list1:
if item in list2:
in_both.append(item)
else:
left_only.append(item)
for item in list2:
if item not in list1:
right_only.append(item)
return left_only, right_only, in_both
def compare():
print("2025-03-13")
print("Run 1: E3SM Unified 1.10.0rc13")
print("zppy: E3SM Unified 1.10.0rc13 (zppy v3.0.0rc4)")
print("zppy-interfaces: E3SM Unified 1.10.0rc13 (zppy-interfaces v0.1.1)")
print("e3sm_diags: E3SM Unified 1.10.0rc13 (e3sm_diags v3.0.0rc4)")
print("Everything else: E3SM Unified 1.10.0rc13")
print("Results:")
print("Chrysalis: Test failures, but they were determined to be ok. Updated expected results. Comment: https://github.com/E3SM-Project/zppy/discussions/634#discussioncomment-12495202")
print("Compy: Test failures, but they were determined to be ok. Updated expected results. Comment: https://github.com/E3SM-Project/zppy/discussions/634#discussioncomment-12554970")
print("Perlmutter: Test failures, but they were determined to be ok. Updated expected results. Comment: https://github.com/E3SM-Project/zppy/discussions/634#discussioncomment-12554970")
print("####################################################################")
print("Then we added the following:")
print("zppy: https://github.com/E3SM-Project/zppy/pull/696, https://github.com/E3SM-Project/zppy/pull/697")
print("zppy-interfaces: https://github.com/E3SM-Project/zppy-interfaces/pull/16")
print("e3sm_diags: https://github.com/E3SM-Project/e3sm_diags/pull/953, https://github.com/E3SM-Project/e3sm_diags/pull/955, https://github.com/E3SM-Project/e3sm_diags/pull/958, https://github.com/E3SM-Project/e3sm_diags/pull/959, https://github.com/E3SM-Project/e3sm_diags/pull/925, https://github.com/E3SM-Project/e3sm_diags/pull/956 (freezing e3sm_diags at v3.0.0)")
print("2025-04-01")
print("Run 2: latest zppy, latest zppy-interfaces, e3sm_diags v3.0.0, Unified 1.11.0rc13 elsewhere")
print("zppy: zppy_dev_20250401")
print("zppy-interfaces: zi_dev_20250401")
print("e3sm_diags: e3sm_diags_dev_v3.0.0")
print("Everything else: E3SM Unified 1.10.0rc13")
print("Results:")
print("Chrysalis: 100 mismatched images introduced in e3sm_diags, 21 mismatched images introduced in global_time_series. Did NOT update expected results. Comment: https://github.com/E3SM-Project/zppy/discussions/634#discussioncomment-12694142")
print("Compy: 100 mismatched images introduced in e3sm_diags, 21 mismatched images introduced in global_time_series. Also: 1 missing v3 ILAMB file -- `ilamb/_1985-1988/HydrologyCycle/Permafrost/NSIDC/legend_bias.png` -- which is expected based on the removal of NSIDC in https://github.com/E3SM-Project/zppy/pull/696/files. Did NOT update expected results. Comment: https://github.com/E3SM-Project/zppy/discussions/634#discussioncomment-12705106")
print("Perlmutter: Was already on next round of testing by the time nodes were allocated. Didn't continue with this run. Did NOT update expected results. Comment: N/A")
chrysalis_run2 = Results("chrysalis", "run2")
chrysalis_run2.file_dict = {
"v2": {
"e3sm_diags": chrysalis_run2.read_files("v2", "e3sm_diags", "mismatched"),
"global_time_series": chrysalis_run2.read_files("v2", "global_time_series", "mismatched"),
},
"v3": {
"e3sm_diags": chrysalis_run2.read_files("v3", "e3sm_diags", "mismatched"),
"global_time_series": chrysalis_run2.read_files("v3", "global_time_series", "mismatched"),
},
"bundles": {
"global_time_series": chrysalis_run2.read_files("bundles", "global_time_series", "mismatched"),
}
}
assert len(chrysalis_run2.file_dict["v2"]["e3sm_diags"]) == 46
assert len(chrysalis_run2.file_dict["v2"]["global_time_series"]) == 5
assert len(chrysalis_run2.file_dict["v3"]["e3sm_diags"]) == 54
assert len(chrysalis_run2.file_dict["v3"]["global_time_series"]) == 14
assert len(chrysalis_run2.file_dict["bundles"]["global_time_series"]) == 2
compy_run2 = Results("compy", "run2")
compy_run2.file_dict = {
"v2": {
"e3sm_diags": compy_run2.read_files("v2", "e3sm_diags", "mismatched"),
"global_time_series": compy_run2.read_files("v2", "global_time_series", "mismatched"),
},
"v3": {
"e3sm_diags": compy_run2.read_files("v3", "e3sm_diags", "mismatched"),
"global_time_series": compy_run2.read_files("v3", "global_time_series", "mismatched"),
},
"bundles": {
"global_time_series": compy_run2.read_files("bundles", "global_time_series", "mismatched"),
}
}
assert len(compy_run2.file_dict["v2"]["e3sm_diags"]) == 46
assert len(compy_run2.file_dict["v2"]["global_time_series"]) == 5
assert len(compy_run2.file_dict["v3"]["e3sm_diags"]) == 54
assert len(compy_run2.file_dict["v3"]["global_time_series"]) == 14
assert len(compy_run2.file_dict["bundles"]["global_time_series"]) == 2
assert chrysalis_run2.file_dict["v2"]["e3sm_diags"] == compy_run2.file_dict["v2"]["e3sm_diags"]
assert chrysalis_run2.file_dict["v2"]["global_time_series"] == compy_run2.file_dict["v2"]["global_time_series"]
assert chrysalis_run2.file_dict["v3"]["e3sm_diags"] == compy_run2.file_dict["v3"]["e3sm_diags"]
assert chrysalis_run2.file_dict["v3"]["global_time_series"] == compy_run2.file_dict["v3"]["global_time_series"]
assert chrysalis_run2.file_dict["bundles"]["global_time_series"] == compy_run2.file_dict["bundles"]["global_time_series"]
print("Chrysalis run2 and Compy run2 had the exact same set of mismatched files.")
print("Compy has a descrepancy: Missing `ilamb/_1985-1988/HydrologyCycle/Permafrost/NSIDC/legend_bias.png`. However, this is expected based on the code changes.")
print("####################################################################")
print("Then we added the following:")
print("zppy: https://github.com/E3SM-Project/zppy/pull/699, https://github.com/E3SM-Project/zppy/pull/700, https://github.com/E3SM-Project/zppy/pull/702 + manaual fix to 702 in bash script")
print("zppy-interfaces: N/A")
print("e3sm_diags: N/A")
print("2025-04-03")
print("Run 3: NEW latest zppy, latest zppy-interfaces, e3sm_diags v3.0.0, Unified 1.11.0rc13 elsewhere")
print("zppy: zppy_dev_20250403")
print("zppy-interfaces: zi_dev_20250403")
print("e3sm_diags: e3sm_diags_dev_v3.0.0")
print("Everything else: E3SM Unified 1.10.0rc13")
print("Results:")
print("Chrysalis: No changes. Updated expected results. Comment: https://github.com/E3SM-Project/zppy/discussions/634#discussioncomment-12727813")
print("Compy: Didn't run")
print("Perlmutter: Didn't run")
chrysalis_run3 = Results("chrysalis", "run3")
chrysalis_run3.file_dict = {
"v2": {
"e3sm_diags": chrysalis_run3.read_files("v2", "e3sm_diags", "found_diffs"),
"global_time_series": chrysalis_run3.read_files("v2", "global_time_series", "found_diffs"),
},
"v3": {
"e3sm_diags": chrysalis_run3.read_files("v3", "e3sm_diags", "found_diffs"),
"global_time_series": chrysalis_run3.read_files("v3", "global_time_series", "found_diffs"),
},
"bundles": {
"global_time_series": chrysalis_run3.read_files("bundles", "global_time_series", "found_diffs"),
}
}
assert len(chrysalis_run3.file_dict["v2"]["e3sm_diags"]) == 46
assert len(chrysalis_run3.file_dict["v2"]["global_time_series"]) == 5
assert len(chrysalis_run3.file_dict["v3"]["e3sm_diags"]) == 54
assert len(chrysalis_run3.file_dict["v3"]["global_time_series"]) == 14
assert len(chrysalis_run3.file_dict["bundles"]["global_time_series"]) == 2
assert chrysalis_run3.file_dict["v2"]["e3sm_diags"] == chrysalis_run2.file_dict["v2"]["e3sm_diags"]
assert chrysalis_run3.file_dict["v2"]["global_time_series"] == chrysalis_run2.file_dict["v2"]["global_time_series"]
assert chrysalis_run3.file_dict["v3"]["e3sm_diags"] == chrysalis_run2.file_dict["v3"]["e3sm_diags"]
assert chrysalis_run3.file_dict["v3"]["global_time_series"] == chrysalis_run2.file_dict["v3"]["global_time_series"]
assert chrysalis_run3.file_dict["bundles"]["global_time_series"] == chrysalis_run2.file_dict["bundles"]["global_time_series"]
print("Chrysalis run2 and Chrysalis run3 had the exact same set of mismatched files.")
print("####################################################################")
print("Then we added the following:")
print("zppy: https://github.com/E3SM-Project/zppy/pull/694, https://github.com/E3SM-Project/zppy/pull/705, https://github.com/E3SM-Project/zppy/pull/701, https://github.com/E3SM-Project/zppy/pull/706, https://github.com/E3SM-Project/zppy/pull/707 (freezing zppy at v3.0.0)")
print("zppy-interfaces: https://github.com/E3SM-Project/zppy-interfaces/pull/24 (freezing zppy-interfaces at v0.1.2)")
print("e3sm_diags: N/A")
print("2025-04-15")
print("Run 4: E3SM Unified 1.11.1")
print("zppy: E3SM Unified 1.11.1 (zppy v3.0.0)")
print("zppy-interfaces: E3SM Unified 1.11.1 (zppy-interfaces v0.1.2)")
print("e3sm_diags: E3SM Unified 1.11.1 (e3sm_diags v3.0.0)")
print("Everything else: E3SM Unified 1.11.1")
print("Results:")
print("Chrysalis: No test failures. Made copy of expected results, but didn't update them (no update needed, because there were no diffs)")
print("Compy: 122 mismatched images introduced in e3sm_diags (100 => 222). All changes are due to HadISST. Did NOT YET update expected results.")
print("Perlmutter: 100 mismatched images introduced in e3sm_diags, 21 mismatched images introduced in global_time_series. Did NOT YET update expected results.")
compy_run4 = Results("compy", "run4")
compy_run4.file_dict = {
"v2": {
"e3sm_diags": compy_run4.read_files("v2", "e3sm_diags", "mismatched"),
"global_time_series": compy_run4.read_files("v2", "global_time_series", "mismatched"),
},
"v3": {
"e3sm_diags": compy_run4.read_files("v3", "e3sm_diags", "mismatched"),
"global_time_series": compy_run4.read_files("v3", "global_time_series", "mismatched"),
},
"bundles": {
"e3sm_diags": compy_run4.read_files("bundles", "e3sm_diags", "mismatched"),
"global_time_series": compy_run4.read_files("bundles", "global_time_series", "mismatched"),
}
}
assert len(compy_run4.file_dict["v2"]["e3sm_diags"]) == 77
assert len(compy_run4.file_dict["v2"]["global_time_series"]) == 5
assert len(compy_run4.file_dict["v3"]["e3sm_diags"]) == 85
assert len(compy_run4.file_dict["v3"]["global_time_series"]) == 14
assert len(compy_run4.file_dict["bundles"]["e3sm_diags"]) == 60
assert len(compy_run4.file_dict["bundles"]["global_time_series"]) == 2
print("Compy run4 has a discrepancy: 122 more diffs in e3sm_diags than found in compy run2.")
print("Both Compy run4 and Compy run2 have the missing ILAMB file: `ilamb/_1985-1988/HydrologyCycle/Permafrost/NSIDC/legend_bias.png`. Again, this is expected based on the code changes.")
assert compy_run4.file_dict["v2"]["global_time_series"] == compy_run2.file_dict["v2"]["global_time_series"]
assert compy_run4.file_dict["v3"]["global_time_series"] == compy_run2.file_dict["v3"]["global_time_series"]
assert compy_run4.file_dict["bundles"]["global_time_series"] == compy_run2.file_dict["bundles"]["global_time_series"]
left_only, right_only, in_both = filter_list(compy_run4.file_dict["v2"]["e3sm_diags"], compy_run2.file_dict["v2"]["e3sm_diags"])
assert len(right_only) == 0
assert len(in_both) == 46
assert len(left_only) == 31
print("These v2 e3sm_diags diffs were found in Compy run4 but not in Compy run2:")
for file in left_only:
print(file)
left_only, right_only, in_both = filter_list(compy_run4.file_dict["v3"]["e3sm_diags"], compy_run2.file_dict["v3"]["e3sm_diags"])
assert len(right_only) == 0
assert len(in_both) == 54
assert len(left_only) == 31
print("These v3 e3sm_diags diffs were found in Compy run4 but not in Compy run2:")
for file in left_only:
print(file)
assert len(compy_run4.file_dict["bundles"]["e3sm_diags"]) == 60
print("These bundles e3sm_diags diffs were found in Compy run4 but not in Compy run2:")
for file in compy_run4.file_dict["bundles"]["e3sm_diags"]:
print(file)
perlmutter_run4 = Results("perlmutter", "run4")
perlmutter_run4.file_dict = {
"v2": {
"e3sm_diags": perlmutter_run4.read_files("v2", "e3sm_diags", "mismatched"),
"global_time_series": perlmutter_run4.read_files("v2", "global_time_series", "mismatched"),
},
"v3": {
"e3sm_diags": perlmutter_run4.read_files("v3", "e3sm_diags", "mismatched"),
"global_time_series": perlmutter_run4.read_files("v3", "global_time_series", "mismatched"),
},
"bundles": {
"global_time_series": perlmutter_run4.read_files("bundles", "global_time_series", "mismatched"),
}
}
assert len(perlmutter_run4.file_dict["v2"]["e3sm_diags"]) == 46
assert len(perlmutter_run4.file_dict["v2"]["global_time_series"]) == 5
assert len(perlmutter_run4.file_dict["v3"]["e3sm_diags"]) == 54
assert len(perlmutter_run4.file_dict["v3"]["global_time_series"]) == 14
assert len(perlmutter_run4.file_dict["bundles"]["global_time_series"]) == 2
assert perlmutter_run4.file_dict["v2"]["e3sm_diags"] == chrysalis_run2.file_dict["v2"]["e3sm_diags"]
assert perlmutter_run4.file_dict["v2"]["global_time_series"] == chrysalis_run2.file_dict["v2"]["global_time_series"]
assert perlmutter_run4.file_dict["v3"]["e3sm_diags"] == chrysalis_run2.file_dict["v3"]["e3sm_diags"]
assert perlmutter_run4.file_dict["v3"]["global_time_series"] == chrysalis_run2.file_dict["v3"]["global_time_series"]
assert perlmutter_run4.file_dict["bundles"]["global_time_series"] == chrysalis_run2.file_dict["bundles"]["global_time_series"]
print("Chrysalis run2 and Perlmutter run4 had the exact same set of mismatched files.")
if __name__ == "__main__":
    compare()
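As for the opening question (whether the HadISST data on Compy changed), one way to check would be to fingerprint the observational files on each machine and compare the hashes. Below is a minimal sketch of that idea; the obs directory path and the `fingerprint_obs` helper are placeholders, not part of zppy or e3sm_diags.

```python
import hashlib
from pathlib import Path
from typing import Dict


def fingerprint_obs(obs_root: str) -> Dict[str, str]:
    """Map each .nc file under obs_root (as a relative path) to its MD5 checksum."""
    root = Path(obs_root)
    checksums: Dict[str, str] = {}
    for path in sorted(root.rglob("*.nc")):
        md5 = hashlib.md5()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                md5.update(chunk)
        checksums[str(path.relative_to(root))] = md5.hexdigest()
    return checksums


if __name__ == "__main__":
    # Placeholder path -- point this at the actual HadISST obs directory on each machine,
    # run it once per machine, and diff the printed output.
    for rel_path, digest in sorted(fingerprint_obs("/path/to/obs/HadISST").items()):
        print(f"{digest}  {rel_path}")
```

Running this on Compy and on Chrysalis and diffing the output would show directly whether the HadISST files themselves differ.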
-
2025-06-13 thread
Using the commits of #720 to test the latest image checker functionality (diff bounding boxes on actual & expected, added legacy tests). In particular, we want to test the latest
# 1. Set up environments
lcrc_conda # Function to set up conda locally
# 1a. e3sm_diags
cd ~/ez/e3sm_diags
git fetch upstream main
git status
# Check for uncommitted changes
git checkout main
git reset --hard upstream/main
git log
# Check that latest commit matches https://github.com/E3SM-Project/e3sm_diags/commits/main
conda clean --all --y
conda env create -f conda-env/dev.yml -n e3sm-diags-main-20250613
conda activate e3sm-diags-main-20250613
pip install .
# 1b. zppy-interfaces
# Latest commit on https://github.com/E3SM-Project/zppy-interfaces/commits/main
# is "Bump to 0.1.2", so we can just use Unified.
# 1c. zppy
# We're going to use the commits from #720 to try out new features of the image checker
# (diff bounding boxes on actual & expected, added legacy tests)
cd ~/ez/zppy
git status
# Check for uncommitted changes
git fetch upstream image-checker # branch for #720
git checkout -b test-weekly-20250613 upstream/image-checker
git log
# The 3 commits of the image-checker PR #720
# The provenance cfg PR #713
# (We saw in testing of the global_time_series split PR #722 that this will require updates to the expected results)
conda clean --all --y
conda env create -f conda/dev.yml -n zppy-test-weekly-20250613
conda activate zppy-test-weekly-20250613
pip install .
# 2. Run unit tests
pytest tests/test_*.py
# 25 passed, 1 warning in 0.30s
# /gpfs/fs1/home/ac.forsyth2/ez/zppy/zppy/utils.py:191: DeprecationWarning: invalid escape sequence \.
# 3. Set up zppy runs
# Edit tests/integration/utils.py:
# UNIQUE_ID = "test_weekly_20250613"
# "diags_environment_commands": "source /gpfs/fs1/home/ac.forsyth2/miniforge3/etc/profile.d/conda.sh; conda activate e3sm-diags-main-20250613",
# "global_time_series_environment_commands": "source /lcrc/soft/climate/e3sm-unified/load_latest_e3sm_unified_chrysalis.sh",
python tests/integration/utils.py
# 4. Launch zppy jobs to produce actual images for image checker test
# 4a. Legacy tests
zppy -c tests/integration/generated/test_weekly_legacy_3.0.0_comprehensive_v3_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_legacy_3.0.0_comprehensive_v2_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_legacy_3.0.0_bundles_chrysalis.cfg # Runs 1st part of bundles cfg
# 4b. Regular tests
zppy -c tests/integration/generated/test_weekly_comprehensive_v3_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_comprehensive_v2_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_bundles_chrysalis.cfg # Runs 1st part of bundles cfg
# TODO ########################################################################
zppy -c tests/integration/generated/test_weekly_legacy_3.0.0_bundles_chrysalis.cfg # Runs 2nd part of bundles cfg
zppy -c tests/integration/generated/test_weekly_bundles_chrysalis.cfg # Runs 2nd part of bundles cfg
ls tests/integration/test_*.py # to see what tests are available to run
# 5. Run zppy integration tests
pytest tests/integration/test_bash_generation.py
pytest tests/integration/test_bundles.py
pytest tests/integration/test_campaign.py
pytest tests/integration/test_defaults.py
pytest tests/integration/test_last_year.py
pytest tests/integration/test_bundles.py
pytest tests/integration/test_images.py
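The TODO above (running the second part of the bundles cfgs) is gated on every `*.status` file under the output `post/scripts` directories reading OK. The repeated `grep -v "OK" *status` checks could be collapsed into a small scanner like this sketch; the directory paths are placeholders for whichever unique ID the run used.

```python
from pathlib import Path
from typing import Dict, List


def find_bad_statuses(scripts_dir: str) -> Dict[str, List[str]]:
    """Return {status file name: lines not containing 'OK'} for one post/scripts directory."""
    bad: Dict[str, List[str]] = {}
    for status_file in sorted(Path(scripts_dir).glob("*status")):
        lines = [
            line.strip()
            for line in status_file.read_text().splitlines()
            if line.strip() and "OK" not in line
        ]
        if lines:
            bad[status_file.name] = lines
    return bad


if __name__ == "__main__":
    # Placeholder paths -- substitute the post/scripts directories for the current run.
    for scripts_dir in [
        "/lcrc/group/e3sm/ac.forsyth2/zppy_weekly_bundles_output/<unique_id>/v3.LR.historical_0051/post/scripts",
        "/lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v3_output/<unique_id>/v3.LR.historical_0051/post/scripts",
    ]:
        for name, lines in find_bad_statuses(scripts_dir).items():
            print(f"{scripts_dir}/{name}: {lines}")
```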
-
2025-07-10 thread
Setup
# 1. Set up environments
lcrc_conda # Function to set up conda locally
# 1a. e3sm_diags
cd ~/ez/e3sm_diags
git fetch upstream main
git status
# Check for uncommitted changes
git checkout main
git reset --hard upstream/main
git log
# Check that latest commit matches https://github.com/E3SM-Project/e3sm_diags/commits/main
# From 7/9: Fix xcdat bounds dim issue with v0.9.1 (#992)
# Matches, good
conda clean --all --y
conda env create -f conda-env/dev.yml -n e3sm-diags-main-20250710
conda activate e3sm-diags-main-20250710
pip install .
# 1b. zppy-interfaces
cd ~/ez/zppy-interfaces
git fetch upstream main
git status
# Check for uncommitted changes
git checkout main
git reset --hard upstream/main
git log
# Check that latest commit matches https://github.com/E3SM-Project/zppy-interfaces/commits/main
# From 7/1: Update to latest conda-incubator/setup-miniconda
# Matches, good
conda clean --all --y
conda env create -f conda/dev.yml -n zi-main-20250710
conda activate zi-main-20250710
pip install .
pytest tests/unit/global_time_series/test_global_time_series.py
# ImportError: cannot import name 'construct_generic_variables' from 'zppy_interfaces.global_time_series.coupled_global' (/gpfs/fs1/home/ac.forsyth2/miniforge3/envs/zi-main-20250710/lib/python3.12/site-packages/zppy_interfaces/global_time_series/coupled_global/__init__.py)
# 1c. zppy
cd ~/ez/zppy
git status
# Check for uncommitted changes
git fetch upstream main
git checkout -b test-weekly-20250710 upstream/main
git log
# Check that latest commit matches https://github.com/E3SM-Project/zppy/commits/main
# From 6/18: Merge pull request #720 from E3SM-Project/image-checker
# Matches, good
conda clean --all --y
conda env create -f conda/dev.yml -n zppy-test-weekly-20250710
conda activate zppy-test-weekly-20250710
pip install .
# 2. Run unit tests
pytest tests/test_*.py
# 25 passed in 0.49s
# Because of the zppy-interfaces building issues, we'll just use Unified.
# There has only been one commit (only affecting GitHub workflows) on zppy-interfaces.
# Build issues: https://github.com/E3SM-Project/zppy-interfaces/issues/30#issuecomment-3058378312
# 3. Set up zppy runs
# Edit tests/integration/utils.py:
# UNIQUE_ID = "test_weekly_20250710"
# "diags_environment_commands": "source /gpfs/fs1/home/ac.forsyth2/miniforge3/etc/profile.d/conda.sh; conda activate e3sm-diags-main-20250710",
# "global_time_series_environment_commands": "source /lcrc/soft/climate/e3sm-unified/load_latest_e3sm_unified_chrysalis.sh",
python tests/integration/utils.py
# 4. Launch zppy jobs to produce actual images for image checker test
# 4a. Legacy tests
zppy -c tests/integration/generated/test_weekly_legacy_3.0.0_comprehensive_v3_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_legacy_3.0.0_comprehensive_v2_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_legacy_3.0.0_bundles_chrysalis.cfg # Runs 1st part of bundles cfg
# 4b. Regular tests
zppy -c tests/integration/generated/test_weekly_comprehensive_v3_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_comprehensive_v2_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_bundles_chrysalis.cfg # Runs 1st part of bundles cfg
# 4c. Run the second part of the bundles cfgs
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_bundles_output/test_weekly_20250710/v3.LR.historical_0051/post/scripts/ && grep -v "OK" *status
# No errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_bundles_output/test_weekly_20250710/v3.LR.historical_0051/post/scripts/ && grep -v "OK" *status
# No errors
cd ~/ez/zppy
zppy -c tests/integration/generated/test_weekly_legacy_3.0.0_bundles_chrysalis.cfg # Runs 2nd part of bundles cfg
zppy -c tests/integration/generated/test_weekly_bundles_chrysalis.cfg # Runs 2nd part of bundles cfg
# 4d. Check output
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v3_output/test_weekly_20250710/v3.LR.historical_0051/post/scripts/ && grep -v "OK" *status
# No errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v2_output/test_weekly_20250710/v2.LR.historical_0201/post/scripts/ && grep -v "OK" *status
# No errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_bundles_output/test_weekly_20250710/v3.LR.historical_0051/post/scripts/ && grep -v "OK" *status
# No errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v3_output/test_weekly_20250710/v3.LR.historical_0051/post/scripts/ && grep -v "OK" *status
# No errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v2_output/test_weekly_20250710/v2.LR.historical_0201/post/scripts/ && grep -v "OK" *status
# No errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_bundles_output/test_weekly_20250710/v3.LR.historical_0051/post/scripts/ && grep -v "OK" *status
# No errors
# 5. Run zppy integration tests
cd ~/ez/zppy
ls tests/integration/test_*.py
# 5a. The simpler integration tests
# Just need to add provenance cfg to expected files.
# Actually, the provenance file is test-specific:
# provenance.20250617_005905_907190.cfg
# So, that will need to be accounted for in the expected file update...
pytest tests/integration/test_bash_generation.py
# 1 failed in 1.95s
pytest tests/integration/test_campaign.py
# 6 failed in 3.04s
pytest tests/integration/test_defaults.py
# 1 failed in 1.18s
pytest tests/integration/test_last_year.py
# 1 failed in 0.34s
# 5b. The integration tests that rely on the results from step 4
pytest tests/integration/test_bundles.py
# 2 passed in 0.17s
pytest tests/integration/test_images.py
# 1 failed in 3152.74s (0:52:32)
cat test_images_summary.md
Summary of test results
Diff subdir is where to find the lists of missing/mismatched images, the image diff grid, and the individual diffs.
Analysis
I didn't update the expected results after the last run, probably because we determined
ls -l /lcrc/group/e3sm/public_html/zppy_test_resources
# The legacy expected files were updated on June 5, because that was me adding them in the first place
# Everything else was last updated on April 4, around the time of the Unified release
ls -lt /lcrc/group/e3sm/public_html/zppy_test_resources_previous/
# Last update was "expected_results_for_unified_1.11.1" on 4/15
# Then, "expected_results_until_20250403" on 4/4So, diff comparison from the last run:
Diff of the diffs
cd /lcrc/group/e3sm/public_html/diagnostic_output/ac.forsyth2/
diff zppy_weekly_comprehensive_v2_www/test_weekly_20250613/v2.LR.historical_0201/image_check_failures_comprehensive_v2/e3sm_diags/mismatched_images.txt zppy_weekly_comprehensive_v2_www/test_weekly_20250710/v2.LR.historical_0201/image_check_failures_comprehensive_v2/e3sm_diags/mismatched_images.txt
diff zppy_weekly_comprehensive_v3_www/test_weekly_20250613/v3.LR.historical_0051/image_check_failures_comprehensive_v3/e3sm_diags/mismatched_images.txt zppy_weekly_comprehensive_v3_www/test_weekly_20250710/v3.LR.historical_0051/image_check_failures_comprehensive_v3/e3sm_diags/mismatched_images.txt
diff zppy_weekly_bundles_www/test_weekly_20250613/v3.LR.historical_0051/image_check_failures_bundles/e3sm_diags/mismatched_images.txt zppy_weekly_bundles_www/test_weekly_20250710/v3.LR.historical_0051/image_check_failures_bundles/e3sm_diags/mismatched_images.txt
@chengzhuzhang It looks like the missing images issue has been resolved. The number of mismatches has increased though, all attributable to
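The same run-to-run comparison can be done by treating each `mismatched_images.txt` as a set, which also gives the counts directly. This is just a sketch; the two file paths are placeholders for the 20250613 and 20250710 lists used above.

```python
from pathlib import Path
from typing import Set


def read_mismatched(path: str) -> Set[str]:
    """Read one mismatched_images.txt into a set of image paths."""
    return {line.strip() for line in Path(path).read_text().splitlines() if line.strip()}


def compare_runs(old_list: str, new_list: str) -> None:
    old = read_mismatched(old_list)
    new = read_mismatched(new_list)
    print(f"In both runs: {len(old & new)}")
    print(f"Only in old run ({len(old - new)}):")
    for image in sorted(old - new):
        print(f"  {image}")
    print(f"Only in new run ({len(new - old)}):")
    for image in sorted(new - old):
        print(f"  {image}")


if __name__ == "__main__":
    # Placeholder paths for two runs' e3sm_diags mismatched_images.txt files.
    compare_runs(
        "test_weekly_20250613_mismatched_images.txt",
        "test_weekly_20250710_mismatched_images.txt",
    )
```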
-
@forsyth2 thanks Ryan! I think the results are as expected. ENSO figures are no longer missing. And the differences are just due to the label position changes needed for the saved PDFs to be laid out properly.
-
2025-08-22 thread
Setup
# 1. Set up environments
lcrc_conda # Function to set up conda locally
# 1a. e3sm_diags
cd ~/ez/e3sm_diags
git fetch upstream main
git status
# Check for uncommitted changes
git checkout main
git reset --hard upstream/main
git log
# Check that latest commit matches https://github.com/E3SM-Project/e3sm_diags/commits/main
# From 7/15: Bump to 3.0.1 (#993)
# Matches, good
conda clean --all --y
conda env create -f conda-env/dev.yml -n e3sm-diags-main-20250822
conda activate e3sm-diags-main-20250822
python -m pip install .
# 1b. zppy-interfaces
cd ~/ez/zppy-interfaces
git fetch upstream main
git status
# Check for uncommitted changes
git checkout main
git reset --hard upstream/main
git log
# Check that latest commit matches https://github.com/E3SM-Project/zppy-interfaces/commits/main
# From 8/11: Add missing packages to dependencies (#32)
# Matches, good
conda clean --all --y
conda env create -f conda/dev.yml -n zi-main-20250822
conda activate zi-main-20250822
python -m pip install .
pytest tests/unit/global_time_series/test_global_time_series.py
# 10 passed in 26.53s
# 1c. zppy
cd ~/ez/zppy
git status
# Check for uncommitted changes
git fetch upstream main
git checkout -b test-weekly-20250822 upstream/main
git log
# Check that latest commit matches https://github.com/E3SM-Project/zppy/commits/main
# From 8/18: fix e3sm_diags year looping for land model vs model (#731)
# Matches, good
conda clean --all --y
conda env create -f conda/dev.yml -n zppy-test-weekly-20250822
conda activate zppy-test-weekly-20250822
python -m pip install .
# 2. Run unit tests
pytest tests/test_*.py
# 25 passed, 1 warning in 0.42s
# /gpfs/fs1/home/ac.forsyth2/ez/zppy/zppy/utils.py:191: DeprecationWarning: invalid escape sequence \.
# 3. Set up zppy runs
# Edit tests/integration/utils.py:
# UNIQUE_ID = "test_weekly_20250822"
# "diags_environment_commands": "source /gpfs/fs1/home/ac.forsyth2/miniforge3/etc/profile.d/conda.sh; conda activate e3sm-diags-main-20250822",
# "global_time_series_environment_commands": "source /gpfs/fs1/home/ac.forsyth2/miniforge3/etc/profile.d/conda.sh; conda activate zi-main-20250822",
python tests/integration/utils.py
# 4. Launch zppy jobs to produce actual images for image checker test
# 4a. Legacy tests
zppy -c tests/integration/generated/test_weekly_legacy_3.0.0_comprehensive_v3_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_legacy_3.0.0_comprehensive_v2_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_legacy_3.0.0_bundles_chrysalis.cfg # Runs 1st part of bundles cfg
# 4b. Regular tests
zppy -c tests/integration/generated/test_weekly_comprehensive_v3_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_comprehensive_v2_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_bundles_chrysalis.cfg # Runs 1st part of bundles cfg
# 4c. Run the second part of the bundles cfgs
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_bundles_output/test_weekly_20250822/v3.LR.historical_0051/post/scripts/ && grep -v "OK" *status
# No errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_bundles_output/test_weekly_20250822/v3.LR.historical_0051/post/scripts/ && grep -v "OK" *status
# No errors
cd ~/ez/zppy
zppy -c tests/integration/generated/test_weekly_legacy_3.0.0_bundles_chrysalis.cfg # Runs 2nd part of bundles cfg
zppy -c tests/integration/generated/test_weekly_bundles_chrysalis.cfg # Runs 2nd part of bundles cfg
# 4d. Check output
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v3_output/test_weekly_20250822/v3.LR.historical_0051/post/scripts/ && grep -v "OK" *status
# No errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v2_output/test_weekly_20250822/v2.LR.historical_0201/post/scripts/ && grep -v "OK" *status
# No errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_bundles_output/test_weekly_20250822/v3.LR.historical_0051/post/scripts/ && grep -v "OK" *status
# No errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v3_output/test_weekly_20250822/v3.LR.historical_0051/post/scripts/ && grep -v "OK" *status
# No errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v2_output/test_weekly_20250822/v2.LR.historical_0201/post/scripts/ && grep -v "OK" *status
# No errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_bundles_output/test_weekly_20250822/v3.LR.historical_0051/post/scripts/ && grep -v "OK" *status
# No errors
# 5. Run zppy integration tests
cd ~/ez/zppy
ls tests/integration/test_*.py
# 5a. The simpler integration tests
# Just need to add provenance cfg to expected files.
# Actually, the provenance file is test-specific:
# provenance.20250617_005905_907190.cfg
# So, that will need to be accounted for in the expected file update...
pytest tests/integration/test_bash_generation.py
# 1 failed
# Other failures appear expected based on changes in #723
pytest tests/integration/test_campaign.py
# 6 failed
pytest tests/integration/test_defaults.py
# 1 failed
pytest tests/integration/test_last_year.py
# 1 failed
# 5b. The integration tests that rely on the results from step 4
pytest tests/integration/test_bundles.py
# 2 passed
pytest tests/integration/test_images.py
# 1 failed in 2697.61s (0:44:57)
cat test_images_summary.md
Table of test results
Diff subdir is where to find the lists of missing/mismatched images, the image diff grid, and the individual diffs.
To get the web path, replace with For the non-legacy The diffs in
and the zppy PRs merged after 7/10:
@chengzhuzhang From a cursory look, the diurnal plots no longer have missing coloring at top/bottom, the Taylor Diagrams have lost data, the TC analysis plots have turned to a green background, the polar plots have some noticeable diffs in the diff plots. And as usual, some diffs are the annoyance of matplotlib plotting things just different enough for the test to pick it up. Given the above, I think we'll want to dive deeper into the diffs before updating the expected test results.
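To see at a glance which diagnostic sets account for the mismatches (diurnal cycle, Taylor diagrams, TC analysis, polar, etc.), the mismatched-image list can be tallied by its leading path components. This is just a sketch, assuming the paths in `mismatched_images.txt` begin with the e3sm_diags set name:

```python
from collections import Counter
from pathlib import Path


def tally_by_set(mismatched_list: str, depth: int = 1) -> Counter:
    """Count mismatched images by their first `depth` path components."""
    counts: Counter = Counter()
    for line in Path(mismatched_list).read_text().splitlines():
        line = line.strip()
        if line:
            counts["/".join(line.split("/")[:depth])] += 1
    return counts


if __name__ == "__main__":
    # Placeholder path to one run's e3sm_diags mismatched_images.txt.
    for set_name, count in tally_by_set("mismatched_images.txt").most_common():
        print(f"{count:5d}  {set_name}")
```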
-
2025-09-19 Thread
Testing process
lcrc_conda
cd ~/ez/e3sm_diags
git status
# Check for uncommitted changes
git fetch upstream main
git checkout main
git reset --hard upstream/main
git log
# Good, matches https://github.com/E3SM-Project/e3sm_diags/commits/main
rm -rf build
conda clean --all --y
conda env create -f conda-env/dev.yml -n e3sm-diags-main-20250919
conda activate e3sm-diags-main-20250919
python -m pip install .
cd ~/ez/zppy-interfaces
git status
# Check for uncommitted changes
git fetch upstream main
git checkout main
git reset --hard upstream/main
git log
# Good, matches https://github.com/E3SM-Project/zppy-interfaces/commits/main
rm -rf build
time conda clean --all --y
# Will remove 19 (1.5 MB) tarball(s).
# Will remove 1 index cache(s).
# Will remove 17 (4.7 MB) package(s).
#
# Hanging -- this command isn't strictly necessary, so let's just skip it.
# CTRL C
# real 28m13.056s
time conda env create -f conda/dev.yml -n zi-main-20250919 # real 31m10.990s
conda activate zi-main-20250919
time python -m pip install . # real 0m32.155s
pytest tests/unit/global_time_series/test_*.py
# 10 passed in 55.64s
cd ~/ez/zppy
git status
# Check for uncommitted changes
git fetch upstream main
git checkout main
git reset --hard upstream/main
git log
# Good, matches https://github.com/E3SM-Project/zppy/commits/main
rm -rf build
time conda clean --all --y
# CTRL C
# real 13m11.526s
time conda env create -f conda/dev.yml -n zppy-main-20250919 # real 18m26.351s
conda activate zppy-main-20250919
pytest tests/test_*.py
# 25 passed, 1 warning in 3.51s
# (warning is ok, and has appeared on `main` anyway)
# Fix that warning
time python -m pip install .
pytest tests/test_*.py
# 25 passed in 0.51s
# Add the tasks_to_run_copy fix to test_images
time python -m pip install .
pre-commit run --all-files
pytest tests/test_*.py
# 25 passed in 0.43s
# Edit tests/integration/utils.py:
# TEST_SPECIFICS: Dict[str, Any] = {
# "diags_environment_commands": "source /gpfs/fs1/home/ac.forsyth2/miniforge3/etc/profile.d/conda.sh; conda activate e3sm-diags-main-20250919",
# "global_time_series_environment_commands": "source /gpfs/fs1/home/ac.forsyth2/miniforge3/etc/profile.d/conda.sh; conda activate zi-main-20250919",
# "cfgs_to_run": [
# "weekly_bundles",
# "weekly_comprehensive_v2",
# "weekly_comprehensive_v3",
# "weekly_legacy_3.0.0_bundles",
# "weekly_legacy_3.0.0_comprehensive_v2",
# "weekly_legacy_3.0.0_comprehensive_v3",
# ],
# "tasks_to_run": ["e3sm_diags", "mpas_analysis", "global_time_series", "ilamb"],
# "unique_id": "weekly_test_20250919",
# }
python tests/integration/utils.py
pre-commit run --all-files
time python -m pip install .
zppy -c tests/integration/generated/test_weekly_bundles_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_comprehensive_v2_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_comprehensive_v3_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_legacy_3.0.0_bundles_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_legacy_3.0.0_comprehensive_v2_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_legacy_3.0.0_comprehensive_v3_chrysalis.cfg
# Check on bundles status
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_bundles_output/weekly_test_20250919/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# Good, no errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_bundles_output/weekly_test_20250919/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# Good, no errors
# Now, run bundles part 2
cd ~/ez/zppy
zppy -c tests/integration/generated/test_weekly_bundles_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_legacy_3.0.0_bundles_chrysalis.cfg
# Review finished runs
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v2_output/weekly_test_20250919/v2.LR.historical_0201/post/scripts
grep -v "OK" *status
# global_time_series_1980-1990.status:WAITING 905818
# mpas_analysis_ts_1980-1984_climo_1980-1984.status:ERROR (1)
# mpas_analysis_ts_1980-1990_climo_1985-1990.status:WAITING 905817
emacs mpas_analysis_ts_1980-1984_climo_1980-1984.o905816
# There were errors in task climatologyMapMassFluxes: plot_seaIceFreshWaterFlux_ANN_arctic
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v3_output/weekly_test_20250919/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# Good, no errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v2_output/weekly_test_20250919/v2.LR.historical_0201/post/scripts
grep -v "OK" *status
# global_time_series_1980-1990.status:WAITING 905881
# mpas_analysis_ts_1980-1984_climo_1980-1984.status:ERROR (1)
# mpas_analysis_ts_1980-1990_climo_1985-1990.status:WAITING 905880
emacs mpas_analysis_ts_1980-1984_climo_1980-1984.o905879
# ERROR in task climatologyMapSeaIceConcNH: plotJFM_arctic_extended_Bootstrap. See log file /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comp\
# rehensive_v2_output/weekly_test_20250919/v2.LR.historical_0201/post/scripts/../analysis/mpas_analysis/ts_1980-1984_climo_1980-1984/logs/climatologyMa\
# pSeaIceConcNH_plotJFM_arctic_extended_Bootstrap.log for details
#
# There were errors in 5 tasks: climatologyMapMassFluxes: plot_evaporationFlux_ANN_antarctic, climatologyMapArgoTemperature: plotANN_latlon_depth_-150,\
# climatologyMapArgoTemperature: plotJAS_latlon_depth_-150, climatologyMapArgoSalinity: plotANN_latlon_depth_-400, climatologyMapSeaIceConcNH: plotJFM\
# _arctic_extended_Bootstrap
# See log files in /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v2_output/weekly_test_20250919/v2.LR.historical_0201/post/script\
# s/../analysis/mpas_analysis/ts_1980-1984_climo_1980-1984/logs for details.
# The following commands may be helpful:
# cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v2_output/weekly_test_20250919/v2.LR.historical_0201/post/scripts/../analysi\
# s/mpas_analysis/ts_1980-1984_climo_1980-1984/logs
# grep Error *.log
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v3_output/weekly_test_20250919/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# global_time_series_1985-1995.status:WAITING 905911
# mpas_analysis_ts_1985-1989_climo_1985-1989.status:ERROR (1)
# mpas_analysis_ts_1985-1995_climo_1990-1995.status:WAITING 905910
emacs mpas_analysis_ts_1985-1989_climo_1985-1989.o905909
# ERROR in task climatologyMapSeaIceConcSH: plotJJA_antarctic_extended_Bootstrap. See log file /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_c\
# omprehensive_v3_output/weekly_test_20250919/v3.LR.historical_0051/post/scripts/../analysis/mpas_analysis/ts_1985-1989_climo_1985-1989/logs/climatolog\
# yMapSeaIceConcSH_plotJJA_antarctic_extended_Bootstrap.log for details
#
# There were errors in 14 tasks: climatologyMapMassFluxes: plot_evaporationFlux_JFM_latlon, climatologyMapVel: plotVelocityMagnitude_ANN_latlon_depth_-\
# 800, climatologyMapVel: plotVelocityMagnitude_ANN_latlon_depth_-2000, climatologyMapWoa: plotTemperature_JFM_latlon_depth_-1000, climatologyMapArgoTe\
# mperature: plotANN_latlon_depth_-50, climatologyMapArgoTemperature: plotJFM_latlon_depth_-100, climatologyMapArgoSalinity: plotANN_latlon_depth_-25, \
# climatologyMapArgoSalinity: plotJFM_latlon_depth_-200, climatologyMapArgoSalinity: plotJAS_latlon_depth_-150, climatologyMapArgoSalinity: plotJAS_lat\
# lon_depth_-200, climatologyMapArgoSalinity: plotJAS_latlon_depth_-600, climatologyMapMLD: plotJFM_latlon, osnapTransects: plotANN_OSNAP_West_salinity\
# , climatologyMapSeaIceConcSH: plotJJA_antarctic_extended_Bootstrap
# See log files in /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v3_output/weekly_test_20250919/v3.LR.historical_0051/post/script\
# s/../analysis/mpas_analysis/ts_1985-1989_climo_1985-1989/logs for details.
# The following commands may be helpful:
# cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v3_output/weekly_test_20250919/v3.LR.historical_0051/post/scripts/../analysi\
# s/mpas_analysis/ts_1985-1989_climo_1985-1989/logs
# grep Error *.log
# The 6 WAITING jobs account for the 6 jobs remaining on `squeue`; cancel those
# Review the completed bundles jobs
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_bundles_output/weekly_test_20250919/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# Good, no errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_bundles_output/weekly_test_20250919/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# Good, no errors
# Let's proceed with the test suite to see if any more errors exist.
cd ~/ez/zppy
ls tests/integration/test_*.py
pytest tests/integration/test_bash_generation.py
# 1 failed in 2.05s, diffs appear to be expected based on recent changes on `main`
pytest tests/integration/test_campaign.py
# 6 failed in 6.72s, diffs appear to be expected based on recent changes on `main`
pytest tests/integration/test_defaults.py
# 1 failed in 5.67s, diffs appear to be expected based on recent changes on `main`
pytest tests/integration/test_last_year.py
# 1 failed in 1.68s, diff is because the prov file is always unique
pytest tests/integration/test_bundles.py
# 2 passed in 2.72s
pytest tests/integration/test_images.py
# Copy the output of test_images_summary.md to a Pull Request comment
# ============================================================== short test summary info ===============================================================
# FAILED tests/integration/test_images.py::test_images - assert 3031 == 2949
# =========================================================== 1 failed in 2671.08s (0:44:31) ===========================================================
cat test_images_summary.md
Complete testing summary
Diff subdir is where to find the lists of missing/mismatched images, the image diff grid, and the individual diffs.
Concise testing summary
Preliminary analysis
Initial debugging of `mpas_analysis` logs
grep -in error /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v2_output/weekly_test_20250919/v2.LR.historical_0201/post/scripts/mpas_analysis_ts_1980-1984_climo_1980-1984.o905816
# 102:ERROR in task climatologyMapMassFluxes: plot_seaIceFreshWaterFlux_ANN_arctic. See log file /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v2_output/weekly_test_20250919/v2.LR.historical_0201/post/scripts/../analysis/mpas_analysis/ts_1980-1984_climo_1980-1984/logs/climatologyMapMassFluxes_plot_seaIceFreshWaterFlux_ANN_arctic.log for details
# 481:There were errors in task climatologyMapMassFluxes: plot_seaIceFreshWaterFlux_ANN_arctic
grep -in error /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v2_output/weekly_test_20250919/v2.LR.historical_0201/post/scripts/../analysis/mpas_analysis/ts_1980-1984_climo_1980-1984/logs/climatologyMapMassFluxes_plot_seaIceFreshWaterFlux_ANN_arctic.log
# 30: raise ValueError(
# 31:ValueError: x and y arguments to pcolormesh cannot have non-finite values or be of type numpy.ma.MaskedArray with masked values
grep -in error /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v2_output/weekly_test_20250919/v2.LR.historical_0201/post/scripts/mpas_analysis_ts_1980-1984_climo_1980-1984.o905879
# 101:ERROR in task climatologyMapMassFluxes: plot_evaporationFlux_ANN_antarctic. See log file /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v2_output/weekly_test_20250919/v2.LR.historical_0201/post/scripts/../analysis/mpas_analysis/ts_1980-1984_climo_1980-1984/logs/climatologyMapMassFluxes_plot_evaporationFlux_ANN_antarctic.log for details
# 234:ERROR in task climatologyMapArgoTemperature: plotANN_latlon_depth_-150. See log file /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v2_output/weekly_test_20250919/v2.LR.historical_0201/post/scripts/../analysis/mpas_analysis/ts_1980-1984_climo_1980-1984/logs/climatologyMapArgoTemperature_plotANN_latlon_depth_-150.log for details
# 241:ERROR in task climatologyMapArgoTemperature: plotJAS_latlon_depth_-150. See log file /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v2_output/weekly_test_20250919/v2.LR.historical_0201/post/scripts/../analysis/mpas_analysis/ts_1980-1984_climo_1980-1984/logs/climatologyMapArgoTemperature_plotJAS_latlon_depth_-150.log for details
# 255:ERROR in task climatologyMapArgoSalinity: plotANN_latlon_depth_-400. See log file /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v2_output/weekly_test_20250919/v2.LR.historical_0201/post/scripts/../analysis/mpas_analysis/ts_1980-1984_climo_1980-1984/logs/climatologyMapArgoSalinity_plotANN_latlon_depth_-400.log for details
# 541:ERROR in task climatologyMapSeaIceConcNH: plotJFM_arctic_extended_Bootstrap. See log file /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v2_output/weekly_test_20250919/v2.LR.historical_0201/post/scripts/../analysis/mpas_analysis/ts_1980-1984_climo_1980-1984/logs/climatologyMapSeaIceConcNH_plotJFM_arctic_extended_Bootstrap.log for details
# 543:There were errors in 5 tasks: climatologyMapMassFluxes: plot_evaporationFlux_ANN_antarctic, climatologyMapArgoTemperature: plotANN_latlon_depth_-150, climatologyMapArgoTemperature: plotJAS_latlon_depth_-150, climatologyMapArgoSalinity: plotANN_latlon_depth_-400, climatologyMapSeaIceConcNH: plotJFM_arctic_extended_Bootstrap
# 547: grep Error *.log
grep -in error /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v2_output/weekly_test_20250919/v2.LR.historical_0201/post/scripts/../analysis/mpas_analysis/ts_1980-1984_climo_1980-1984/logs/climatologyMapMassFluxes_plot_evaporationFlux_ANN_antarctic.log
# 24: raise ValueError(
# 25:ValueError: x and y arguments to pcolormesh cannot have non-finite values or be of type numpy.ma.MaskedArray with masked values
grep -in error /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v2_output/weekly_test_20250919/v2.LR.historical_0201/post/scripts/../analysis/mpas_analysis/ts_1980-1984_climo_1980-1984/logs/climatologyMapArgoTemperature_plotANN_latlon_depth_-150.log
# 24: raise ValueError(
# 25:ValueError: x and y arguments to pcolormesh cannot have non-finite values or be of type numpy.ma.MaskedArray with masked values
grep -in error /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v2_output/weekly_test_20250919/v2.LR.historical_0201/post/scripts/../analysis/mpas_analysis/ts_1980-1984_climo_1980-1984/logs/climatologyMapArgoTemperature_plotJAS_latlon_depth_-150.log
# 24: raise ValueError(
# 25:ValueError: x and y arguments to pcolormesh cannot have non-finite values or be of type numpy.ma.MaskedArray with masked values
grep -in error /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v2_output/weekly_test_20250919/v2.LR.historical_0201/post/scripts/../analysis/mpas_analysis/ts_1980-1984_climo_1980-1984/logs/climatologyMapArgoSalinity_plotANN_latlon_depth_-400.log
# 24: raise ValueError(
# 25:ValueError: x and y arguments to pcolormesh cannot have non-finite values or be of type numpy.ma.MaskedArray with masked values
grep -in error /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v2_output/weekly_test_20250919/v2.LR.historical_0201/post/scripts/../analysis/mpas_analysis/ts_1980-1984_climo_1980-1984/logs/climatologyMapSeaIceConcNH_plotJFM_arctic_extended_Bootstrap.log
# 31:MemoryError: std::bad_alloc
grep -in error /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v3_output/weekly_test_20250919/v3.LR.historical_0051/post/scripts/mpas_analysis_ts_1985-1989_climo_1985-1989.o905909
# 103:ERROR in task climatologyMapMassFluxes: plot_evaporationFlux_JFM_latlon. See log file /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v3_output/weekly_test_20250919/v3.LR.historical_0051/post/scripts/../analysis/mpas_analysis/ts_1985-1989_climo_1985-1989/logs/climatologyMapMassFluxes_plot_evaporationFlux_JFM_latlon.log for details
# 230:ERROR in task climatologyMapVel: plotVelocityMagnitude_ANN_latlon_depth_-800. See log file /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v3_output/weekly_test_20250919/v3.LR.historical_0051/post/scripts/../analysis/mpas_analysis/ts_1985-1989_climo_1985-1989/logs/climatologyMapVel_plotVelocityMagnitude_ANN_latlon_depth_-800.log for details
# 234:ERROR in task climatologyMapVel: plotVelocityMagnitude_ANN_latlon_depth_-2000. See log file /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v3_output/weekly_test_20250919/v3.LR.historical_0051/post/scripts/../analysis/mpas_analysis/ts_1985-1989_climo_1985-1989/logs/climatologyMapVel_plotVelocityMagnitude_ANN_latlon_depth_-2000.log for details
# 266:ERROR in task climatologyMapWoa: plotTemperature_JFM_latlon_depth_-1000. See log file /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v3_output/weekly_test_20250919/v3.LR.historical_0051/post/scripts/../analysis/mpas_analysis/ts_1985-1989_climo_1985-1989/logs/climatologyMapWoa_plotTemperature_JFM_latlon_depth_-1000.log for details
# 280:ERROR in task climatologyMapArgoTemperature: plotANN_latlon_depth_-50. See log file /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v3_output/weekly_test_20250919/v3.LR.historical_0051/post/scripts/../analysis/mpas_analysis/ts_1985-1989_climo_1985-1989/logs/climatologyMapArgoTemperature_plotANN_latlon_depth_-50.log for details
# 301:ERROR in task climatologyMapArgoTemperature: plotJFM_latlon_depth_-100. See log file /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v3_output/weekly_test_20250919/v3.LR.historical_0051/post/scripts/../analysis/mpas_analysis/ts_1985-1989_climo_1985-1989/logs/climatologyMapArgoTemperature_plotJFM_latlon_depth_-100.log for details
# 308:ERROR in task climatologyMapArgoSalinity: plotANN_latlon_depth_-25. See log file /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v3_output/weekly_test_20250919/v3.LR.historical_0051/post/scripts/../analysis/mpas_analysis/ts_1985-1989_climo_1985-1989/logs/climatologyMapArgoSalinity_plotANN_latlon_depth_-25.log for details
# 323:ERROR in task climatologyMapArgoSalinity: plotJFM_latlon_depth_-200. See log file /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v3_output/weekly_test_20250919/v3.LR.historical_0051/post/scripts/../analysis/mpas_analysis/ts_1985-1989_climo_1985-1989/logs/climatologyMapArgoSalinity_plotJFM_latlon_depth_-200.log for details
# 333:ERROR in task climatologyMapArgoSalinity: plotJAS_latlon_depth_-150. See log file /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v3_output/weekly_test_20250919/v3.LR.historical_0051/post/scripts/../analysis/mpas_analysis/ts_1985-1989_climo_1985-1989/logs/climatologyMapArgoSalinity_plotJAS_latlon_depth_-150.log for details
# 335:ERROR in task climatologyMapArgoSalinity: plotJAS_latlon_depth_-200. See log file /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v3_output/weekly_test_20250919/v3.LR.historical_0051/post/scripts/../analysis/mpas_analysis/ts_1985-1989_climo_1985-1989/logs/climatologyMapArgoSalinity_plotJAS_latlon_depth_-200.log for details
# 338:ERROR in task climatologyMapArgoSalinity: plotJAS_latlon_depth_-600. See log file /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v3_output/weekly_test_20250919/v3.LR.historical_0051/post/scripts/../analysis/mpas_analysis/ts_1985-1989_climo_1985-1989/logs/climatologyMapArgoSalinity_plotJAS_latlon_depth_-600.log for details
# 339:ERROR in task climatologyMapMLD: plotJFM_latlon. See log file /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v3_output/weekly_test_20250919/v3.LR.historical_0051/post/scripts/../analysis/mpas_analysis/ts_1985-1989_climo_1985-1989/logs/climatologyMapMLD_plotJFM_latlon.log for details
# 399:ERROR in task osnapTransects: plotANN_OSNAP_West_salinity. See log file /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v3_output/weekly_test_20250919/v3.LR.historical_0051/post/scripts/../analysis/mpas_analysis/ts_1985-1989_climo_1985-1989/logs/osnapTransects_plotANN_OSNAP_West_salinity.log for details
# 582:ERROR in task climatologyMapSeaIceConcSH: plotJJA_antarctic_extended_Bootstrap. See log file /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v3_output/weekly_test_20250919/v3.LR.historical_0051/post/scripts/../analysis/mpas_analysis/ts_1985-1989_climo_1985-1989/logs/climatologyMapSeaIceConcSH_plotJJA_antarctic_extended_Bootstrap.log for details
# 584:There were errors in 14 tasks: climatologyMapMassFluxes: plot_evaporationFlux_JFM_latlon, climatologyMapVel: plotVelocityMagnitude_ANN_latlon_depth_-800, climatologyMapVel: plotVelocityMagnitude_ANN_latlon_depth_-2000, climatologyMapWoa: plotTemperature_JFM_latlon_depth_-1000, climatologyMapArgoTemperature: plotANN_latlon_depth_-50, climatologyMapArgoTemperature: plotJFM_latlon_depth_-100, climatologyMapArgoSalinity: plotANN_latlon_depth_-25, climatologyMapArgoSalinity: plotJFM_latlon_depth_-200, climatologyMapArgoSalinity: plotJAS_latlon_depth_-150, climatologyMapArgoSalinity: plotJAS_latlon_depth_-200, climatologyMapArgoSalinity: plotJAS_latlon_depth_-600, climatologyMapMLD: plotJFM_latlon, osnapTransects: plotANN_OSNAP_West_salinity, climatologyMapSeaIceConcSH: plotJJA_antarctic_extended_Bootstrap
# 588: grep Error *.log
# That's a lot to check; presumably we're running into the same issues as above.
Minor fixes
As part of this round of testing, I do have a commit with some fixes that I'd like to merge: #736
# Edit tests/integration/utils.py
# Reset TEST_SPECIFICS
python tests/integration/utils.py
pre-commit run --all-files
# Looks like we're still on the `main` branch, so let's checkout a new one.
git checkout -b test-fixes
git add -A
git commit -m "Unit and integration test fixes"
git push upstream test-fixes
# Push commit for test fixes
# Make PR: https://github.com/E3SM-Project/zppy/pull/736
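Since the mpas_analysis log triage above is "a lot to check", it could also be scripted: walk a run's `logs/` directory and report the first error-looking line in each log. A minimal sketch of that grep loop, with the logs path as a placeholder:

```python
import re
from pathlib import Path

ERROR_PATTERN = re.compile(r"error", re.IGNORECASE)


def first_errors(logs_dir: str) -> None:
    """Print the first line matching 'error' (case-insensitive) in each .log file."""
    for log_file in sorted(Path(logs_dir).glob("*.log")):
        for lineno, line in enumerate(
            log_file.read_text(errors="replace").splitlines(), start=1
        ):
            if ERROR_PATTERN.search(line):
                print(f"{log_file.name}:{lineno}: {line.strip()}")
                break


if __name__ == "__main__":
    # Placeholder: point at one ts_*_climo_* logs directory from the failing run.
    first_errors(
        "/lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v3_output/"
        "weekly_test_20250919/v3.LR.historical_0051/post/analysis/mpas_analysis/"
        "ts_1985-1989_climo_1985-1989/logs"
    )
```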
-
@forsyth2 thanks for the testing results. @xylar could you help investigate the mpas_analysis task?
-
2025-09-26 Thread
Testing process
lcrc_conda
cd ~/ez/e3sm_diags
git status
# Check for uncommitted changes
# Branch: main
git fetch upstream main
git reset --hard upstream/main
git log
# Last commit: fixing clevs for tc; an error that misses one storm (#1009)
# Good, matches https://github.com/E3SM-Project/e3sm_diags/commits/main
rm -rf build
conda clean --all --y # ~7 min to run
time conda env create -f conda-env/dev.yml -n e3sm-diags-main-20250926 # ~3 min to run
conda activate e3sm-diags-main-20250926
python -m pip install .
cd ~/ez/zppy-interfaces
git status
# Check for uncommitted changes
# Branch: add-pcmdi => Need to change branches
git fetch upstream main
git checkout main
git reset --hard upstream/main
git log
# Last commit: Add missing packages to dependencies (#32)
# Good, matches https://github.com/E3SM-Project/zppy-interfaces/commits/main
rm -rf build
conda clean --all --y # ~6 min to run
time conda env create -f conda/dev.yml -n zi-main-20250926 # ~3 min to run
conda activate zi-main-20250926
python -m pip install .
pytest tests/unit/global_time_series/test_*.py
# 10 passed in 24.22s
cd ~/ez/zppy
git status
# Check for uncommitted changes
# Branch: zppy_pcmdi_202505_rev => Need to change branches
git fetch upstream main
git checkout -b test-zppy-main-20250926 upstream/main
git log
# Last commit: Unit and integration test fixes (#736)
# Good, matches https://github.com/E3SM-Project/zppy/commits/main
rm -rf build
conda clean --all --y # ~7 min to run
time conda env create -f conda/dev.yml -n zppy-main-20250926 # real 18m26.351s
conda activate zppy-main-20250926
pytest tests/test_*.py
# 25 passed in 0.72s
# Edit tests/integration/utils.py:
# TEST_SPECIFICS: Dict[str, Any] = {
# "diags_environment_commands": "source /gpfs/fs1/home/ac.forsyth2/miniforge3/etc/profile.d/conda.sh; conda activate e3sm-diags-main-20250926",
# "global_time_series_environment_commands": "source /gpfs/fs1/home/ac.forsyth2/miniforge3/etc/profile.d/conda.sh; conda activate zi-main-20250926",
# "cfgs_to_run": [
# "weekly_bundles",
# "weekly_comprehensive_v2",
# "weekly_comprehensive_v3",
# "weekly_legacy_3.0.0_bundles",
# "weekly_legacy_3.0.0_comprehensive_v2",
# "weekly_legacy_3.0.0_comprehensive_v3",
# ],
# "tasks_to_run": ["e3sm_diags", "mpas_analysis", "global_time_series", "ilamb"],
# "unique_id": "weekly_test_20250926",
# }
python tests/integration/utils.py
pre-commit run --all-files
python -m pip install .
zppy -c tests/integration/generated/test_weekly_bundles_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_comprehensive_v2_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_comprehensive_v3_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_legacy_3.0.0_bundles_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_legacy_3.0.0_comprehensive_v2_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_legacy_3.0.0_comprehensive_v3_chrysalis.cfg
# Check on bundles status
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_bundles_output/weekly_test_20250926/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# Good, no errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_bundles_output/weekly_test_20250926/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# Good, no errors
# Now, run bundles part 2
cd ~/ez/zppy
zppy -c tests/integration/generated/test_weekly_bundles_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_legacy_3.0.0_bundles_chrysalis.cfg
# Review finished runs
### v2 ###
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v2_output/weekly_test_20250926/v2.LR.historical_0201/post/scripts
grep -v "OK" *status
# global_time_series_1980-1990.status:ERROR (8)
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v2_output/weekly_test_20250926/v2.LR.historical_0201/post/scripts
grep -v "OK" *status
# global_time_series_1980-1990.status:ERROR (8)
### v3 ###
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v3_output/weekly_test_20250926/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# Good, no errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v3_output/weekly_test_20250926/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# Good, no errors
### bundles ###
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_bundles_output/weekly_test_20250926/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# Good, no errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_bundles_output/weekly_test_20250926/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# Good, no errors
# Let's review those global_time_series errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v2_output/weekly_test_20250926/v2.LR.historical_0201/post/scripts
grep -v "OK" *status
# KeyError: 'rgn'
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v2_output/weekly_test_20250926/v2.LR.historical_0201/post/scripts
grep -v "OK" *status
# KeyError: 'rgn'
# That's the same error that appeared in the pcmdi_diags test,
# so it looks like the problem is in fact on `main`.
# Let's proceed with the test suite to see if any more errors exist.
cd ~/ez/zppy
ls tests/integration/test_*.py
pytest tests/integration/test_bash_generation.py
# 1 failed in 1.95s, diffs appear to be expected based on recent changes on `main`
pytest tests/integration/test_campaign.py
# 6 failed in 2.16s, diffs appear to be expected based on recent changes on `main`
pytest tests/integration/test_defaults.py
# 1 failed in 0.34s, diffs appear to be expected based on recent changes on `main`
pytest tests/integration/test_last_year.py
# 1 failed in 0.34s, diff is because the prov file is always unique
pytest tests/integration/test_bundles.py
# 2 passed in 0.22s
pytest tests/integration/test_images.py
# Copy the output of test_images_summary.md to a Pull Request comment
# =================================================================== short test summary info ===================================================================
# FAILED tests/integration/test_images.py::test_images - assert 3031 == 2745
# =============================================================== 1 failed in 2803.42s (0:46:43) ================================================================
cat test_images_summary.md
Complete testing summary
Diff subdir is where to find the lists of missing/mismatched images, the image diff grid, and the individual diffs.
Concise testing summary
Preliminary analysis
-
@forsyth2 thanks for the weekly testing. It appears that all the missing images from e3sm_diags are model-vs-model plots for land variables. The plots are not actually missing; instead, because of the new PR that adds land variables and groups them into categories based on the land-variable csv, the images are saved in a different directory structure (e.g., viewer/lat_lon_land/veg-state/btran-global/ann.html). The mismatched images are expected and can be explained by the code changes between the two versions.
-
2025-09-29 -- rerun of the 2025-09-26 test

Testing Process

lcrc_conda
cd ~/ez/e3sm_diags
git status
# Check for uncommitted changes
# Branch: main
git fetch upstream main
git reset --hard upstream/main
git log
# Last commit (from today Mon 9/29): Add ARs and ETCs detection script (#997)
# To make sure we're producing exactly the same plots as before,
# let's use the code as of Friday 9/26.
git reset --hard 4c2908d336b59104a8b5a4af74ec02c6509b6a22
git log
# Last commit (from Wed 9/24): fixing clevs for tc; an error that misses one storm (#1009)
# Good, matches https://github.com/E3SM-Project/e3sm_diags/commits/main
rm -rf build
conda clean --all --y
conda env create -f conda-env/dev.yml -n e3sm-diags-main-20250926-rerun
conda activate e3sm-diags-main-20250926-rerun
python -m pip install .
cd ~/ez/zppy-interfaces
git status
# Check for uncommitted changes
git fetch upstream main
git checkout main
git reset --hard upstream/main
git log
# Last commit (8/11): Add missing packages to dependencies (#32)
# Good, matches https://github.com/E3SM-Project/zppy-interfaces/commits/main
rm -rf build
conda clean --all --y
conda env create -f conda/dev.yml -n zi-main-20250926-rerun
conda activate zi-main-20250926-rerun
python -m pip install .
pytest tests/unit/global_time_series/test_*.py
# 10 passed in 24.96s
cd ~/ez/zppy
git status
# Check for uncommitted changes
# Branch: zppy_pcmdi_202505_rev => Need to change branches
git fetch upstream main
git checkout -b test-zppy-main-20250926-rerun upstream/main
git log
# Last commit: Unit and integration test fixes (#736)
# Good, matches https://github.com/E3SM-Project/zppy/commits/main
rm -rf build
conda clean --all --y
conda env create -f conda/dev.yml -n zppy-main-20250926-rerun
conda activate zppy-main-20250926-rerun
pytest tests/test_*.py
# 25 passed in 0.49s
# Edit tests/integration/utils.py:
# TEST_SPECIFICS: Dict[str, Any] = {
# "diags_environment_commands": "source /gpfs/fs1/home/ac.forsyth2/miniforge3/etc/profile.d/conda.sh; conda activate e3sm-diags-main-20250926-rerun",
# "global_time_series_environment_commands": "source /gpfs/fs1/home/ac.forsyth2/miniforge3/etc/profile.d/conda.sh; conda activate zi-main-20250926-rerun",
# "cfgs_to_run": [
# "weekly_bundles",
# "weekly_comprehensive_v2",
# "weekly_comprehensive_v3",
# "weekly_legacy_3.0.0_bundles",
# "weekly_legacy_3.0.0_comprehensive_v2",
# "weekly_legacy_3.0.0_comprehensive_v3",
# ],
# "tasks_to_run": ["e3sm_diags", "mpas_analysis", "global_time_series", "ilamb"],
# "unique_id": "weekly_test_20250926_rerun",
# }
python tests/integration/utils.py
pre-commit run --all-files
python -m pip install .
zppy -c tests/integration/generated/test_weekly_bundles_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_comprehensive_v2_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_comprehensive_v3_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_legacy_3.0.0_bundles_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_legacy_3.0.0_comprehensive_v2_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_legacy_3.0.0_comprehensive_v3_chrysalis.cfg
# ~2 hours to run
# Check on bundles status
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_bundles_output/weekly_test_20250926_rerun/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# Good, no errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_bundles_output/weekly_test_20250926_rerun/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# Good, no errors
# Now, run bundles part 2
cd ~/ez/zppy
zppy -c tests/integration/generated/test_weekly_bundles_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_legacy_3.0.0_bundles_chrysalis.cfg
# Review complete runs
### v2 ###
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v2_output/weekly_test_20250926_rerun/v2.LR.historical_0201/post/scripts
grep -v "OK" *status
# Good, no errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v2_output/weekly_test_20250926_rerun/v2.LR.historical_0201/post/scripts
grep -v "OK" *status
# Good, no errors
### v3 ###
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v3_output/weekly_test_20250926_rerun/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# Good, no errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v3_output/weekly_test_20250926_rerun/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# Good, no errors
### bundles ###
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_bundles_output/weekly_test_20250926_rerun/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# Good, no errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_bundles_output/weekly_test_20250926_rerun/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# Good, no errors
# Good, no errors in any of the output directories!
# Now, the tests themselves:
cd ~/ez/zppy
ls tests/integration/test_*.py
pytest tests/integration/test_bash_generation.py
# 1 failed in 6.51s, diffs appear to be expected based on recent changes on `main`
pytest tests/integration/test_campaign.py
# 6 failed in 4.07s, diffs appear to be expected based on recent changes on `main`
pytest tests/integration/test_defaults.py
# 1 failed in 2.80s, diffs appear to be expected based on recent changes on `main`
pytest tests/integration/test_last_year.py
# 1 failed in 0.72s, diff is because the prov file is always unique
pytest tests/integration/test_bundles.py
# 2 passed in 0.46s
hostname
# chrlogin1.lcrc.anl.gov
screen # Activate screen
lcrc_conda
conda activate zppy-main-20250926-rerun
pytest tests/integration/test_images.py 2>&1 | tee out.txt
# CTRL A D
# [detached from 3331906.pts-26.chrlogin1]
screen -ls
# There is a screen on:
# 3331906.pts-26.chrlogin1 (Detached)
# 1 Socket in /run/screen/S-ac.forsyth2.
tail out.txt
# Good, it's writing to the output file.
###############################################################################
# Picking up 9/30
hostname
# chrlogin1.lcrc.anl.gov
screen -ls
# There is a screen on:
# 3331906.pts-26.chrlogin1 (Detached)
# 1 Socket in /run/screen/S-ac.forsyth2.
screen -R
# Number of missing images: 0
# Number of mismatched images: 0
# Number of correct images: 3
# web_portal_base_path: /lcrc/group/e3sm/public_html/diagnostic_output
# web_portal_base_url: https://web.lcrc.anl.gov/public/e3sm/diagnostic_output
# Making image diff grid for /lcrc/group/e3sm/public_html/diagnostic_output/ac.forsyth2/zppy_weekly_legacy_3.0.0_bundles_www/weekly_test_20250926_rerun/v3.LR.historical_0051/image_check_failures_legacy_3.0.0_bundles/global_time_series
# Saving to:
# /lcrc/group/e3sm/public_html/diagnostic_output/ac.forsyth2/zppy_weekly_legacy_3.0.0_bundles_www/weekly_test_20250926_rerun/v3.LR.historical_0051/image_check_failures_legacy_3.0.0_bundles/global_time_series/image_diff_grid.pdf
# Web page will be at:
# https://web.lcrc.anl.gov/public/e3sm/diagnostic_output//ac.forsyth2/zppy_weekly_legacy_3.0.0_bundles_www/weekly_test_20250926_rerun/v3.LR.historical_0051/image_check_failures_legacy_3.0.0_bundles/global_time_series/image_diff_grid.pdf
# Opening expected images file /lcrc/group/e3sm/public_html/zppy_test_resources/image_list_expected_legacy_3.0.0_bundles.txt
# Reading expected images file /lcrc/group/e3sm/public_html/zppy_test_resources/image_list_expected_legacy_3.0.0_bundles.txt
# On line # 250
# Total: 388
# Number of missing images: 0
# Number of mismatched images: 0
# Number of correct images: 388
# web_portal_base_path: /lcrc/group/e3sm/public_html/diagnostic_output
# web_portal_base_url: https://web.lcrc.anl.gov/public/e3sm/diagnostic_output
# Making image diff grid for /lcrc/group/e3sm/public_html/diagnostic_output/ac.forsyth2/zppy_weekly_legacy_3.0.0_bundles_www/weekly_test_20250926_rerun/v3.LR.historical_0051/image_check_failures_legacy_3.0.0_bundles/ilamb
# Saving to:
# /lcrc/group/e3sm/public_html/diagnostic_output/ac.forsyth2/zppy_weekly_legacy_3.0.0_bundles_www/weekly_test_20250926_rerun/v3.LR.historical_0051/image_check_failures_legacy_3.0.0_bundles/ilamb/image_diff_grid.pdf
# Web page will be at:
# https://web.lcrc.anl.gov/public/e3sm/diagnostic_output//ac.forsyth2/zppy_weekly_legacy_3.0.0_bundles_www/weekly_test_20250926_rerun/v3.LR.historical_0051/image_check_failures_legacy_3.0.0_bundles/ilamb/image_diff_grid.pdf
# Copy the output of test_images_summary.md to a Pull Request comment
# =========================== short test summary info ============================
# FAILED tests/integration/test_images.py::test_images - assert 3031 == 2745
# ======================== 1 failed in 3351.13s (0:55:51) ========================
exit # Exit screen
cd ~/ez/zppy
tail -n 4 out.txt
# Copy the output of test_images_summary.md to a Pull Request comment
# =========================== short test summary info ============================
# FAILED tests/integration/test_images.py::test_images - assert 3031 == 2745
# ======================== 1 failed in 3351.13s (0:55:51) ========================
cat test_images_summary.md

Complete testing summary

Diff subdir is where to find the lists of missing/mismatched images, the image diff grid, and the individual diffs.
Concise testing summary

Diff subdir is where to find the lists of missing/mismatched images, the image diff grid, and the individual diffs.
NOTE: These links no longer work because I've since updated the expected results and rerun the tests. That is, there are no longer any errors reported.

Analysis

Comparing these results with the original run, we see:
That is, these results are all good. We are now able to update the expected results!

Updating expected results

# First, let's copy over the old expected results
cd /lcrc/group/e3sm/public_html/
# Using 20250926 because we're testing off code as if it was from that day.
# (Recall that e3sm_diags had a new commit on 9/29)
cp -r zppy_test_resources/ zppy_test_resources_previous/expected_results_until_20250926 # ~14 min to run
# Check old expected results were copied over correctly, so we have a record:
ls zppy_test_resources_previous/expected_results_until_20250926
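# (Optional sketch, not run here: rsync reports progress and can resume if the
# ~14-minute copy is interrupted; the plain cp -r above works just as well.)
rsync -a --info=progress2 zppy_test_resources/ zppy_test_resources_previous/expected_results_until_20250926/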
# Now, make new expected results for all the tests
cd ~/ez/zppy
git status
# Only one of the update scripts has a diff:
# tests/integration/generated/update_weekly_expected_files_chrysalis.sh
# That file uses multiple `#expand` parameters.
# The others only use `#expand` for the expected dir, which is consistent.
# template_update_{bash_generation, campaign, defaults}_expected_files.sh
# Let's update the simpler tests' results first:
ls tests/integration/test_*.py
# tests/integration/test_bash_generation.py tests/integration/test_defaults.py
# tests/integration/test_bundles.py tests/integration/test_images.py
# tests/integration/test_campaign.py tests/integration/test_last_year.py
lcrc_conda
conda activate zppy-main-20250926-rerun
./tests/integration/generated/update_bash_generation_expected_files_chrysalis.sh
# Only in test_bash_generation_output/post/scripts: provenance.20250930_164307_367483.cfg
# 1 failed in 1.60s
# As noted in https://github.com/E3SM-Project/zppy/discussions/634#discussioncomment-13726657,
# `zppy -c tests/integration/test_bash_generation.cfg` is going to create a new provenance each time!
# https://github.com/E3SM-Project/zppy/issues/728 was created to address this at some point.
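# (Illustrative sketch only, not the fix that was actually made: when copying
# new expected results, the ever-regenerated provenance cfgs could simply be
# skipped; <output_dir> and <expected_dir> are placeholders.)
rsync -a --exclude 'provenance.*.cfg' <output_dir>/ <expected_dir>/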
./tests/integration/generated/update_campaign_expected_files_chrysalis.sh
# Same issue as above
# 6 failed in 5.45s
./tests/integration/generated/update_defaults_expected_files_chrysalis.sh
# Same issue as above
# 1 failed in 0.84s
# test_last_year doesn't need an update script because its cfg is stand-alone:
pytest -vv tests/integration/test_last_year.py
# Actual has provenance.20250617_010036_322732.cfg, whereas Expected doesn't have a prov cfg at all.
# That appears to be because the directory `zppy/test_last_year_output/` was never being deleted.
# Made some changes to the files
# 1 passed in 1.09s
# Let's edit the other test files accordingly.
./tests/integration/generated/update_defaults_expected_files_chrysalis.sh
# 1 passed in 1.22s
./tests/integration/generated/update_campaign_expected_files_chrysalis.sh
# 6 passed in 1.50s
./tests/integration/generated/update_bash_generation_expected_files_chrysalis.sh
# For this one, we needed to remove the prov cfg in the expected results as well.
# Default & campaign only copied over specific files,
# but this one copies over the entire directory.
# 1 passed in 1.01s
# Also updated test_images.py to make sure the tasks_to_run copy is always defined.
# Now update the expected files for the major tests:
./tests/integration/generated/update_weekly_expected_files_chrysalis.sh # 10:29
# For these, we rerun the tests in distinct commands:
pytest tests/integration/test_bundles.py
# 2 passed in 0.88s
pytest tests/integration/test_images.py # 10:55
# 1 passed in 3785.53s (1:03:05)
# That's ~20 minutes longer than before
cat test_images_summary.md
# No errors to report

All expected results have been updated. The previous expected results have been moved to an archive. All integration & unit tests now pass.

Some fixes to merge

git status
# Reset TEST_SPECIFICS in tests/integration/utils.py:
git diff tests/integration/utils.py
# All diffs were part of TEST_SPECIFICS, so we can just restore that file
git restore tests/integration/utils.py
git diff tests/integration/utils.py
# Now, no diffs
# Reset the generated files:
python tests/integration/utils.py
git status
rm -rf out.txt www/
git diff
# All remaining diffs are the test fixes
git add -A
# Change the branch name
# We're resolving https://github.com/E3SM-Project/zppy/issues/728
git branch -m fix-issue-728
pre-commit run --all-files
git add -A
git commit -m "Integration test fixes"
git push upstream fix-issue-728

I added some fixes on the testing side: #737
-
2025-10-03 Thread

What changes are we testing?
Testing process

lcrc_conda
cd ~/ez/e3sm_diags
git status
# branch: main
# nothing to commit
git fetch upstream main
git reset --hard upstream/main
git log
# Last commit: Add ARs and ETCs detection script (#997)
# Good, matches https://github.com/E3SM-Project/e3sm_diags/commits/main
rm -rf build
conda clean --all --y
conda env create -f conda-env/dev.yml -n e3sm-diags-main-20251003
conda activate e3sm-diags-main-20251003
python -m pip install .
cd ~/ez/zppy-interfaces
git status
# branch: add-pcmdi
# nothing to commit
git fetch upstream main
git checkout main
git reset --hard upstream/main
git log
# Last commit: Comment out docs build (#36)
# Good, matches https://github.com/E3SM-Project/zppy-interfaces/commits/main
rm -rf build
conda clean --all --y
conda env create -f conda/dev.yml -n zi-main-20251003
conda activate zi-main-20251003
python -m pip install .
pytest tests/unit/global_time_series/test_*.py
# 10 passed in 24.03s
cd ~/ez/zppy
# branch: zppy_pcmdi_202505_rev
# Uncommitted changes
git add -A
git commit -m "Testing" --no-verify
git fetch upstream main
git checkout -b test-zppy-main-20251003 upstream/main
git log
# Last commit: Switch packaging to pyproject.toml (#739)
# Good, matches https://github.com/E3SM-Project/zppy/commits/main
rm -rf build
conda clean --all --y
conda env create -f conda/dev.yml -n zppy-main-20251003
conda activate zppy-main-20251003
pytest tests/test_*.py
# 25 passed in 1.39s
# Edit tests/integration/utils.py:
# TEST_SPECIFICS: Dict[str, Any] = {
# "diags_environment_commands": "source /gpfs/fs1/home/ac.forsyth2/miniforge3/etc/profile.d/conda.sh; conda activate e3sm-diags-main-20251003",
# "global_time_series_environment_commands": "source /gpfs/fs1/home/ac.forsyth2/miniforge3/etc/profile.d/conda.sh; conda activate zi-main-20251003",
# "cfgs_to_run": [
# "weekly_bundles",
# "weekly_comprehensive_v2",
# "weekly_comprehensive_v3",
# "weekly_legacy_3.0.0_bundles",
# "weekly_legacy_3.0.0_comprehensive_v2",
# "weekly_legacy_3.0.0_comprehensive_v3",
# ],
# "tasks_to_run": ["e3sm_diags", "mpas_analysis", "global_time_series", "ilamb"],
# "unique_id": "weekly_test_20251003",
# }
python tests/integration/utils.py
pre-commit run --all-files
python -m pip install .
# Add the commit from https://github.com/E3SM-Project/zppy/pull/738
git fetch upstream further-test-fixes
git cherry-pick b0132f0600cb6f867e6f45b37f2acf2d32a49c59
python tests/integration/utils.py
pre-commit run --all-files
python -m pip install .
zppy -c tests/integration/generated/test_weekly_bundles_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_comprehensive_v2_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_comprehensive_v3_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_legacy_3.0.0_bundles_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_legacy_3.0.0_comprehensive_v2_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_legacy_3.0.0_comprehensive_v3_chrysalis.cfg
# Check on bundles status
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_bundles_output/weekly_test_20251003/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# Good, no errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_bundles_output/weekly_test_20251003/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# Good, no errors
# Now, run bundles part 2
cd ~/ez/zppy
zppy -c tests/integration/generated/test_weekly_bundles_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_legacy_3.0.0_bundles_chrysalis.cfg
# Review finished runs
### v2 ###
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v2_output/weekly_test_20251003/v2.LR.historical_0201/post/scripts
grep -v "OK" *status
# Good, no errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v2_output/weekly_test_20251003/v2.LR.historical_0201/post/scripts
grep -v "OK" *status
# Good, no errors
### v3 ###
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v3_output/weekly_test_20251003/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# Good, no errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v3_output/weekly_test_20251003/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# Good, no errors
### bundles ###
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_bundles_output/weekly_test_20251003/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# Good, no errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_bundles_output/weekly_test_20251003/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# Good, no errors
cd ~/ez/zppy
ls tests/integration/test_*.py
pytest tests/integration/test_bash_generation.py
# 1 failed in 2.27s, diffs appear to be expected based on recent changes on `main`
pytest tests/integration/test_campaign.py
# 6 passed in 2.09s
pytest tests/integration/test_defaults.py
# 1 passed in 0.44s
pytest tests/integration/test_last_year.py
# 1 passed in 0.27s
pytest tests/integration/test_bundles.py
# 2 passed in 1.05s
pytest tests/integration/test_images.py
# 1 failed in 3367.01s (0:56:07)
# Copy the output of test_images_summary.md to a Pull Request comment
# =================================================================== short test summary info ===================================================================
# FAILED tests/integration/test_images.py::test_images - assert 3031 == 2745
cat test_images_summary.md

Complete results

Diff subdir is where to find the lists of missing/mismatched images, the image diff grid, and the individual diffs.
Concise results
This test fails because we changed the v3 test cases, so this is expected. Let's take a look manually:
All these look good.

Updating expected results

# First, let's copy over the old expected results
cd /lcrc/group/e3sm/public_html/
cp -r zppy_test_resources/ zppy_test_resources_previous/expected_results_until_20251003
# Check old expected results were copied over correctly, so we have a record:
ls zppy_test_resources_previous/expected_results_until_20251003
# Now, make new expected results for all the tests
cd ~/ez/zppy
git status
# Only one of the update scripts has a diff:
# tests/integration/generated/update_weekly_expected_files_chrysalis.sh
# That file uses multiple `#expand` parameters.
# The others only use `#expand` for the expected dir, which is consistent.
# template_update_{bash_generation, campaign, defaults}_expected_files.sh
# Let's update the simpler tests' results first:
ls tests/integration/test_*.py
# tests/integration/test_bash_generation.py tests/integration/test_defaults.py
# tests/integration/test_bundles.py tests/integration/test_images.py
# tests/integration/test_campaign.py tests/integration/test_last_year.py
./tests/integration/generated/update_bash_generation_expected_files_chrysalis.sh
# 1 passed in 1.72s
./tests/integration/generated/update_campaign_expected_files_chrysalis.sh
# 6 failed in 1.86s
# Just rerun it; it was expecting files to exist...
# (The test output directories get removed if the test was successful,
# so there was nothing for the update script to take as new results).
./tests/integration/generated/update_campaign_expected_files_chrysalis.sh
# 6 passed in 2.07s
./tests/integration/generated/update_defaults_expected_files_chrysalis.sh
# 1 failed in 0.92s
# Same as above, just rerun:
./tests/integration/generated/update_defaults_expected_files_chrysalis.sh
# 1 passed in 0.43s
# test_last_year doesn't need an update script because its cfg is stand-alone:
pytest -vv tests/integration/test_last_year.py
# 1 passed in 0.36s
# Now update the expected files for the major tests:
./tests/integration/generated/update_weekly_expected_files_chrysalis.sh # < 34 minutes
# For these, we rerun the tests in distinct commands:
pytest tests/integration/test_bundles.py
# 2 passed in 1.17s
pytest tests/integration/test_images.py
# 1 passed in 3460.87s (0:57:40)
cat test_images_summary.md
# No errors to report

All expected results have been updated. The previous expected results have been moved to an archive. All integration & unit tests now pass. This also served as another test of the testing fix commit at #738.
-
2025-10-13 Thread

We're testing the

What changes are we testing?
Testing process

lcrc_conda
cd ~/ez/e3sm_diags
git status
# branch: main
# nothing to commit
git fetch upstream main
git reset --hard upstream/main
git log
# Last commit: Bump to 3.1.0rc1 (#1010)
# Good, matches https://github.com/E3SM-Project/e3sm_diags/commits/main
rm -rf build
conda clean --all --y
conda env create -f conda-env/dev.yml -n e3sm-diags-main-20251013
conda activate e3sm-diags-main-20251013
python -m pip install .
cd ~/ez/zppy-interfaces
git status
# branch: pcmdi-updates
# nothing to commit
git fetch upstream main
git checkout main
git reset --hard upstream/main
git log
# Last commit: Further pcmdi_diags updates (#41)
# Good, matches https://github.com/E3SM-Project/zppy-interfaces/commits/main
rm -rf build
conda clean --all --y
conda env create -f conda/dev.yml -n zi-main-20251013
conda activate zi-main-20251013
python -m pip install .
pytest tests/unit/global_time_series/test_*.py
# 10 passed in 27.72s
pytest tests/unit/pcmdi_diags/test_*.py
# 7 passed, 2 warnings in 16.56s
cd ~/ez/zppy
git status
# branch: pcmdi-updates
# nothing to commit
git fetch upstream main
git checkout -b test-zppy-main-20251013 upstream/main
git log
# Last commit: Merge pull request #742 from E3SM-Project/pcmdi-updates
# Good, matches https://github.com/E3SM-Project/zppy/commits/main
rm -rf build
conda clean --all --y
conda env create -f conda/dev.yml -n zppy-main-20251013
conda activate zppy-main-20251013
pytest tests/test_*.py
# 35 passed in 1.27s
# Edit tests/integration/utils.py:
# TEST_SPECIFICS: Dict[str, Any] = {
# "diags_environment_commands": "source /gpfs/fs1/home/ac.forsyth2/miniforge3/etc/profile.d/conda.sh; conda activate e3sm-diags-main-20251013",
# "global_time_series_environment_commands": "source /gpfs/fs1/home/ac.forsyth2/miniforge3/etc/profile.d/conda.sh; conda activate zi-main-20251013",
# "pcmdi_diags_environment_commands": "source /gpfs/fs1/home/ac.forsyth2/miniforge3/etc/profile.d/conda.sh; conda activate zi-main-20251013",
# "cfgs_to_run": [
# "weekly_bundles",
# "weekly_comprehensive_v2",
# "weekly_comprehensive_v3",
# "weekly_legacy_3.0.0_bundles",
# "weekly_legacy_3.0.0_comprehensive_v2",
# "weekly_legacy_3.0.0_comprehensive_v3",
# ],
# "tasks_to_run": [
# "e3sm_diags",
# "mpas_analysis",
# "global_time_series",
# "ilamb",
# "pcmdi_diags",
# ],
# "unique_id": "weekly_test_20251013",
# }
python tests/integration/utils.py
pre-commit run --all-files
python -m pip install .
zppy -c tests/integration/generated/test_weekly_bundles_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_comprehensive_v2_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_comprehensive_v3_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_legacy_3.0.0_bundles_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_legacy_3.0.0_comprehensive_v2_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_legacy_3.0.0_comprehensive_v3_chrysalis.cfg
# zppy.utils.ParameterNotProvidedError: cmip_plevdata was not provided, and inferring is turned off. Turn on inferring by setting infer_path_parameters to True.
# Parameter inference being turned off breaks backwards compatibility.
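# (Sketch of a possible workaround, not applied here: the legacy cfgs could opt
# back in to path inference by setting, presumably in their [default] section:
#   infer_path_parameters = True
# The option name comes from the error message above; the section placement is
# an assumption.)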
# Check on bundles status
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_bundles_output/weekly_test_20251013/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# Good, no errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_bundles_output/weekly_test_20251013/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# Good, no errors
# Now, run bundles part 2
cd ~/ez/zppy
zppy -c tests/integration/generated/test_weekly_bundles_chrysalis.cfg
zppy -c tests/integration/generated/test_weekly_legacy_3.0.0_bundles_chrysalis.cfg
# Review finished runs
### v2 ###
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v2_output/weekly_test_20251013/v2.LR.historical_0201/post/scripts
grep -v "OK" *status
# Good, no errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v2_output/weekly_test_20251013/v2.LR.historical_0201/post/scripts
grep -v "OK" *status
# Good, no errors
### v3 ###
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_comprehensive_v3_output/weekly_test_20251013/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# Good, no errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_comprehensive_v3_output/weekly_test_20251013/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# Good, no errors
# However:
ls *.status
# Only the climo and ts tasks ran because of the launch problem re: path inference
### bundles ###
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_bundles_output/weekly_test_20251013/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# Good, no errors
cd /lcrc/group/e3sm/ac.forsyth2/zppy_weekly_legacy_3.0.0_bundles_output/weekly_test_20251013/v3.LR.historical_0051/post/scripts
grep -v "OK" *status
# Good, no errors
cd ~/ez/zppy
ls tests/integration/test_*.py
pytest tests/integration/test_bash_generation.py
# 1 failed in 1.92s, diffs appear to be expected based on recent changes on `main`
pytest tests/integration/test_campaign.py
# 6 passed in 2.09s, same as above
pytest tests/integration/test_defaults.py
# 1 failed in 0.50s, same as above
pytest tests/integration/test_last_year.py
# 1 passed in 0.32s
pytest tests/integration/test_bundles.py
# 2 passed in 0.28s
emacs tests/integration/utils.py
# Comment out legacy test that failed to launch
# Comment out pcmdi_diags since we don't have expected results yet
# # "weekly_legacy_3.0.0_comprehensive_v3",
# #"pcmdi_diags",
pytest tests/integration/test_images.py
# 1 passed in 2500.69s (0:41:40)
cat test_images_summary.md

Complete results

Diff subdir is where to find the lists of missing/mismatched images, the image diff grid, and the individual diffs.
Concise results

No explicit test failures. Two things to note:
-
Starting this page to log the process/results for the weekly runs of zppy's integration testing: bundles, comprehensive_v2, comprehensive_v3.