
Conversation

@ndkeen (Contributor) commented Sep 5, 2025

After the Aug 20 NERSC maintenance, we needed to remove the module load of evp-patch.
This PR does that and updates the machine files to be more consistent with master for pm-cpu/pm-gpu while staying BFB.
The bulk of the changes are for internal test machines (alvarez/muller), which are useful to have for testing.

[BFB]

Also includes various updates to machine files to stay consistent with master.
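
For context, the module-file side of the change amounts to dropping the evp-patch load from the Perlmutter entries in the machine config. A minimal way to confirm nothing still references it, assuming the standard E3SM layout (the path is illustrative):

# hypothetical check; cime_config/machines is the usual location of the machine files
grep -rn "evp-patch" cime_config/machines/ \
  && echo "evp-patch is still referenced" \
  || echo "no evp-patch references remain"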
@ndkeen ndkeen self-assigned this Sep 5, 2025
@ndkeen ndkeen added the Machine Files, pm-gpu (Perlmutter at NERSC, GPU nodes), pm-cpu (Perlmutter at NERSC, CPU-only nodes), and maint-3.0 labels Sep 5, 2025
@ndkeen (Contributor, Author) commented Sep 5, 2025

For pm-cpu, I tested the e3sm_integration suite with intel and the e3sm_developer suite with gnu.
All tests passed except MOAB, which I think needs source changes to build.
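
For reference, a sketch of what those suite runs look like as create_test invocations; the test-id values here are made up, but --compiler and -m are standard create_test options:

create_test e3sm_integration --compiler intel -m pm-cpu -t integration-intel
create_test e3sm_developer --compiler gnu -m pm-cpu -t developer-gnu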

For pm-gpu, I used e3sm_scream_v1.
There is a known build issue discussed in #6831.
The quick patch mentioned there worked fine, so I continued testing with it, but I'm not sure of the best way to incorporate it into maint-3.0.
Update the radiation submodule? Update YAKL? Use kokkos-radiation instead?

There was also one test that still failed to build, with the same issue as #6888; it can be fixed by updating PAM.
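
If the fix does go through a submodule bump (radiation or PAM), the usual pattern is to move the submodule pointer and commit it in the superproject. A generic sketch, where <submodule-path> and <fixed-commit> are placeholders rather than the actual paths in maint-3.0:

cd <submodule-path>           # e.g. the radiation or PAM external
git fetch origin
git checkout <fixed-commit>   # commit containing the build fix
cd -
git add <submodule-path>
git commit -m "Bump submodule to pick up build fix"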

@ndkeen (Contributor, Author) commented Sep 5, 2025

To verify BFB, I started with a June checkout of maint-3.0 (with the evp-patch module load removed) and created a set of baselines:

create_test e3sm_prod --generate -b maint30 

then, using this PR, I compared against those baselines:

create_test e3sm_prod --compare -b maint30 

All tests passed.
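
For anyone reproducing this, the per-test results can also be re-checked after the fact via the cs.status script that create_test writes into the test root; the script name carries the test id, and the path below is only illustrative of the NERSC scratch layout:

cd $SCRATCH/e3sm_scratch/pm-cpu   # hypothetical test root
./cs.status.<testid>              # summarizes PASS/FAIL, including baseline compares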

@ndkeen ndkeen merged commit ad6c0e9 into maint-3.0 Sep 5, 2025
6 checks passed
@ndkeen ndkeen deleted the ndk/maint30/perlmutter-env-update-after-Aug20-nersc-maint branch September 5, 2025 18:09