Closed
Changes from all commits (41 commits)
924678a  CDAT Migration Phase 2: Refactor core utilities and `lat_lon` set (#… (tomvothecoder, Oct 10, 2023)
14126ec  Regression testing for lat_lon variables `NET_FLUX_SRF` and `RESTOM` … (tomvothecoder, Dec 5, 2023)
bb52037  Update regression test notebook to show validation of all vars (tomvothecoder, Dec 5, 2023)
b40d57e  Add `subset_and_align_datasets()` to regrid.py (#776) (tomvothecoder, Jan 8, 2024)
6e40365  Add template run scripts (tomvothecoder, Jan 24, 2024)
d68b63d  CDAT Migration Phase: Refactor `cosp_histogram` set (#748) (tomvothecoder, Jan 30, 2024)
8c5c447  CDAT Migration Phase 2: Refactor `zonal_mean_2d()` and `zonal_mean_2d… (tomvothecoder, Feb 14, 2024)
4531271  Refactor 654 zonal mean xy (#752) (chengzhuzhang, Feb 15, 2024)
088badf  CDAT Migration - Update run script output directory to NERSC public w… (tomvothecoder, Feb 26, 2024)
daeaa71  [PR]: CDAT Migration: Refactor `aerosol_aeronet` set (#788) (tomvothecoder, Feb 27, 2024)
1c3d2f5  CDAT Migration: Test `lat_lon` set with run script and debug any issu… (tomvothecoder, Mar 12, 2024)
f69ee14  CDAT Migration: Refactor `polar` set (#749) (forsyth2, Mar 14, 2024)
e212595  Align order of calls to `_set_param_output_attrs` (tomvothecoder, Mar 19, 2024)
908089e  CDAT Migration: Refactor `meridional_mean_2d` set (#795) (tomvothecoder, Mar 25, 2024)
f248bef  CDAT Migration: Refactor `aerosol_budget` (#800) (tomvothecoder, May 13, 2024)
16bd0ef  Add `acme.py` changes from PR #712 (#814) (tomvothecoder, May 13, 2024)
80e1bb1  Refactor area_mean_time_series and add ccb slice flag feature (#750) (forsyth2, May 30, 2024)
0089118  [Refactor]: Validate fix in PR #750 for #759 (#815) (tomvothecoder, Jun 3, 2024)
dd8002a  CDAT Migration Phase 2: Refactor `diurnal_cycle` set (#819) (tomvothecoder, Jul 22, 2024)
3088e0e  CDAT Migration: Refactor annual_cycle_zonal_mean set (#798) (chengzhuzhang, Jul 25, 2024)
907d2f0  CDAT Migration Phase 2: Refactor `qbo` set (#826) (tomvothecoder, Jul 30, 2024)
028a20b  CDAT Migration Phase 2: Refactor tc_analysis set (#829) (chengzhuzhang, Aug 19, 2024)
dda0bd8  CDAT Migration Phase 2: Refactor `enso_diags` set (#832) (tomvothecoder, Aug 22, 2024)
0da3a27  CDAT Migration Phase 2: Refactor `streamflow` set (#837) (tomvothecoder, Aug 26, 2024)
48af0a1  [Bug]: CDAT Migration Phase 2: enso_diags plot fixes (#841) (tomvothecoder, Aug 26, 2024)
c781adb  [Refactor]: CDAT Migration Phase 3: testing and documentation update … (tomvothecoder, Sep 27, 2024)
d758d3d  CDAT Migration Phase 3 - Port QBO Wavelet feature to Xarray/xCDAT cod… (tomvothecoder, Oct 1, 2024)
b154585  CDAT Migration Phase 2: Refactor arm_diags set (#842) (chengzhuzhang, Oct 1, 2024)
d6ee173  Add performance benchmark material (#864) (tomvothecoder, Oct 7, 2024)
d9feeb4  Add function to add CF axis attr to Z axis if missing for downstream … (tomvothecoder, Oct 9, 2024)
ff05407  CDAT Migration Phase 3: Add Convective Precipitation Fraction in lat-… (tomvothecoder, Oct 25, 2024)
c1d529d  CDAT Migration Phase 3: Fix LHFLX name and add catch for non-existent… (tomvothecoder, Oct 25, 2024)
e268e4d  Add support for time series datasets via glob and fix `enso_diags` se… (tomvothecoder, Oct 28, 2024)
8031e2f  Add fix for checking `is_time_series()` property based on `data_type`… (tomvothecoder, Oct 29, 2024)
7550b3d  CDAT migration: Fix African easterly wave density plots in TC analysi… (tomvothecoder, Oct 29, 2024)
8b97aa6  CDAT Migration: Update `mp_partition_driver.py` to use Dataset from `… (tomvothecoder, Oct 31, 2024)
43d2df7  CDAT Migration - Port JJB tropical subseasonal diags to Xarray/xCDAT … (tomvothecoder, Nov 4, 2024)
3406bf2  CDAT Migration: Prepare branch for merge to `main` (#885) (tomvothecoder, Nov 5, 2024)
cb1cfb2  [Refactor]: CDAT Migration - Update dependencies and remove Dataset._… (tomvothecoder, Nov 8, 2024)
7b9b9be  [DevOps]: CDAT Migration: Replace `setup.py` with `pyproject.toml` fo… (tomvothecoder, Nov 20, 2024)
d0b01eb  Add `complete_run_script.py` (tomvothecoder, Nov 20, 2024)
3 changes: 3 additions & 0 deletions .coveragerc
@@ -0,0 +1,3 @@
[report]
exclude_also =
if TYPE_CHECKING:
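The new `exclude_also` entry keeps typing-only branches out of coverage totals, since code under `if TYPE_CHECKING:` never runs at runtime. A minimal sketch of the pattern this targets (the function and its import are illustrative, not part of e3sm_diags):

```python
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Only evaluated by static type checkers (mypy); skipped at runtime,
    # so .coveragerc's `exclude_also` excludes this block from coverage.
    from collections.abc import Sequence


def head(values: "Sequence[int]") -> int:
    """Return the first element of a sequence."""
    return values[0]


print(head([3, 1, 2]))  # → 3
```

Without the `exclude_also` rule, the guarded import would count as a permanently uncovered line in every report.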
8 changes: 4 additions & 4 deletions .github/workflows/build_workflow.yml
@@ -5,7 +5,7 @@ on:
branches: [main]

pull_request:
branches: [main]
branches: [main, cdat-migration-fy24]

workflow_dispatch:

@@ -29,10 +29,10 @@ jobs:
uses: actions/checkout@v3

- if: ${{ steps.skip_check.outputs.should_skip != 'true' }}
name: Set up Python 3.10
name: Set up Python 3.11
uses: actions/setup-python@v3
with:
python-version: "3.10"
python-version: "3.11"

- if: ${{ steps.skip_check.outputs.should_skip != 'true' }}
# Run all pre-commit hooks on all the files.
Expand All @@ -50,7 +50,7 @@ jobs:
shell: bash -l {0}
strategy:
matrix:
python-version: ["3.9", "3.10"]
python-version: ["3.9", "3.10", "3.11", "3.12"]
steps:
- id: skip_check
uses: fkirc/skip-duplicate-actions@master
1 change: 1 addition & 0 deletions .gitignore
@@ -110,6 +110,7 @@ ENV/

# NetCDF files needed
!e3sm_diags/driver/acme_ne30_ocean_land_mask.nc
!auxiliary_tools/cdat_regression_testing/759-slice-flag/debug/*.nc

# Folder for storing quality assurance files and notes
qa/
5 changes: 3 additions & 2 deletions .pre-commit-config.yaml
@@ -1,5 +1,5 @@
exclude: "docs|node_modules|migrations|.git|.tox|examples|analysis_data_preprocess|auxiliary_tools|conda/meta.yaml|e3sm_diags/driver/utils/zwf_functions.py"
default_stages: [commit]
default_stages: [pre-commit]
fail_fast: true

repos:
@@ -34,4 +34,5 @@ repos:
hooks:
- id: mypy
args: [--config=pyproject.toml]
additional_dependencies: [dask, numpy>=1.23.0, types-PyYAML]
additional_dependencies:
[dask, numpy>=1.23.0, xarray>=2023.3.0, types-PyYAML]
2 changes: 1 addition & 1 deletion .vscode/e3sm_diags.code-workspace
@@ -58,7 +58,7 @@
"configurations": [
{
"name": "Python: Current File",
"type": "python",
"type": "debugpy",
"request": "launch",
"program": "${file}",
"console": "integratedTerminal",
File renamed without changes.
131 changes: 67 additions & 64 deletions auxiliary_tools/aerosol_budget.py
@@ -1,3 +1,5 @@
# NOTE: This module uses the deprecated e3sm_diags.driver.utils.dataset.Dataset
# class, which was replaced by e3sm_diags.driver.utils.dataset_xr.Dataset.
import e3sm_diags
from e3sm_diags.driver import utils
import cdms2
@@ -12,11 +14,12 @@


def global_integral(var, area_m2):
""" Compute global integral of 2 dimentional properties"""
return numpy.sum(numpy.sum(abs(var)*area_m2,axis = 0), axis=0)
"""Compute global integral of 2 dimentional properties"""
return numpy.sum(numpy.sum(abs(var) * area_m2, axis=0), axis=0)


def calc_column_integral(data, aerosol, season):
""" Calculate column integrated mass """
"""Calculate column integrated mass"""

# take aerosol and change it to the appropriate string
# ncl -> SEASALT, dst -> DUST, rest1 -> REST1
@@ -32,129 +35,129 @@ def calc_column_integral(data, aerosol, season):
burden = data.get_climo_variable(f"ABURDEN{aerosol_name}", season)
except RuntimeError:
# if not, use the Mass_ terms and integrate over the column
mass = data.get_climo_variable(f'Mass_{aerosol}', season)
mass = data.get_climo_variable(f"Mass_{aerosol}", season)
hyai, hybi, ps = data.get_extra_variables_only(
f'Mass_{aerosol}', season, extra_vars=["hyai", "hybi", "PS"]
f"Mass_{aerosol}", season, extra_vars=["hyai", "hybi", "PS"]
)

p0 = 100000.0 # Pa
ps = ps # Pa
pressure_levs = cdutil.vertical.reconstructPressureFromHybrid(ps, hyai, hybi, p0)
ps = ps # Pa
pressure_levs = cdutil.vertical.reconstructPressureFromHybrid(
ps, hyai, hybi, p0
)

#(72,lat,lon)
delta_p = numpy.diff(pressure_levs,axis = 0)
mass_3d = mass*delta_p/9.8 #mass density * mass air kg/m2
burden = numpy.nansum(mass_3d,axis = 0) #kg/m2
# (72,lat,lon)
delta_p = numpy.diff(pressure_levs, axis=0)
mass_3d = mass * delta_p / 9.8 # mass density * mass air kg/m2
burden = numpy.nansum(mass_3d, axis=0) # kg/m2
return burden



def generate_metrics_dic(data, aerosol, season):
metrics_dict = {}
wetdep = data.get_climo_variable(f'{aerosol}_SFWET', season)
drydep = data.get_climo_variable(f'{aerosol}_DDF', season)
srfemis = data.get_climo_variable(f'SF{aerosol}', season)
area = data.get_extra_variables_only(
f'{aerosol}_DDF', season, extra_vars=["area"]
)
wetdep = data.get_climo_variable(f"{aerosol}_SFWET", season)
drydep = data.get_climo_variable(f"{aerosol}_DDF", season)
srfemis = data.get_climo_variable(f"SF{aerosol}", season)
area = data.get_extra_variables_only(f"{aerosol}_DDF", season, extra_vars=["area"])
area_m2 = area * REARTH**2

burden = calc_column_integral(data, aerosol, season)
burden_total= global_integral(burden, area_m2)*1e-9 # kg to Tg
print(f'{aerosol} Burden (Tg): ',f'{burden_total:.3f}')
sink = global_integral((drydep-wetdep),area_m2)*UNITS_CONV
drydep = global_integral(drydep,area_m2)*UNITS_CONV
wetdep = global_integral(wetdep,area_m2)*UNITS_CONV
srfemis = global_integral(srfemis,area_m2)*UNITS_CONV
print(f'{aerosol} Sink (Tg/year): ',f'{sink:.3f}')
print(f'{aerosol} Lifetime (days): ',f'{burden_total/sink*365:.3f}')
burden_total = global_integral(burden, area_m2) * 1e-9 # kg to Tg
print(f"{aerosol} Burden (Tg): ", f"{burden_total:.3f}")
sink = global_integral((drydep - wetdep), area_m2) * UNITS_CONV
drydep = global_integral(drydep, area_m2) * UNITS_CONV
wetdep = global_integral(wetdep, area_m2) * UNITS_CONV
srfemis = global_integral(srfemis, area_m2) * UNITS_CONV
print(f"{aerosol} Sink (Tg/year): ", f"{sink:.3f}")
print(f"{aerosol} Lifetime (days): ", f"{burden_total/sink*365:.3f}")
metrics_dict = {
"Surface Emission (Tg/yr)": f'{srfemis:.3f}',
"Sink (Tg/yr)": f'{sink:.3f}',
"Dry Deposition (Tg/yr)": f'{drydep:.3f}',
"Wet Deposition (Tg/yr)": f'{wetdep:.3f}',
"Burden (Tg)": f'{burden_total:.3f}',
"Lifetime (Days)": f'{burden_total/sink*365:.3f}',
"Surface Emission (Tg/yr)": f"{srfemis:.3f}",
"Sink (Tg/yr)": f"{sink:.3f}",
"Dry Deposition (Tg/yr)": f"{drydep:.3f}",
"Wet Deposition (Tg/yr)": f"{wetdep:.3f}",
"Burden (Tg)": f"{burden_total:.3f}",
"Lifetime (Days)": f"{burden_total/sink*365:.3f}",
}
return metrics_dict


param = CoreParameter()
param.test_name = 'v2.LR.historical_0101'
param.test_name = 'F2010.PD.NGD_v3atm.0096484.compy'
param.test_data_path = '/Users/zhang40/Documents/ACME_simulations/'
param.test_data_path = '/compyfs/mahf708/E3SMv3_dev/F2010.PD.NGD_v3atm.0096484.compy/post/atm/180x360_aave/clim/10yr'
param.test_name = "v2.LR.historical_0101"
param.test_name = "F2010.PD.NGD_v3atm.0096484.compy"
param.test_data_path = "/Users/zhang40/Documents/ACME_simulations/"
param.test_data_path = "/compyfs/mahf708/E3SMv3_dev/F2010.PD.NGD_v3atm.0096484.compy/post/atm/180x360_aave/clim/10yr"
test_data = utils.dataset.Dataset(param, test=True)

#rearth = 6.37122e6 #km
#UNITS_CONV = 86400.0*365.0*1e-9 # kg/s to Tg/yr
REARTH = 6.37122e6 #km
UNITS_CONV = 86400.0*365.0*1e-9 # kg/s to Tg/yr
# rearth = 6.37122e6 #km
# UNITS_CONV = 86400.0*365.0*1e-9 # kg/s to Tg/yr
REARTH = 6.37122e6 # km
UNITS_CONV = 86400.0 * 365.0 * 1e-9 # kg/s to Tg/yr
# TODO:
# Convert so4 unit to TgS
#mwso4 = 115.0
#mws = 32.066
#UNITS_CONV_S = UNITS_CONV/mwso4*mws # kg/s to TgS/yr
# mwso4 = 115.0
# mws = 32.066
# UNITS_CONV_S = UNITS_CONV/mwso4*mws # kg/s to TgS/yr


species = ["bc", "dst", "mom", "ncl","pom","so4","soa"]
SPECIES_NAMES = {"bc": "Black Carbon",
species = ["bc", "dst", "mom", "ncl", "pom", "so4", "soa"]
SPECIES_NAMES = {
"bc": "Black Carbon",
"dst": "Dust",
"mom": "Marine Organic Matter",
"ncl": "Sea Salt",
"pom": "Primary Organic Matter",
"so4": "Sulfate",
"soa": "Secondary Organic Aerosol"}
"soa": "Secondary Organic Aerosol",
}
MISSING_VALUE = 999.999
metrics_dict = {}
metrics_dict_ref = {}

seasons = ["ANN"]

ref_data_path = os.path.join(
e3sm_diags.INSTALL_PATH,
"control_runs",
"aerosol_global_metrics_benchmarks.json",
)
e3sm_diags.INSTALL_PATH,
"control_runs",
"aerosol_global_metrics_benchmarks.json",
)

with open(ref_data_path, 'r') as myfile:
ref_file=myfile.read()
with open(ref_data_path, "r") as myfile:
ref_file = myfile.read()

metrics_ref = json.loads(ref_file)

for season in seasons:
for aerosol in species:
print(f'Aerosol species: {aerosol}')
print(f"Aerosol species: {aerosol}")
metrics_dict[aerosol] = generate_metrics_dic(test_data, aerosol, season)
metrics_dict_ref[aerosol] = metrics_ref[aerosol]
#metrics_dict_ref[aerosol] = {
# metrics_dict_ref[aerosol] = {
# "Surface Emission (Tg/yr)": f'{MISSING_VALUE:.3f}',
# "Sink (Tg/yr)": f'{MISSING_VALUE:.3f}',
# "Dry Deposition (Tg/yr)": f'{MISSING_VALUE:.3f}',
# "Wet Deposition (Tg/yr)": f'{MISSING_VALUE:.3f}',
# "Burden (Tg)": f'{MISSING_VALUE:.3f}',
# "Lifetime (Days)": f'{MISSING_VALUE:.3f}',
# }
with open(f'aerosol_table_{season}.csv', "w") as table_csv:

with open(f"aerosol_table_{season}.csv", "w") as table_csv:
writer = csv.writer(
table_csv,
delimiter=",",
quotechar="'",
quoting=csv.QUOTE_MINIMAL,
lineterminator='\n',
lineterminator="\n",
)
#writer.writerow([" ", "test","ref",])
# writer.writerow([" ", "test","ref",])
for key, values in metrics_dict.items():
writer.writerow([SPECIES_NAMES[key]])
print('key',key, values)
print("key", key, values)
for value in values:
print(value)
line = []
line.append(value)
line.append(values[value])
line.append(metrics_dict_ref[key][value])
print(line, 'line')
print(line, "line")
writer.writerows([line])
writer.writerows([""])
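The burden calculation in `calc_column_integral` above (mass mixing ratio times layer thickness Δp, divided by g, summed over model levels) can be sketched with plain NumPy. The function name and synthetic inputs below are illustrative only, not part of the e3sm_diags API:

```python
import numpy as np

G = 9.8  # gravitational acceleration, m/s^2 (matches the 9.8 used in the diff)


def column_burden(mass_mixing, pressure_levs):
    """Integrate a mass field over the vertical column.

    mass_mixing:   (nlev, nlat, nlon) mass mixing ratio, kg/kg
    pressure_levs: (nlev + 1, nlat, nlon) interface pressures, Pa
    Returns kg/m^2 per column, mirroring the driver's
    `mass * delta_p / 9.8` summed over levels.
    """
    delta_p = np.diff(pressure_levs, axis=0)  # Pa per layer
    mass_3d = mass_mixing * delta_p / G       # kg/m^2 per layer
    return np.nansum(mass_3d, axis=0)         # kg/m^2 per column


# Tiny synthetic check: two 100 hPa layers with a uniform 1e-6 kg/kg
# mixing ratio over a single (lat, lon) column.
p = np.array([[[80000.0]], [[90000.0]], [[100000.0]]])  # (3, 1, 1) interfaces
q = np.full((2, 1, 1), 1e-6)                            # (2, 1, 1) kg/kg
print(column_burden(q, p)[0, 0])  # ≈ 0.00204 kg/m^2 (= 2 * 1e-6 * 1e4 / 9.8)
```

In the real driver the interface pressures come from `cdutil.vertical.reconstructPressureFromHybrid(ps, hyai, hybi, p0)`; the xCDAT migration replaces that reconstruction but keeps the same Δp-weighted sum.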



