
Commit b036014

doc/example/pre-processing updates for v2.5.0 Part1 (#477)
* improve tc track pre-processing script for obs
* update IM release code
* convert IBTrACS track density from 3 hourly data to 6 hourly
* preprocessing to use ibtracs
* docs updates
* add new files
* address review
1 parent 91e1376 commit b036014


4 files changed, +97 −96 lines changed


README.md

Lines changed: 12 additions & 0 deletions
@@ -73,3 +73,15 @@ In addition to default model versus observation comparison, the package also pro
 
 <img src="misc/example_fig7.png" alt="Figure7" style="width: 280px;"/>
 <h5 align="center">Figure 7: An example of Taylor diagram summarizing metrics calculated based on lat-lon contour plots diagnostics of several key variables</h5>
+
+## License
+
+Copyright (c) 2018-2021, Energy Exascale Earth System Model Project
+All rights reserved
+
+SPDX-License-Identifier: (BSD-3-Clause)
+
+See [LICENSE](./LICENSE) for details
+
+Unlimited Open Source - BSD 3-clause Distribution
+`LLNL-CODE-819717`

acme_diags/plot/cartopy/tc_analysis_plot.py

Lines changed: 6 additions & 4 deletions
@@ -24,22 +24,25 @@
 border = (-0.06, -0.03, 0.13, 0.03)
 
 plot_info = {}
-# Each key gives a list with ax extent, x ticks , y ticks, title, clevs, reference
+# Each key gives a list with ax extent, x ticks, y ticks, title, clevs, reference, and a time-resolution ratio (to convert 3-hourly to 6-hourly data, the density needs to be divided by 2)
+# TODO: make this flexible enough to apply to 3-hourly model output when comparing track density.
 plot_info["aew"] = [
     [182, 359, 0, 35],
     [240, 300],
     [0, 15, 30],
     "African Easterly Wave Density",
     np.arange(0, 15.1, 1),
     "EAR5 (2000-2014)",
+    1,
 ]
 plot_info["cyclone"] = [
     [0, 359, -60, 60],
     [0, 60, 120, 180, 240, 300, 359.99],
     [-60, -30, 0, 30, 60],
     "TC Tracks Density",
-    np.arange(0, 1, 0.2),
+    np.arange(0, 0.3, 0.05),
     "IBTrACS (1979-2018)",
+    2,
 ]
 
 
@@ -61,12 +64,11 @@ def plot_panel(n, fig, proj, var, var_num_years, region, title):
     ax = fig.add_axes(panel[n], projection=proj)
     ax.set_extent(plot_info[region][0], ccrs.PlateCarree())
 
-    clevs = np.arange(0, 15.1, 1)
     clevs = plot_info[region][4]
     p1 = ax.contourf(
         var.getLongitude(),
         var.getLatitude(),
-        var / var_num_years,
+        var / var_num_years / plot_info[region][6],
         transform=ccrs.PlateCarree(),
         levels=clevs,
         extend="both",
Lines changed: 66 additions & 90 deletions
@@ -1,107 +1,79 @@
 #!/usr/bin/env python3
 # -*- coding: utf-8 -*-
 """
-Created on Wed Nov 20 11:06:41 2019
+The observational TC data are obtained from Prof. Kerry Emanuel’s website (https://emanuel.mit.edu/products). Click “Global Tropical Cyclone Data in NETCDF format (updated through 2018)” and the files will be downloaded. The TC locations are tracked 6-hourly.
 
-@author: bala635
-The observational TC data are obtained from Prof. Kerry Emanuel’s website (https://emanuel.mit.edu/products). Click “Global Tropical Cyclone Data in NETCDF format (updated through 2018)” the files will be downloaded.
+Another data source is IBTrACS from NOAA, which has 3-hourly track data.
+
+Note: the TC density maps created from both data sources have identical distributions, but the values from IBTrACS are twice those from MIT due to the higher time frequency. For now the TC analysis uses 6-hourly E3SM output, therefore the MIT data is used for consistency.
 """
 
+import os
 import sys
 import warnings
+from datetime import datetime, timedelta
 
 if not sys.warnoptions:
     warnings.simplefilter("ignore")
 import numpy as np
 from netCDF4 import Dataset as netcdffile
 
-###################################################
-###################################################
-
 all_lon = []
 all_lat = []
 
 ###################################################
-
-nc = netcdffile('./attracks.nc')
-latmc = np.squeeze(nc['latmc'][:,:])
-longmc = np.squeeze(nc['longmc'][:,:])
-vsmc = np.squeeze(nc['vsmc'][:,:])
-yearic = np.squeeze(nc['yearic'][:])
-
-for i in range(0,np.shape(latmc)[0]):
-    for j in range(0,np.shape(latmc)[1]):
-
-        if yearic[j] > 1980 and np.abs(latmc[i,j]) > 0 and np.abs(longmc[i,j]) > 25:
-
-            all_lon.append(longmc[i,j])
-            all_lat.append(latmc[i,j])
-
-###################################################
-
-nc = netcdffile('./eptracks.nc')
-latmc = np.squeeze(nc['latmc'][:,:])
-longmc = np.squeeze(nc['longmc'][:,:])
-vsmc = np.squeeze(nc['vsmc'][:,:])
-yearic = np.squeeze(nc['yearic'][:])
-
-for i in range(0,np.shape(latmc)[0]):
-    for j in range(0,np.shape(latmc)[1]):
-
-        if yearic[j] > 1980 and np.abs(latmc[i,j]) > 0 and np.abs(longmc[i,j]) > 25:
-
-            all_lon.append(longmc[i,j])
-            all_lat.append(latmc[i,j])
-
-###################################################
-
-nc = netcdffile('./wptracks.nc')
-latmc = np.squeeze(nc['latmc'][:,:])
-longmc = np.squeeze(nc['longmc'][:,:])
-vsmc = np.squeeze(nc['vsmc'][:,:])
-yearic = np.squeeze(nc['yearic'][:])
-
-for i in range(0,np.shape(latmc)[0]):
-    for j in range(0,np.shape(latmc)[1]):
-
-        if yearic[j] > 1980 and np.abs(latmc[i,j]) > 0 and np.abs(longmc[i,j]) > 25:
-
-            all_lon.append(longmc[i,j])
-            all_lat.append(latmc[i,j])
-
-###################################################
-
-nc = netcdffile('./iotracks.nc')
-latmc = np.squeeze(nc['latmc'][:,:])
-longmc = np.squeeze(nc['longmc'][:,:])
-vsmc = np.squeeze(nc['vsmc'][:,:])
-yearic = np.squeeze(nc['yearic'][:])
-
-for i in range(0,np.shape(latmc)[0]):
-    for j in range(0,np.shape(latmc)[1]):
-
-        if yearic[j] > 1980 and np.abs(latmc[i,j]) > 0 and np.abs(longmc[i,j]) > 25:
-
-            all_lon.append(longmc[i,j])
-            all_lat.append(latmc[i,j])
-
-###################################################
-
-nc = netcdffile('./shtracks.nc')
-latmc = np.squeeze(nc['latmc'][:,:])
-longmc = np.squeeze(nc['longmc'][:,:])
-vsmc = np.squeeze(nc['vsmc'][:,:])
-yearic = np.squeeze(nc['yearic'][:])
-
-for i in range(0,np.shape(latmc)[0]):
-    for j in range(0,np.shape(latmc)[1]):
-
-        if yearic[j] > 1980 and np.abs(latmc[i,j]) > 0 and np.abs(longmc[i,j]) > 25:
-
-            all_lon.append(longmc[i,j])
-            all_lat.append(latmc[i,j])
-
-###################################################
+origin_path = '/Users/zhang40/Documents/ACME/e3sm_tc_diags'
+start_yr = 1979
+end_yr = 2018
+
+ibtracs = False  # MIT data
+ibtracs = True
+
+
+if ibtracs:
+    data_name = 'IBTrACS'
+    print('Use IBTrACS 3hourly')
+    basins = ["NA", "WP", "EP", "NI", "SI", "SP"]
+    for basin in basins:
+        nc = netcdffile(
+            os.path.join(origin_path, "IBTrACS.{}.v04r00.nc".format(basin))
+        )
+        latmc = np.squeeze(nc["lat"][:, :])
+        longmc = np.squeeze(nc["lon"][:, :])
+        time = np.squeeze(nc["time"][:, :])
+        yearic = np.zeros((np.shape(latmc)[0]))
+
+        for i in range(0, np.shape(time)[0]):
+            for j in range(0, np.shape(time)[1]):
+                if time[i, j] is not np.ma.masked:
+                    day_hurr = datetime(1858, 11, 17, 0, 0, 0) + timedelta(time[i, j])
+                    yearic[i] = day_hurr.year
+                    if yearic[i] >= start_yr and yearic[i] <= end_yr and np.abs(latmc[i, j]) > 0 and np.abs(longmc[i, j]) > 0:
+                        if longmc[i, j] < 0:
+                            longmc[i, j] = 360 + longmc[i, j]
+                        all_lat.append(latmc[i, j])
+                        all_lon.append(longmc[i, j])
+    print(len(all_lat))
+
+
+else:
+    print('MIT 6hrly')
+    data_name = 'MIT'
+    basins = ['at', 'ep', 'wp', 'io', 'sh']
+    for basin in basins:
+        nc = netcdffile('{}/{}tracks.nc'.format(origin_path, basin))
+        latmc = np.squeeze(nc['latmc'][:,:])
+        longmc = np.squeeze(nc['longmc'][:,:])
+        vsmc = np.squeeze(nc['vsmc'][:,:])
+        yearic = np.squeeze(nc['yearic'][:])
+
+        for i in range(0,np.shape(latmc)[0]):
+            for j in range(0,np.shape(latmc)[1]):
+
+                if yearic[j] >= start_yr and yearic[j] <= end_yr and np.abs(latmc[i,j]) > 0 and np.abs(longmc[i,j]) > 0:
+                    all_lon.append(longmc[i,j])
+                    all_lat.append(latmc[i,j])
+    print(len(all_lon))
 
 all_lat = np.asarray(all_lat)
 all_lon = np.asarray(all_lon)
@@ -111,13 +83,17 @@
 
 import pandas as pd
 
-raw_data = {'lon': all_lon_new,
+raw_data = {
+    'lon': all_lon_new,
     'lat': all_lat_new}
 
 df = pd.DataFrame.from_dict(raw_data)
-# columns = ['SST', 'STRAT','TCHP','TDY_SALT','TDY_NOSALT','INT','LAT','LON','SHR','DIV','PER','DELTA_INT']
-df.to_csv('cyclones_all_obs.csv', sep='\t')
+print(data_name)
+out_file = '/Users/zhang40/Documents/ACME/e3sm_tc_diags/cyclones_hist_{}_{}_{}.csv'.format(data_name, start_yr, end_yr)
+with open(out_file, 'w') as file:
+    file.write('start {}\n'.format(len(all_lon)))
+    df.to_csv(file, header=False, sep='\t')
 
 ###################################################
 # Convert the .csv file to .nc by calling tempest-extremes from the command line:
-#tempestextremes/bin/HistogramNodes --in cyclones_all_obs.csv --iloncol 3 --ilatcol 4 --out cyclones_all_obs.nc
+#tempestextremes/bin/HistogramNodes --in cyclones_hist_IBTrACS_1979_2018.csv --iloncol 2 --ilatcol 3 --out cyclones_hist_IBTrACS_1979_2018.nc
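
The IBTrACS `time` variable read above stores fractional days since 1858-11-17 00:00 UTC, which is why the script adds a `timedelta` to `datetime(1858, 11, 17, 0, 0, 0)` to recover each track point's calendar year, and the output it writes is a tab-separated table headed by a `start N` line for tempest-extremes' HistogramNodes. A small self-contained sketch of those two steps follows; the `ibtracs_year` helper, the example time value, the sample lon/lat values, and the output file name are illustrative only, not part of the commit.

```python
from datetime import datetime, timedelta

import pandas as pd

# IBTrACS time units: days since 1858-11-17 00:00 UTC (as used in the script above)
IBTRACS_EPOCH = datetime(1858, 11, 17)


def ibtracs_year(days):
    """Convert an IBTrACS time value (fractional days since the epoch) to a calendar year."""
    return (IBTRACS_EPOCH + timedelta(days=days)).year


print(ibtracs_year(58484.0))  # 2019 (58484 days after the epoch is 2019-01-01)

# Write a track-point table in the same layout as the script above: a "start N"
# header line followed by tab-separated rows, matching the column layout expected
# by the HistogramNodes command shown in the comment above.
df = pd.DataFrame({"lon": [330.5, 331.0], "lat": [15.2, 15.8]})  # illustrative values
with open("cyclones_hist_example.csv", "w") as f:
    f.write("start {}\n".format(len(df)))
    df.to_csv(f, header=False, sep="\t")
```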

docs/source/index.rst

Lines changed: 13 additions & 2 deletions
@@ -64,7 +64,7 @@ Algorithm and visualization codes for **latitude-longitude contour maps**,
 **polar contour maps**, the accompanying **summarizing table** and **Taylor diagram plots**, **pressure-latitude zonal mean contour plots**,
 **zonal mean line plots**, **pressure-longitude meridional mean contour plots**, **area mean time series plots**, and **Cloud Top Height-Tau** joint histograms
 from COSP cloud simulator output. Plots can be created for annual
-and seasonal climatologies, and monthly mean time series.
+and seasonal climatologies, and monthly mean time series. In addition to the core sets released in v1, **ENSO diags**, **QBO diags**, **Diurnal cycle phase plot**, **Streamflow evaluation**, **ARM diags**, and **TC analysis** are implemented in the v2 release.
 
 The package also supports custom user diagnostics, by specifying
 plot type, desired region (global, ocean, land, etc.),
@@ -139,10 +139,21 @@ Additional back-ends could be implemented if the need arose.
 |                                                        |                                                      |
 | ARM diagnostics monthly diurnal cycle of cloud plot    | ARM diagnostics convection onset statistics plot     |
 +--------------------------------------------------------+------------------------------------------------------+
+| .. figure:: _static/index/fig21.png                    | .. figure:: _static/index/fig22.png                  |
+|    :align: center                                      |    :align: center                                    |
+|    :target: _static/index/fig21.png                    |    :target: _static/index/fig22.png                  |
+|                                                        |                                                      |
+| Tropical Cyclone Track Density                         | Annual Cycle Zonal Mean plot                         |
++--------------------------------------------------------+------------------------------------------------------+
+| .. figure:: _static/index/fig23.png                    | .. figure:: _static/index/fig24.png                  |
+|    :align: center                                      |    :align: center                                    |
+|    :target: _static/index/fig23.png                    |    :target: _static/index/fig24.png                  |
+|                                                        |                                                      |
+| Tropical Cyclone frequency per basin                   | Per-basin Tropical Cyclone fraction seasonal cycle   |
++--------------------------------------------------------+------------------------------------------------------+
 
 The above plots and more can be found
 `here <https://portal.nersc.gov/cfs/e3sm/zhang40/tutorials/run_v230_allsets/viewer/>`_.
-ARM diagnostics plots can be found `here <https://web.lcrc.anl.gov/public/e3sm/e3sm_diags_test_data/unit_test_complete_run/expected/all_sets/viewer/arm_diags/>`_.
 
 Feature availability for each backend
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
