From 25b544ed6b75dd27ef0faa518dcfe740da544fe1 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Wed, 4 Jun 2025 16:05:54 -0600 Subject: [PATCH 01/97] Docs UG Windows: Add NCAR machine instructions. --- .../building-docs-prereqs-windows.md | 12 +++++++++++- 1 file changed, 11 insertions(+), 1 deletion(-) diff --git a/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md b/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md index ab972cdfc4..5781661dc2 100644 --- a/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md +++ b/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md @@ -8,7 +8,17 @@ Note that you may need administrator privileges on your PC (or approval from you ## Install Linux subsystem -We don't support building our documentation in the native Windows command-line environment. Thus, you will need to install a little version of Linux inside a virtual machine (VM) to use instead. +We don't support building our documentation in the native Windows command-line environment. Thus, you will need to install a little version of Linux inside a virtual machine (VM) to use instead. The process for doing this varies depending on how tightly the installation process is controlled on your computer. + +### NCAR computers + +Please follow the [Windows Subsystem for Linux (WSL) setup instructions](https://wiki.ucar.edu/pages/viewpage.action?pageId=514032264&spaceKey=CONFIGMGMT&title=Setup) on the UCAR Wiki. In the step about installing a Linux distribution, choose Ubuntu. + +Feel free to peruse the [overall WSL documentation](https://wiki.ucar.edu/spaces/CONFIGMGMT/pages/514032242/Windows+Subsystem+for+Linux) on the UCAR Wiki, and the pages linked from it, for additional information.
+ +### Non-NCAR computers + +If your computer is managed by an organization other than NCAR, please check with your IT department or equivalent for instructions on installing Windows Subsystem for Linux (WSL) and Ubuntu. Otherwise, follow these instructions: 1. Download and install Ubuntu from the Microsoft Store. 1. Restart your computer. From 040301404f390031085e96105bb5078937f87295 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Wed, 4 Jun 2025 16:06:14 -0600 Subject: [PATCH 02/97] Docs UG Windows: Mention username/password step. --- .../working-with-documentation/building-docs-prereqs-windows.md | 2 ++ 1 file changed, 2 insertions(+) diff --git a/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md b/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md index 5781661dc2..29393eb033 100644 --- a/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md +++ b/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md @@ -26,6 +26,8 @@ If your computer is managed by an organization other than NCAR, please check wit If Ubuntu opens in that last step but you see an error, you may need to manually enable Windows Subsystem for Linux (WSL). To do so: Open Control Panel, go to "Programs" > "Programs and Features" > "Turn Windows features on or off". Check the box next to "Windows Subsystem for Linux" and click OK. +Once Ubuntu is working and open, you'll be asked to create a new UNIX username and password. This doesn't have to match your Windows username and password, but do make sure to save this information somewhere secure. + .. _windows-docs-ubuntu-utilities: ## Install utilities From 3b56e36ac6a007098adbbe9739fb3b1797fcb732 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Wed, 4 Jun 2025 16:11:48 -0600 Subject: [PATCH 03/97] Docs UG Windows: Recommend wslview instead of chromium. Opens in a Windows browser, avoiding the annoying warning messages. 
--- .../building-docs-prereqs-windows.md | 5 ++--- .../docs-intro-and-recommended.md | 10 +++++++--- 2 files changed, 9 insertions(+), 6 deletions(-) diff --git a/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md b/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md index 29393eb033..0bf7e6d2eb 100644 --- a/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md +++ b/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md @@ -43,9 +43,8 @@ which make || sudo apt-get -y install make which git || sudo apt-get -y install git which git-lfs || sudo apt-get -y install git-lfs -# Chromium: A web browser engine that's the basis for popular browsers like Google -# Chrome and Microsoft Edge -which chromium || sudo apt-get -y install chromium +# WSL utilities, which will give us the wslview command for opening HTML pages in a Windows browser +which wslview || sudo apt-get -y install wslu ``` .. _container-or-conda-windows: diff --git a/doc/source/users_guide/working-with-documentation/docs-intro-and-recommended.md b/doc/source/users_guide/working-with-documentation/docs-intro-and-recommended.md index 1501f8d48a..bfc537f223 100644 --- a/doc/source/users_guide/working-with-documentation/docs-intro-and-recommended.md +++ b/doc/source/users_guide/working-with-documentation/docs-intro-and-recommended.md @@ -55,8 +55,12 @@ open _build/html/index.html ### Windows (Ubuntu VM) -Assuming you installed Chromium in the :ref:`windows-docs-ubuntu-utilities` setup step, you can open your build of the documentation like so: +Assuming you installed the WSL Utilities in the :ref:`windows-docs-ubuntu-utilities` setup step, you can open your build of the documentation like so: ```shell -chromium _build/html/index.html & +wslview _build/html/index.html ``` -This will generate a lot of warnings in the terminal that seem to be inconsequential to our purpose here. 
You may need to press Ctrl-C and/or Enter a few times to clear them and return your cursor to the prompt. +If you didn't, you can do +```shell +explorer.exe $(wslpath -w _build/html/index.html) +``` +These both do the same thing, but the `wslview` method is simpler. Either way, at least the first time you do this, it will open a window asking which app you'd like to view the HTML file in. Choose a browser like Microsoft Edge or Chrome. At the bottom of the window, you can then choose whether you always want to open HTML files using the selected app or just this once. From 4238ee0dc5020a3f749270635fdc83c6359504d7 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Wed, 4 Jun 2025 16:12:18 -0600 Subject: [PATCH 04/97] Docs UG Windows: Use Docker instead of Podman. Resolves ESCOMP/CTSM#3185. --- .../building-docs-prereqs-windows.md | 18 ++++++++++++++---- 1 file changed, 14 insertions(+), 4 deletions(-) diff --git a/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md b/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md index 0bf7e6d2eb..f61b43592a 100644 --- a/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md +++ b/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md @@ -53,11 +53,21 @@ which wslview || sudo apt-get -y install wslu We recommend building the software in what's called a container—basically a tiny little operating system with just some apps and utilities needed by the doc-building process. This is nice because, if we change the doc-building process in ways that require new versions of those apps and utilities, that will be completely invisible to you. You won't need to manually do anything to update your setup to work with the new process; it'll just happen automatically. -We recommend using the container software Podman. +For builds in WSL (Ubuntu), we recommend using the container software Docker. 
You can install it in Ubuntu like so: -1. Install Podman with `sudo apt-get -y install podman`. -1. Set up and start a Podman "virtual machine" with `podman machine init --now`. -1. Test your installation by doing `podman run --rm hello-world`. If it worked, you should see ASCII art of the Podman logo. +```shell +# If needed, download and run the Docker installation script. +# Ignore the message saying "We recommend using Docker Desktop for Windows." The script will make you wait 20 seconds to make sure this is what you want, and then it should continue automatically. +which docker || curl -fsSL https://get.docker.com -o get-docker.sh && sudo sh ./get-docker.sh + +# Set up the docker "group," if needed, and add your username to it. +sudo groupadd docker # Create docker group if it doesn't exist +sudo usermod -aG docker $USER # Add your user to the docker group +newgrp docker # Apply the new group membership (avoids needing to log out and back in) + +# Make sure it worked: This should print a "Hello from Docker!" message +docker run hello-world +``` You may not be able to install Podman or any other containerization software, so there is an alternative method: a Conda environment. From dcedb18894bfa8a845c4eb69b4015889287155ad Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Wed, 4 Jun 2025 16:13:39 -0600 Subject: [PATCH 05/97] Docs UG Windows: Move chown instructions. I didn't run into a need for this, so put it in a new Troubleshooting section.
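The `usermod -aG docker` and `newgrp docker` steps in the Docker patch above can be sanity-checked by looking for `docker` in your group list. A minimal sketch; the `in_docker_group` helper name is hypothetical and not part of this patch series:

```shell
# Hypothetical helper: report whether a space-separated group list contains
# "docker" as a whole word (-w, so e.g. "dockerd" would not count).
in_docker_group() {
    echo "$1" | grep -qw docker
}

# Against the live system you would pass "$(id -nG "$USER")" instead.
in_docker_group "adm sudo docker" && echo "docker group OK"
in_docker_group "adm sudo plugdev" || echo "not in docker group yet"
```
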
--- .../building-docs-prereqs-windows.md | 16 +++++++++------- 1 file changed, 9 insertions(+), 7 deletions(-) diff --git a/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md b/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md index f61b43592a..219092cc93 100644 --- a/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md +++ b/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md @@ -74,13 +74,6 @@ You may not be able to install Podman or any other containerization software, so 1. Check whether you already have Conda installed by doing `which conda`. If that doesn't print anything, [install Miniconda](https://www.anaconda.com/docs/getting-started/miniconda/install#linux). 1. Follow the instructions for setting up the `ctsm_pylib` Conda environment in Sect. :numref:`using-ctsm-pylib`. - -## Set up your permissions -This will make sure that you "own" your home directory in the Ubuntu VM. **In your Ubuntu terminal**, do: -```shell -chown -R $USER:$USER $HOME -``` - .. _editing-text-files-wsl: ## Editing text files in an Ubuntu VM @@ -102,3 +95,12 @@ You can also install a user-friendly text editor in Ubuntu. This may be slower a - [VS Code](https://code.visualstudio.com/) (if you don't already have it installed on Windows): `sudo snap install code --classic` You can use all of those to open and edit files, but Kate and VS Code let you open entire folders, which can be convenient. In any case, you'd do `EDITOR_NAME path/to/thing/youre/editing` to open it, where `EDITOR_NAME` is `gedit`, `kate`, or `code`, respectively. + +## Troubleshooting + +### "Permission denied" error + +If you get this error, you may need to remind Linux that you do actually own your files. 
**In your Ubuntu terminal**, do: +```shell +chown -R $USER:$USER $HOME +``` \ No newline at end of file From f1673436f898938f0aced8ea8030a05aabc8c26a Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Wed, 4 Jun 2025 16:16:07 -0600 Subject: [PATCH 06/97] Docs UG Windows: Change a Podman ref to Docker. --- .../working-with-documentation/building-docs-prereqs-windows.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md b/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md index 219092cc93..06b815c8db 100644 --- a/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md +++ b/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md @@ -69,7 +69,7 @@ newgrp docker # Apply the new group membership (avoids needing to log out and b docker run hello-world ``` -You may not be able to install Podman or any other containerization software, so there is an alternative method: a Conda environment. +You may not be able to install Docker or any other containerization software, so there is an alternative method: a Conda environment. 1. Check whether you already have Conda installed by doing `which conda`. If that doesn't print anything, [install Miniconda](https://www.anaconda.com/docs/getting-started/miniconda/install#linux). 1. Follow the instructions for setting up the `ctsm_pylib` Conda environment in Sect. :numref:`using-ctsm-pylib`. From b16bb421a760527a0438ed9bed0b0f15448fe7f4 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Wed, 4 Jun 2025 16:19:18 -0600 Subject: [PATCH 07/97] Docs UG Windows: Suggest not running Ubuntu as admin. 
--- .../building-docs-prereqs-windows.md | 6 ++++-- 1 file changed, 4 insertions(+), 2 deletions(-) diff --git a/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md b/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md index 06b815c8db..71085bc6cb 100644 --- a/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md +++ b/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md @@ -100,7 +100,9 @@ You can use all of those to open and edit files, but Kate and VS Code let you op ### "Permission denied" error -If you get this error, you may need to remind Linux that you do actually own your files. **In your Ubuntu terminal**, do: +If you get this error, it may be a result of opening Ubuntu as an administrator (e.g., by right-clicking on its icon and choosing "Run as administrator"). Try opening it normally instead, although note that you will then need to get a new copy of CTSM to work in. + +If that's not feasible or doesn't solve the problem, you may need to remind Linux that you do actually own your files. **In your Ubuntu terminal**, do: ```shell chown -R $USER:$USER $HOME -``` \ No newline at end of file +``` From c59268d222a0e690620fe5f5ebe5f855dc796437 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Wed, 4 Jun 2025 16:28:34 -0600 Subject: [PATCH 08/97] Docs UG Windows: Syntax fix (?) in Docker instructions.
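The question mark in this subject line can be answered with a yes: `||` and `&&` have equal precedence in the shell and group left to right, so `which docker || curl ... && sudo sh ./get-docker.sh` runs the installer even when `docker` already exists. A minimal demonstration, with `true` and `echo` standing in for the real commands:

```shell
# "a || b && c" parses as "(a || b) && c", not "a || (b && c)".
# Here `true` plays the role of `which docker` succeeding:
true || echo "download get-docker.sh" && echo "run installer"
# The line above still prints "run installer", because (true || ...) succeeded.

# Splitting the guard onto two lines, as this patch does, skips both steps
# when the command is already installed:
true || echo "download get-docker.sh"
true || echo "run installer"
# (prints nothing)
```
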
--- .../building-docs-prereqs-windows.md | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md b/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md index 71085bc6cb..e7f754d006 100644 --- a/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md +++ b/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md @@ -58,7 +58,8 @@ ```shell # If needed, download and run the Docker installation script. # Ignore the message saying "We recommend using Docker Desktop for Windows." The script will make you wait 20 seconds to make sure this is what you want, and then it should continue automatically. -which docker || curl -fsSL https://get.docker.com -o get-docker.sh && sudo sh ./get-docker.sh +which docker || curl -fsSL https://get.docker.com -o get-docker.sh +which docker || sudo sh ./get-docker.sh # Set up the docker "group," if needed, and add your username to it. sudo groupadd docker # Create docker group if it doesn't exist From dc92b3eeb94f20092f3bf69402de3746184488fc Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Wed, 4 Jun 2025 16:30:39 -0600 Subject: [PATCH 09/97] Docs UG multiple versions: Linux Docker works.
--- .../building-docs-multiple-versions.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/doc/source/users_guide/working-with-documentation/building-docs-multiple-versions.rst b/doc/source/users_guide/working-with-documentation/building-docs-multiple-versions.rst index a127103d0e..96803d8254 100644 --- a/doc/source/users_guide/working-with-documentation/building-docs-multiple-versions.rst +++ b/doc/source/users_guide/working-with-documentation/building-docs-multiple-versions.rst @@ -12,7 +12,7 @@ Note that this is not necessary in order for you to contribute an update to the :end-before: VERSION LINKS WILL NOT RESOLVE :append: open _publish/index.html -**Note:** This is not yet supported with Podman on Linux (including Ubuntu VM on Windows). See `doc-builder Issue #27: build_docs_to_publish fails on Linux (maybe just Ubuntu?) with Podman `_. +**Note:** This is not yet supported with Podman on Linux (including Ubuntu VM on Windows). See `doc-builder Issue #27: build_docs_to_publish fails on Linux (maybe just Ubuntu?) with Podman `_. It does work with Docker on Linux, though. How this works From 6a1a554dc4d001656c0d4e2fa2ae919909532f6f Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Wed, 4 Jun 2025 16:31:23 -0600 Subject: [PATCH 10/97] Docs UG Windows: Use wslview or explorer.exe to open files for edit. --- .../building-docs-prereqs-windows.md | 21 +++++++++++++++---- 1 file changed, 17 insertions(+), 4 deletions(-) diff --git a/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md b/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md index e7f754d006..8ac75e2025 100644 --- a/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md +++ b/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md @@ -77,12 +77,23 @@ You may not be able to install Docker or any other containerization software, so .. 
_editing-text-files-wsl: -## Editing text files in an Ubuntu VM -If you prefer using an old-school text editor like `vim`, it's probably already installed, or can be installed with `sudo apt-get -y install EDITOR_NAME`. If you prefer a more user-friendly interface, there are several options. +## Editing documentation files +If you prefer using an old-school text editor like `vim`, it's probably already installed in your Ubuntu VM, or can be installed with `sudo apt-get -y install EDITOR_NAME`. If you prefer a more user-friendly interface, there are several options. Note that **all commands in this section are to be run in your Ubuntu VM, not a Windows terminal**. -You may be able to edit files in your Ubuntu VM in the Ubuntu terminal by using the name of the Windows executable. For Notepad, for instance, you would do +### In a Windows app (recommended) +If you installed `wslview` in the instructions above, you can edit files by doing ```shell -notepad.exe file_i_want_to_edit.rst +wslview path/to/file_i_want_to_edit.rst +``` +If not, you can do +```shell +explorer.exe $(wslpath -w path/to/file_i_want_to_edit.rst) +``` +These both do the same thing, but the `wslview` method is simpler. Either way, at least the first time you do this, it will open a window asking which app you'd like to open the file in. Choose whatever you're most comfortable with. At the bottom of the window, you can then choose whether you always want to open files of this type using the selected app or just this once. + +You may also be able to edit files in your Ubuntu VM in the Ubuntu terminal by using the name of the Windows executable. For Notepad, for instance, you would do +```shell +notepad.exe $(wslpath -w path/to/file_i_want_to_edit.rst) ``` If you use [VS Code](https://code.visualstudio.com/), you can install the [WSL VS Code extension](https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.remote-wsl).
Then you can open any file or folder in your Ubuntu VM by doing ```shell code path/to/file-or-folder ``` +### In an Ubuntu app (not recommended) + You can also install a user-friendly text editor in Ubuntu. This may be slower and have unexpected differences in behavior from what you expect from Windows apps, but it does work. For example: - [gedit](https://gedit-text-editor.org/): `sudo apt-get install -y gedit` - [Kate](https://kate-editor.org/): `sudo apt-get install -y kate` From a049245cffb212f9031a0aaf1215ae19f054d453 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Wed, 4 Jun 2025 16:34:25 -0600 Subject: [PATCH 11/97] Docs UG Windows: Add some line breaks in code block. --- .../building-docs-prereqs-windows.md | 4 +++- 1 file changed, 3 insertions(+), 1 deletion(-) diff --git a/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md b/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md index 8ac75e2025..e615397b91 100644 --- a/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md +++ b/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md @@ -57,7 +57,9 @@ For builds in WSL (Ubuntu), we recommend using the container software Docker. Yo ```shell # If needed, download and run the Docker installation script. -# Ignore the message saying "We recommend using Docker Desktop for Windows." The script will make you wait 20 seconds to make sure this is what you want, and then it should continue automatically. +# Ignore the message saying "We recommend using Docker Desktop for Windows." +# The script will make you wait 20 seconds to make sure this is what you want, +# and then it should continue automatically.
which docker || curl -fsSL https://get.docker.com -o get-docker.sh which docker || sudo sh ./get-docker.sh From 209509e69db90d8dc523930c5c18b78ab0203986 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Wed, 4 Jun 2025 16:45:23 -0600 Subject: [PATCH 12/97] Docs UG Windows: Add troubleshooting subsection. "The host 'wsl$' was not found in the list of allowed hosts" --- .../building-docs-prereqs-windows.md | 4 ++++ 1 file changed, 4 insertions(+) diff --git a/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md b/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md index e615397b91..582543f3cb 100644 --- a/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md +++ b/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md @@ -122,3 +122,7 @@ If that's not feasible or doesn't solve the problem, you may need to remind Linu ```shell chown -R $USER:$USER $HOME ``` + +### "The host 'wsl$' was not found in the list of allowed hosts" + +You may see this warning in a dialog box after trying to open a file with `wslview`, `explorer.exe`, or something else. Check "Permanently allow host 'wsl$'" and then press "Allow". From 0f6b805b4fd1b56da1ef92e27dbc3e252eadb686 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Wed, 4 Jun 2025 16:45:35 -0600 Subject: [PATCH 13/97] Docs UG Windows: Rephrase. 
--- .../building-docs-prereqs-windows.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md b/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md index 582543f3cb..aa4b537e58 100644 --- a/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md +++ b/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md @@ -93,12 +93,12 @@ explorer.exe $(wslpath -w path/to/file_i_want_to_edit.rst) ``` These both do the same thing, but the `wslview` method is simpler. Either way, at least the first time you do this, it will open a window asking which app you'd like to open the file in. Choose whatever you're most comfortable with. At the bottom of the window, you can then choose whether you always want to open files of this type using the selected app or just this once. -You may also be able to edit files in your Ubuntu VM in the Ubuntu terminal by using the name of the Windows executable. For Notepad, for instance, you would do +You may also be able to open files in Windows apps by using the name of the Windows executable. For Notepad, for instance, you would do ```shell notepad.exe $(wslpath -w path/to/file_i_want_to_edit.rst) ``` -If you use [VS Code](https://code.visualstudio.com/), you can install the [WSL VS Code extension](https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.remote-wsl). Then you can open any file or folder in your Ubuntu VM by doing +If you use [VS Code](https://code.visualstudio.com/), you can install the [WSL VS Code extension](https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.remote-wsl).
Then (after closing and re-opening Ubuntu) you can open any documentation file **or folder** by doing ```shell code path/to/file-or-folder ``` From 8c30d1550c8f3c148a9b9722954261fa66f8a384 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Thu, 5 Jun 2025 14:29:24 -0600 Subject: [PATCH 14/97] docs: Move testing.sh to its own dir. --- doc/{ => testing}/testing.sh | 0 1 file changed, 0 insertions(+), 0 deletions(-) rename doc/{ => testing}/testing.sh (100%) diff --git a/doc/testing.sh b/doc/testing/testing.sh similarity index 100% rename from doc/testing.sh rename to doc/testing/testing.sh From c2600a1d6e9eb112ebbe372ab2cb60c368c6efe5 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Thu, 5 Jun 2025 14:32:29 -0600 Subject: [PATCH 15/97] testing.sh: cd to script dir. --- doc/testing/testing.sh | 3 +++ 1 file changed, 3 insertions(+) diff --git a/doc/testing/testing.sh b/doc/testing/testing.sh index a44b6ad18a..83c9037ecc 100755 --- a/doc/testing/testing.sh +++ b/doc/testing/testing.sh @@ -2,6 +2,9 @@ set -e set -x +SCRIPT_DIR="$( cd -- "$( dirname -- "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )" +cd "${SCRIPT_DIR}" + rm -rf _publish* # Build all docs using container From 244da8330f643c9363cbcdd55a506e8f7158592e Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Thu, 5 Jun 2025 15:17:24 -0600 Subject: [PATCH 16/97] Break up testing.sh into subscripts. 
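Each of the new subscripts repeats the `SCRIPT_DIR` idiom introduced in testing.sh just above: `dirname -- "${BASH_SOURCE[0]}"` extracts the directory part of the script's path, and `cd` plus `pwd` normalize it to an absolute path, so the script behaves the same no matter which directory it is invoked from. A standalone sketch on a plain path string (the `/usr/bin/../bin/env` path is just an example, not anything from the patch series):

```shell
# dirname strips the filename; cd + pwd resolve ".." and yield an absolute path.
path="/usr/bin/../bin/env"
dir="$( cd -- "$( dirname -- "$path" )" > /dev/null 2>&1 && pwd )"
echo "$dir"   # prints "/usr/bin"
```

In the real scripts, `"$path"` is `"${BASH_SOURCE[0]}"`, which works even when the script is sourced rather than executed.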
--- .../building-docs-multiple-versions.rst | 2 +- doc/testing/compose_test_cmd.sh | 13 +++++ doc/testing/test_build_docs_-b.sh | 16 ++++++ doc/testing/test_build_docs_-r-v.sh | 16 ++++++ doc/testing/test_container_eq_ctsm_pylib.sh | 35 ++++++++++++ doc/testing/test_doc-builder_tests.sh | 12 ++++ doc/testing/test_makefile_method.sh | 13 +++++ doc/testing/testing.sh | 56 +++++-------------- 8 files changed, 121 insertions(+), 42 deletions(-) create mode 100755 doc/testing/compose_test_cmd.sh create mode 100755 doc/testing/test_build_docs_-b.sh create mode 100755 doc/testing/test_build_docs_-r-v.sh create mode 100755 doc/testing/test_container_eq_ctsm_pylib.sh create mode 100755 doc/testing/test_doc-builder_tests.sh create mode 100755 doc/testing/test_makefile_method.sh diff --git a/doc/source/users_guide/working-with-documentation/building-docs-multiple-versions.rst b/doc/source/users_guide/working-with-documentation/building-docs-multiple-versions.rst index a127103d0e..f7a60d3dab 100644 --- a/doc/source/users_guide/working-with-documentation/building-docs-multiple-versions.rst +++ b/doc/source/users_guide/working-with-documentation/building-docs-multiple-versions.rst @@ -7,7 +7,7 @@ There is a menu in the lower left of the webpage that lets readers switch betwee Note that this is not necessary in order for you to contribute an update to the documentation. GitHub will test this automatically when you open a PR. But if you'd like to try, this will generate a local site for you in ``_publish/`` and then open it: -.. literalinclude:: ../../../testing.sh +.. 
literalinclude:: ../../../testing/test_container_eq_ctsm_pylib.sh :start-at: ./build_docs_to_publish :end-before: VERSION LINKS WILL NOT RESOLVE :append: open _publish/index.html diff --git a/doc/testing/compose_test_cmd.sh b/doc/testing/compose_test_cmd.sh new file mode 100755 index 0000000000..3d010f166d --- /dev/null +++ b/doc/testing/compose_test_cmd.sh @@ -0,0 +1,13 @@ +# This is meant to be sourced from within another script, not run directly + +if [[ "${cli_tool}" == "" ]]; then + echo "${msg} (no container)" +else + cmd="${cmd} -d" + if [[ "${cli_tool}" != "default" ]]; then + cmd="${cmd} --container-cli-tool ${cli_tool}" + fi + echo "${msg} (container: ${cli_tool})" +fi + +echo "${cmd}" \ No newline at end of file diff --git a/doc/testing/test_build_docs_-b.sh b/doc/testing/test_build_docs_-b.sh new file mode 100755 index 0000000000..00781685b0 --- /dev/null +++ b/doc/testing/test_build_docs_-b.sh @@ -0,0 +1,16 @@ +#!/usr/bin/env bash +set -e + +cli_tool="$1" + +SCRIPT_DIR="$( cd -- "$( dirname -- "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )" +cd "${SCRIPT_DIR}/.." + +msg="~~~~~ Check that -b works" +cmd="./build_docs -b _build -c" + +. testing/compose_test_cmd.sh +set -x +$cmd + +exit 0 diff --git a/doc/testing/test_build_docs_-r-v.sh b/doc/testing/test_build_docs_-r-v.sh new file mode 100755 index 0000000000..267e52e53c --- /dev/null +++ b/doc/testing/test_build_docs_-r-v.sh @@ -0,0 +1,16 @@ +#!/usr/bin/env bash +set -e + +cli_tool="$1" + +SCRIPT_DIR="$( cd -- "$( dirname -- "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )" +cd "${SCRIPT_DIR}/.." + +msg="~~~~~ Check that -r -v works" +cmd="./build_docs -r _build -v latest -c --conf-py-path doc-builder/test/conf.py --static-path ../_static --templates-path ../_templates" + +.
testing/compose_test_cmd.sh +set -x +$cmd + +exit 0 diff --git a/doc/testing/test_container_eq_ctsm_pylib.sh b/doc/testing/test_container_eq_ctsm_pylib.sh new file mode 100755 index 0000000000..44185aa9fb --- /dev/null +++ b/doc/testing/test_container_eq_ctsm_pylib.sh @@ -0,0 +1,35 @@ +#!/usr/bin/env bash +set -e + +# Compare docs built with container vs. ctsm_pylib + +cli_tool="$1" + +SCRIPT_DIR="$( cd -- "$( dirname -- "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )" +cd "${SCRIPT_DIR}/.." + +rm -rf _publish* + +# Build all docs using container +echo "~~~~~ Build all docs using container" +# Also do a custom --conf-py-path +rm -rf _build _publish +d1="$PWD/_publish_container" +./build_docs_to_publish -r _build -d --site-root "$PWD/_publish" +# VERSION LINKS WILL NOT RESOLVE IN _publish_container +cp -a _publish "${d1}" + +# Build all docs using ctsm_pylib +echo "~~~~~ Build all docs using ctsm_pylib" +rm -rf _build _publish +d2="$PWD/_publish_nocontainer" +conda run -n ctsm_pylib --no-capture-output ./build_docs_to_publish -r _build --site-root "$PWD/_publish" --conf-py-path doc-builder/test/conf.py --static-path ../_static --templates-path ../_templates +# VERSION LINKS WILL NOT RESOLVE IN _publish_nocontainer +cp -a _publish "${d2}" + +# Make sure container version is identical to no-container version +echo "~~~~~ Make sure container version is identical to no-container version" +diff -qr "${d1}" "${d2}" +echo "Yep!" 
+ +exit 0 diff --git a/doc/testing/test_doc-builder_tests.sh b/doc/testing/test_doc-builder_tests.sh new file mode 100755 index 0000000000..00d576c298 --- /dev/null +++ b/doc/testing/test_doc-builder_tests.sh @@ -0,0 +1,12 @@ +#!/usr/bin/env bash +set -e + +SCRIPT_DIR="$( cd -- "$( dirname -- "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )" +cd "${SCRIPT_DIR}" + +echo "~~~~~ Check that doc-builder tests pass" +cd ../doc-builder/test +set -x +conda run --no-capture-output -n ctsm_pylib make test + +exit 0 \ No newline at end of file diff --git a/doc/testing/test_makefile_method.sh b/doc/testing/test_makefile_method.sh new file mode 100755 index 0000000000..dd62e770a4 --- /dev/null +++ b/doc/testing/test_makefile_method.sh @@ -0,0 +1,13 @@ +#!/usr/bin/env bash +set -e + +cli_tool="$1" + +SCRIPT_DIR="$( cd -- "$( dirname -- "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )" +cd "${SCRIPT_DIR}/.." + +echo "~~~~~ Check that Makefile method works" +set -x +make SPHINXOPTS="-W --keep-going" BUILDDIR=${PWD}/_build html + +exit 0 diff --git a/doc/testing/testing.sh b/doc/testing/testing.sh index 83c9037ecc..1e009362b0 100755 --- a/doc/testing/testing.sh +++ b/doc/testing/testing.sh @@ -1,55 +1,29 @@ -#!/bin/bash +#!/usr/bin/env bash set -e -set -x SCRIPT_DIR="$( cd -- "$( dirname -- "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )" -cd "${SCRIPT_DIR}" +cd "${SCRIPT_DIR}/" -rm -rf _publish* +# Compare docs built with container vs. 
ctsm_pylib +./test_container_eq_ctsm_pylib.sh -# Build all docs using container -echo "~~~~~ Build all docs using container" -# Also do a custom --conf-py-path -rm -rf _build _publish -d1="$PWD/_publish_container" -./build_docs_to_publish -r _build -d --site-root "$PWD/_publish" -# VERSION LINKS WILL NOT RESOLVE IN _publish_container -cp -a _publish "${d1}" - -# Build all docs using ctsm_pylib -echo "~~~~~ Build all docs using ctsm_pylib" -rm -rf _build _publish -d2="$PWD/_publish_nocontainer" -conda run -n ctsm_pylib ./build_docs_to_publish -r _build --site-root "$PWD/_publish" --conf-py-path doc-builder/test/conf.py --static-path ../_static --templates-path ../_templates -# VERSION LINKS WILL NOT RESOLVE IN _publish_nocontainer -cp -a _publish "${d2}" - -# Make sure container version is identical to no-container version -echo "~~~~~ Make sure container version is identical to no-container version" -diff -qr "${d1}" "${d2}" - -# Check that -r -v works -echo "~~~~~ Check that -r -v works (Docker)" -# Also do a custom --conf-py-path -rm -rf _build_container -./build_docs -r _build_container -v latest -d -c --conf-py-path doc-builder/test/conf.py --static-path ../_static --templates-path ../_templates --container-cli-tool docker +# ✅ Check that -r -v works (Docker) +# Also do a custom --conf-py-path and other stuff +rm -rf _build +./test_build_docs_-r-v.sh docker -# Check that Makefile method works -echo "~~~~~ Check that Makefile method works" +# ✅ Check that Makefile method works rm -rf _build -make SPHINXOPTS="-W --keep-going" BUILDDIR=${PWD}/_build html +conda run --no-capture-output -n ctsm_pylib ./test_makefile_method.sh -# Check that -b works -echo "~~~~~ Check that -b works (Podman)" -rm -rf _build_container -./build_docs -b _build_container -d -c --container-cli-tool docker +# ✅ Check that -b works +rm -rf _build +./test_build_docs_-b.sh docker -# Check that doc-builder tests pass +# ✅ Check that doc-builder tests pass # Don't run if on a GitHub runner; 
failing 🤷. Trust that doc-builder does this test. if [[ "${GITHUB_ACTIONS}" == "" ]]; then - echo "~~~~~ Check that doc-builder tests pass" - cd doc-builder/test - conda run -n ctsm_pylib make test + ./test_doc-builder_tests.sh fi exit 0 \ No newline at end of file From d15df4a8550ad7c1e52b3ef19be012408ffdfe55 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Thu, 5 Jun 2025 15:25:29 -0600 Subject: [PATCH 17/97] docs-omnibus.yml: Use new script location. --- .github/workflows/docs-omnibus.yml | 3 +-- 1 file changed, 1 insertion(+), 2 deletions(-) diff --git a/.github/workflows/docs-omnibus.yml b/.github/workflows/docs-omnibus.yml index 655813c366..d8637cafe8 100644 --- a/.github/workflows/docs-omnibus.yml +++ b/.github/workflows/docs-omnibus.yml @@ -48,7 +48,6 @@ jobs: channels: conda-forge auto-activate-base: false - # TODO: Split testing.sh tests into their own steps in this job - name: Text Sphinx builds with omnibus script run: | - cd doc && conda run -n ctsm_pylib ./testing.sh + cd doc/testing && ./testing.sh From 4b375579a35dec2ce2ce451e930a5b4236197772 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Thu, 5 Jun 2025 15:25:44 -0600 Subject: [PATCH 18/97] docs-omnibus.yml: Check out all submodules. 
--- .github/workflows/docs-omnibus.yml | 5 +++-- 1 file changed, 3 insertions(+), 2 deletions(-) diff --git a/.github/workflows/docs-omnibus.yml b/.github/workflows/docs-omnibus.yml index d8637cafe8..3cc2fcaf7d 100644 --- a/.github/workflows/docs-omnibus.yml +++ b/.github/workflows/docs-omnibus.yml @@ -35,9 +35,10 @@ jobs: fetch-depth: 0 lfs: true - - name: Checkout doc-builder external + # Check out all submodules because we might :literalinclude: something from one + - name: Checkout all submodules run: | - bin/git-fleximod update doc-builder + bin/git-fleximod update -o # Set up conda - name: Set up conda environment From fe021848fc26cfb62b0fe57ba03675241daf3812 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Thu, 5 Jun 2025 15:26:08 -0600 Subject: [PATCH 19/97] docs-omnibus.yml: Only run one of the testing scripts is changed. --- .github/workflows/docs-omnibus.yml | 16 ++-------------- 1 file changed, 2 insertions(+), 14 deletions(-) diff --git a/.github/workflows/docs-omnibus.yml b/.github/workflows/docs-omnibus.yml index 3cc2fcaf7d..4e549a574d 100644 --- a/.github/workflows/docs-omnibus.yml +++ b/.github/workflows/docs-omnibus.yml @@ -5,24 +5,12 @@ on: # Run when a change to these files is pushed to any branch. Without the "branches:" line, for some reason this will be run whenever a tag is pushed, even if the listed files aren't changed. branches: ['*'] paths: - - 'doc/**' - - '!doc/*ChangeLog*' - - '!doc/*ChangeSum*' - - '!doc/UpdateChangelog.pl' - # Include all include::ed files outside doc/ directory! - - 'src/README.unit_testing' - - 'tools/README' + - 'doc/testing/*' pull_request: # Run on pull requests that change the listed files paths: - - 'doc/**' - - '!doc/*ChangeLog*' - - '!doc/*ChangeSum*' - - '!doc/UpdateChangelog.pl' - # Include all include::ed files outside doc/ directory! 
- - 'src/README.unit_testing' - - 'tools/README' + - 'doc/testing/*' workflow_dispatch: From 628d3f9bec0768751b940d42f8e63b9aa72432f1 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Thu, 5 Jun 2025 15:46:55 -0600 Subject: [PATCH 20/97] docs-omnibus.yml: Also run when docs/Makefile is updated. --- .github/workflows/docs-omnibus.yml | 2 ++ 1 file changed, 2 insertions(+) diff --git a/.github/workflows/docs-omnibus.yml b/.github/workflows/docs-omnibus.yml index 4e549a574d..fc0973cfc8 100644 --- a/.github/workflows/docs-omnibus.yml +++ b/.github/workflows/docs-omnibus.yml @@ -6,11 +6,13 @@ on: branches: ['*'] paths: - 'doc/testing/*' + - 'doc/Makefile' pull_request: # Run on pull requests that change the listed files paths: - 'doc/testing/*' + - 'doc/Makefile' workflow_dispatch: From e86ffe505e2c6819610238d83dc1b0824cdab875 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Thu, 5 Jun 2025 15:47:32 -0600 Subject: [PATCH 21/97] docs-ctsm_pylib.yml: Also trigger on docs container requirements.txt. --- .github/workflows/docs-ctsm_pylib.yml | 2 ++ 1 file changed, 2 insertions(+) diff --git a/.github/workflows/docs-ctsm_pylib.yml b/.github/workflows/docs-ctsm_pylib.yml index 850f58063f..280e236290 100644 --- a/.github/workflows/docs-ctsm_pylib.yml +++ b/.github/workflows/docs-ctsm_pylib.yml @@ -6,12 +6,14 @@ on: branches: ['*'] paths: - 'python/conda_env_ctsm_py.txt' + - 'doc/ctsm-docs_container/requirements.txt' - '.github/workflows/docs-common.yml' pull_request: # Run on pull requests that change the listed files paths: - 'python/conda_env_ctsm_py.txt' + - 'doc/ctsm-docs_container/requirements.txt' - '.github/workflows/docs-common.yml' schedule: From 3631ee0fb3541ff378f18241adc997061103b1af Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Thu, 5 Jun 2025 15:47:51 -0600 Subject: [PATCH 22/97] Add GitHub workflows to run when docs dependencies update. 
--- .github/workflows/docs-update-ctsm_pylib.yml | 30 +++++++++ .../docs-update-dependency-common.yml | 67 +++++++++++++++++++ .github/workflows/docs-update-doc-builder.yml | 36 ++++++++++ 3 files changed, 133 insertions(+) create mode 100644 .github/workflows/docs-update-ctsm_pylib.yml create mode 100644 .github/workflows/docs-update-dependency-common.yml create mode 100644 .github/workflows/docs-update-doc-builder.yml diff --git a/.github/workflows/docs-update-ctsm_pylib.yml b/.github/workflows/docs-update-ctsm_pylib.yml new file mode 100644 index 0000000000..789288311d --- /dev/null +++ b/.github/workflows/docs-update-ctsm_pylib.yml @@ -0,0 +1,30 @@ +name: Docs tests to run when ctsm_pylib is updated + +on: + push: + # Run when a change to these files is pushed to any branch. Without the "branches:" line, for some reason this will be run whenever a tag is pushed, even if the listed files aren't changed. + branches: ['*'] + paths: + - 'python/conda_env_ctsm_py.txt' + - 'doc/ctsm-docs_container/requirements.txt' + - '.github/workflows/docs-update-dependency-common.yml' + + pull_request: + # Run on pull requests that change the listed files + paths: + - 'python/conda_env_ctsm_py.txt' + - 'doc/ctsm-docs_container/requirements.txt' + - '.github/workflows/docs-update-dependency-common.yml' + + workflow_dispatch: + +permissions: + contents: read +jobs: + test-update-dependency: + if: ${{ always() }} + name: Tests to run when either docs dependency is updated + uses: ./.github/workflows/docs-update-dependency-common.yml + with: + conda_env_file: python/conda_env_ctsm_py.yml + conda_env_name: ctsm_pylib diff --git a/.github/workflows/docs-update-dependency-common.yml b/.github/workflows/docs-update-dependency-common.yml new file mode 100644 index 0000000000..a7e815a27b --- /dev/null +++ b/.github/workflows/docs-update-dependency-common.yml @@ -0,0 +1,67 @@ +name: Jobs shared by docs workflows that run when a dependency is updated + +on: + workflow_call: + inputs: + 
conda_env_file: + required: false + type: string + default: "" + conda_env_name: + required: false + type: string + default: "" + secrets: {} + +jobs: + compare-docbuilder-vs-ctsmpylib: + runs-on: ubuntu-latest + steps: + - uses: actions/checkout@v4 + with: + fetch-depth: 0 + lfs: true + + # Check out all submodules because we might :literalinclude: something from one + - name: Checkout all submodules + run: | + bin/git-fleximod update doc-builder + + - name: Set up conda environment + uses: conda-incubator/setup-miniconda@v3 + with: + activate-environment: ${{ inputs.conda_env_name }} + environment-file: ${{ inputs.conda_env_file }} + channels: conda-forge + auto-activate-base: false + + - name: Compare docs built with container vs. ctsm_pylib + run: | + cd doc/testing + ./test_container_eq_ctsm_pylib.sh + + makefile-method: + runs-on: ubuntu-latest + steps: + - uses: actions/checkout@v4 + with: + fetch-depth: 0 + lfs: true + + # Check out all submodules because we might :literalinclude: something from one + - name: Checkout all submodules + run: | + bin/git-fleximod update doc-builder + + - name: Set up conda environment + uses: conda-incubator/setup-miniconda@v3 + with: + activate-environment: ${{ inputs.conda_env_name }} + environment-file: ${{ inputs.conda_env_file }} + channels: conda-forge + auto-activate-base: false + + - name: Check that Makefile method works + run: | + cd doc/testing + conda run -n ${{ inputs.conda_env_name }} --no-capture-output ./test_makefile_method.sh diff --git a/.github/workflows/docs-update-doc-builder.yml b/.github/workflows/docs-update-doc-builder.yml new file mode 100644 index 0000000000..cf4422a220 --- /dev/null +++ b/.github/workflows/docs-update-doc-builder.yml @@ -0,0 +1,36 @@ +name: Docs tests to run when doc-builder is updated + +on: + push: + # Run when a change to these files is pushed to any branch. 
Without the "branches:" line, for some reason this will be run whenever a tag is pushed, even if the listed files aren't changed. + branches: ['*'] + paths: + - 'doc/doc-builder' + - '.github/workflows/docs-update-dependency-common.yml' + + pull_request: + # Run on pull requests that change the listed files + paths: + - 'doc/doc-builder' + - '.github/workflows/docs-update-dependency-common.yml' + + workflow_dispatch: + +permissions: + contents: read +jobs: + test-update-dependency: + if: ${{ always() }} + name: Tests to run when either docs dependency is updated + uses: ./.github/workflows/docs-update-dependency-common.yml + with: + conda_env_file: python/conda_env_ctsm_py.yml + conda_env_name: ctsm_pylib + + test-rv-setup: + if: ${{ always() }} + runs-on: ubuntu-latest + steps: + - name: build_docs rv method + run: | + cd doc/testing && ./testing.sh From 40dc05da88642bead0034a97a33225131c2c3c24 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Thu, 5 Jun 2025 15:53:27 -0600 Subject: [PATCH 23/97] Dont run some jobs on forks. 
--- .github/workflows/docs-omnibus.yml | 3 +++ .github/workflows/docs-update-ctsm_pylib.yml | 4 +++- .github/workflows/docs-update-doc-builder.yml | 8 ++++++-- 3 files changed, 12 insertions(+), 3 deletions(-) diff --git a/.github/workflows/docs-omnibus.yml b/.github/workflows/docs-omnibus.yml index fc0973cfc8..587bc92731 100644 --- a/.github/workflows/docs-omnibus.yml +++ b/.github/workflows/docs-omnibus.yml @@ -18,6 +18,9 @@ on: jobs: build-docs-omnibus-test: + # Don't run on forks + if: ${{ github.repository == 'ESCOMP/CTSM' }} + runs-on: ubuntu-latest steps: - uses: actions/checkout@v4 diff --git a/.github/workflows/docs-update-ctsm_pylib.yml b/.github/workflows/docs-update-ctsm_pylib.yml index 789288311d..4d62795231 100644 --- a/.github/workflows/docs-update-ctsm_pylib.yml +++ b/.github/workflows/docs-update-ctsm_pylib.yml @@ -22,7 +22,9 @@ permissions: contents: read jobs: test-update-dependency: - if: ${{ always() }} + # Don't run on forks + if: ${{ github.repository == 'ESCOMP/CTSM' }} + name: Tests to run when either docs dependency is updated uses: ./.github/workflows/docs-update-dependency-common.yml with: diff --git a/.github/workflows/docs-update-doc-builder.yml b/.github/workflows/docs-update-doc-builder.yml index cf4422a220..8ac8b2e320 100644 --- a/.github/workflows/docs-update-doc-builder.yml +++ b/.github/workflows/docs-update-doc-builder.yml @@ -20,7 +20,9 @@ permissions: contents: read jobs: test-update-dependency: - if: ${{ always() }} + # Don't run on forks + if: ${{ github.repository == 'ESCOMP/CTSM' }} + name: Tests to run when either docs dependency is updated uses: ./.github/workflows/docs-update-dependency-common.yml with: @@ -28,7 +30,9 @@ jobs: conda_env_name: ctsm_pylib test-rv-setup: - if: ${{ always() }} + # Don't run on forks + if: ${{ github.repository == 'ESCOMP/CTSM' }} + runs-on: ubuntu-latest steps: - name: build_docs rv method From cf6acee086f67e06565f6f3346437198394ca5d7 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Thu, 5 
Jun 2025 15:54:13 -0600 Subject: [PATCH 24/97] Fix test-rv-setup. --- .github/workflows/docs-update-doc-builder.yml | 12 +++++++++++- 1 file changed, 11 insertions(+), 1 deletion(-) diff --git a/.github/workflows/docs-update-doc-builder.yml b/.github/workflows/docs-update-doc-builder.yml index 8ac8b2e320..71987be266 100644 --- a/.github/workflows/docs-update-doc-builder.yml +++ b/.github/workflows/docs-update-doc-builder.yml @@ -35,6 +35,16 @@ jobs: runs-on: ubuntu-latest steps: + - uses: actions/checkout@v4 + with: + fetch-depth: 0 + lfs: true + + # Check out all submodules because we might :literalinclude: something from one + - name: Checkout all submodules + run: | + bin/git-fleximod update doc-builder + - name: build_docs rv method run: | - cd doc/testing && ./testing.sh + cd doc/testing && ./test_build_docs_-r-v.sh From 115491ffd2f7d26116f5323c18e4150a277c8c40 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Thu, 5 Jun 2025 16:01:16 -0600 Subject: [PATCH 25/97] test-rv-setup: Use Docker. --- .github/workflows/docs-update-doc-builder.yml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/.github/workflows/docs-update-doc-builder.yml b/.github/workflows/docs-update-doc-builder.yml index 71987be266..53b69336ce 100644 --- a/.github/workflows/docs-update-doc-builder.yml +++ b/.github/workflows/docs-update-doc-builder.yml @@ -47,4 +47,4 @@ jobs: - name: build_docs rv method run: | - cd doc/testing && ./test_build_docs_-r-v.sh + cd doc/testing && ./test_build_docs_-r-v.sh docker From 00fd3bf0a0f77a92cc91e157291a352b69e97db6 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Thu, 5 Jun 2025 16:01:46 -0600 Subject: [PATCH 26/97] testing.sh: Remove emoji from comments. 
--- doc/testing/testing.sh | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/doc/testing/testing.sh b/doc/testing/testing.sh index 1e009362b0..b6165d6e36 100755 --- a/doc/testing/testing.sh +++ b/doc/testing/testing.sh @@ -7,20 +7,20 @@ cd "${SCRIPT_DIR}/" # Compare docs built with container vs. ctsm_pylib ./test_container_eq_ctsm_pylib.sh -# ✅ Check that -r -v works (Docker) +# Check that -r -v works (Docker) # Also do a custom --conf-py-path and other stuff rm -rf _build ./test_build_docs_-r-v.sh docker -# ✅ Check that Makefile method works +# Check that Makefile method works rm -rf _build conda run --no-capture-output -n ctsm_pylib ./test_makefile_method.sh -# ✅ Check that -b works +# Check that -b works rm -rf _build ./test_build_docs_-b.sh docker -# ✅ Check that doc-builder tests pass +# Check that doc-builder tests pass # Don't run if on a GitHub runner; failing 🤷. Trust that doc-builder does this test. if [[ "${GITHUB_ACTIONS}" == "" ]]; then ./test_doc-builder_tests.sh From 1dbd0f48f1539c787fc9a6072ed582b63078e36e Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Thu, 5 Jun 2025 16:06:19 -0600 Subject: [PATCH 27/97] docs.yml: Don't run if just testing scripts were updated. --- .github/workflows/docs.yml | 2 ++ 1 file changed, 2 insertions(+) diff --git a/.github/workflows/docs.yml b/.github/workflows/docs.yml index 074a674ffe..ddd6ed54f4 100644 --- a/.github/workflows/docs.yml +++ b/.github/workflows/docs.yml @@ -7,6 +7,7 @@ on: branches: ['*'] paths: - 'doc/**' + - '!doc/testing/*' - '!doc/*ChangeLog*' - '!doc/*ChangeSum*' - '!doc/UpdateChangelog.pl' @@ -19,6 +20,7 @@ on: # Run on pull requests that change the listed files paths: - 'doc/**' + - '!doc/testing/*' - '!doc/*ChangeLog*' - '!doc/*ChangeSum*' - '!doc/UpdateChangelog.pl' From e0ceb1d6377aec95402009d32b94ce2e83ebcc63 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Thu, 5 Jun 2025 16:11:17 -0600 Subject: [PATCH 28/97] Run docs workflows if included file in doc/testing/ updates. 
--- .github/workflows/docs-build-and-deploy.yml | 2 ++ .github/workflows/docs.yml | 2 ++ 2 files changed, 4 insertions(+) diff --git a/.github/workflows/docs-build-and-deploy.yml b/.github/workflows/docs-build-and-deploy.yml index 2c928e0ccb..e8e9527903 100644 --- a/.github/workflows/docs-build-and-deploy.yml +++ b/.github/workflows/docs-build-and-deploy.yml @@ -6,12 +6,14 @@ on: branches: ['master', 'release-clm5.0'] paths: - 'doc/**' + - '!doc/testing/*' - '!doc/*ChangeLog*' - '!doc/*ChangeSum*' - '!doc/UpdateChangelog.pl' # Include all include::ed files outside doc/ directory! - 'src/README.unit_testing' - 'tools/README' + - 'doc/testing/test_container_eq_ctsm_pylib.sh' # Allows you to run this workflow manually from the Actions tab workflow_dispatch: diff --git a/.github/workflows/docs.yml b/.github/workflows/docs.yml index ddd6ed54f4..eda38a5e23 100644 --- a/.github/workflows/docs.yml +++ b/.github/workflows/docs.yml @@ -15,6 +15,7 @@ on: # Include all include::ed files outside doc/ directory! - 'src/README.unit_testing' - 'tools/README' + - 'doc/testing/test_container_eq_ctsm_pylib.sh' pull_request: # Run on pull requests that change the listed files @@ -28,6 +29,7 @@ on: # Include all include::ed files outside doc/ directory! - 'src/README.unit_testing' - 'tools/README' + - 'doc/testing/test_container_eq_ctsm_pylib.sh' workflow_dispatch: From 245319965f54e3a94fe651c467bcd83393148c52 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Thu, 5 Jun 2025 16:13:37 -0600 Subject: [PATCH 29/97] Trying to fix docs-omnibus workflow. 
--- doc/testing/testing.sh | 4 ++++ 1 file changed, 4 insertions(+) diff --git a/doc/testing/testing.sh b/doc/testing/testing.sh index b6165d6e36..689eacbc65 100755 --- a/doc/testing/testing.sh +++ b/doc/testing/testing.sh @@ -9,20 +9,24 @@ cd "${SCRIPT_DIR}/" # Check that -r -v works (Docker) # Also do a custom --conf-py-path and other stuff +cd "${SCRIPT_DIR}/" rm -rf _build ./test_build_docs_-r-v.sh docker # Check that Makefile method works +cd "${SCRIPT_DIR}/" rm -rf _build conda run --no-capture-output -n ctsm_pylib ./test_makefile_method.sh # Check that -b works +cd "${SCRIPT_DIR}/" rm -rf _build ./test_build_docs_-b.sh docker # Check that doc-builder tests pass # Don't run if on a GitHub runner; failing 🤷. Trust that doc-builder does this test. if [[ "${GITHUB_ACTIONS}" == "" ]]; then + cd "${SCRIPT_DIR}/" ./test_doc-builder_tests.sh fi From 53d034b1020b45c6dd4ee9ec16dde51256ef311f Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Thu, 5 Jun 2025 16:22:03 -0600 Subject: [PATCH 30/97] Reduce docs-update-ctsm_pylib runs. Don't need to run on updates of docs-update-dependency-common.yml because that's already being checked by docs-update-doc-builder. 
--- .github/workflows/docs-update-ctsm_pylib.yml | 2 -- 1 file changed, 2 deletions(-) diff --git a/.github/workflows/docs-update-ctsm_pylib.yml b/.github/workflows/docs-update-ctsm_pylib.yml index 4d62795231..64231ee515 100644 --- a/.github/workflows/docs-update-ctsm_pylib.yml +++ b/.github/workflows/docs-update-ctsm_pylib.yml @@ -7,14 +7,12 @@ on: paths: - 'python/conda_env_ctsm_py.txt' - 'doc/ctsm-docs_container/requirements.txt' - - '.github/workflows/docs-update-dependency-common.yml' pull_request: # Run on pull requests that change the listed files paths: - 'python/conda_env_ctsm_py.txt' - 'doc/ctsm-docs_container/requirements.txt' - - '.github/workflows/docs-update-dependency-common.yml' workflow_dispatch: From cd22a34931dfce4d414663e600259a0b27093c13 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Thu, 5 Jun 2025 16:26:58 -0600 Subject: [PATCH 31/97] Docs workflows: Always checkout all submodules. --- .github/workflows/docker-image-common.yml | 8 +++++--- .github/workflows/docs-build-and-deploy.yml | 6 +++++- .github/workflows/docs-common.yml | 5 +++-- .github/workflows/docs-update-dependency-common.yml | 4 ++-- .github/workflows/docs-update-doc-builder.yml | 2 +- .github/workflows/docs.yml | 5 +++-- 6 files changed, 19 insertions(+), 11 deletions(-) diff --git a/.github/workflows/docker-image-common.yml b/.github/workflows/docker-image-common.yml index d44c14c1f8..3522069132 100644 --- a/.github/workflows/docker-image-common.yml +++ b/.github/workflows/docker-image-common.yml @@ -76,14 +76,16 @@ jobs: tags: ${{ steps.meta.outputs.tags }} labels: ${{ steps.meta.outputs.labels }} - # Try building our docs using the new container - - name: Checkout doc-builder external + # Check out all submodules because we might :literalinclude: something from one + - name: Checkout all submodules run: | - bin/git-fleximod update doc-builder + bin/git-fleximod update -o + - name: Set image tag for docs build id: set-image-tag run: | echo "IMAGE_TAG=$(echo '${{ 
steps.meta.outputs.tags }}' | head -n 1 | cut -d',' -f1)" >> $GITHUB_ENV + - name: Build docs using Docker (Podman has trouble on GitHub runners) id: build-docs run: | diff --git a/.github/workflows/docs-build-and-deploy.yml b/.github/workflows/docs-build-and-deploy.yml index e8e9527903..72be23d0f8 100644 --- a/.github/workflows/docs-build-and-deploy.yml +++ b/.github/workflows/docs-build-and-deploy.yml @@ -48,10 +48,14 @@ jobs: - name: Setup Pages uses: actions/configure-pages@v5 + # Check out all submodules because we might :literalinclude: something from one + - name: Checkout all submodules + run: | + bin/git-fleximod update -o + - name: Build docs using container id: build-docs run: | - bin/git-fleximod update -o cd doc ./build_docs_to_publish -d --site-root https://escomp.github.io/CTSM diff --git a/.github/workflows/docs-common.yml b/.github/workflows/docs-common.yml index 6dd8f7d53b..9c9d9f386c 100644 --- a/.github/workflows/docs-common.yml +++ b/.github/workflows/docs-common.yml @@ -26,9 +26,10 @@ jobs: fetch-depth: 0 lfs: true - - name: Checkout doc-builder external + # Check out all submodules because we might :literalinclude: something from one + - name: Checkout all submodules run: | - bin/git-fleximod update doc-builder + bin/git-fleximod update -o # Do this if not using conda # Based on https://github.com/actions/cache/blob/main/examples.md#python---pip diff --git a/.github/workflows/docs-update-dependency-common.yml b/.github/workflows/docs-update-dependency-common.yml index a7e815a27b..eee702c62f 100644 --- a/.github/workflows/docs-update-dependency-common.yml +++ b/.github/workflows/docs-update-dependency-common.yml @@ -25,7 +25,7 @@ jobs: # Check out all submodules because we might :literalinclude: something from one - name: Checkout all submodules run: | - bin/git-fleximod update doc-builder + bin/git-fleximod update -o - name: Set up conda environment uses: conda-incubator/setup-miniconda@v3 @@ -51,7 +51,7 @@ jobs: # Check out all submodules 
because we might :literalinclude: something from one - name: Checkout all submodules run: | - bin/git-fleximod update doc-builder + bin/git-fleximod update -o - name: Set up conda environment uses: conda-incubator/setup-miniconda@v3 diff --git a/.github/workflows/docs-update-doc-builder.yml b/.github/workflows/docs-update-doc-builder.yml index 53b69336ce..74827aea11 100644 --- a/.github/workflows/docs-update-doc-builder.yml +++ b/.github/workflows/docs-update-doc-builder.yml @@ -43,7 +43,7 @@ jobs: # Check out all submodules because we might :literalinclude: something from one - name: Checkout all submodules run: | - bin/git-fleximod update doc-builder + bin/git-fleximod update -o - name: build_docs rv method run: | diff --git a/.github/workflows/docs.yml b/.github/workflows/docs.yml index eda38a5e23..47c1da5345 100644 --- a/.github/workflows/docs.yml +++ b/.github/workflows/docs.yml @@ -53,9 +53,10 @@ jobs: - name: Checkout repository uses: actions/checkout@v4 - - name: Checkout doc-builder external + # Check out all submodules because we might :literalinclude: something from one + - name: Checkout all submodules run: | - bin/git-fleximod update doc-builder + bin/git-fleximod update -o - name: Build docs using Docker (Podman has trouble on GitHub runners) id: build-docs From 4848e217ac667264515e2bf027eca6c2fc684709 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Fri, 6 Jun 2025 09:10:18 -0600 Subject: [PATCH 32/97] Bump docs container version to satisfy workflow check. No actual difference in container. 
--- doc/ctsm-docs_container/Dockerfile | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/doc/ctsm-docs_container/Dockerfile b/doc/ctsm-docs_container/Dockerfile index 2ffd7a1702..5c78a0c14f 100644 --- a/doc/ctsm-docs_container/Dockerfile +++ b/doc/ctsm-docs_container/Dockerfile @@ -29,4 +29,4 @@ CMD ["/bin/bash", "-l"] LABEL org.opencontainers.image.title="Container for building CTSM documentation" LABEL org.opencontainers.image.source=https://github.com/ESCOMP/CTSM -LABEL org.opencontainers.image.version="v1.0.2c" +LABEL org.opencontainers.image.version="v1.0.2d" From d5b7f1ea7fadb0f4aa645799df085517b62082c1 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Fri, 6 Jun 2025 09:12:19 -0600 Subject: [PATCH 33/97] docker-image-build.yml: Remove nonexistent file from triggers. --- .github/workflows/docker-image-build.yml | 2 -- 1 file changed, 2 deletions(-) diff --git a/.github/workflows/docker-image-build.yml b/.github/workflows/docker-image-build.yml index 0ac43426a6..1512daeed6 100644 --- a/.github/workflows/docker-image-build.yml +++ b/.github/workflows/docker-image-build.yml @@ -9,7 +9,6 @@ on: paths: - 'doc/ctsm-docs_container/**' - '!doc/ctsm-docs_container/README.md' - - '.github/workflows/docker-image-ctsm-docs-build.yml' - '.github/workflows/docker-image-common.yml' pull_request: @@ -17,7 +16,6 @@ on: paths: - 'doc/ctsm-docs_container/**' - '!doc/ctsm-docs_container/README.md' - - '.github/workflows/docker-image-ctsm-docs-build.yml' - '.github/workflows/docker-image-common.yml' workflow_dispatch: From 50fe21e794ad360b604247817ae1fdbdae032a51 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Mon, 9 Jun 2025 14:26:47 -0600 Subject: [PATCH 34/97] Add some EOF blank lines. 
--- doc/testing/compose_test_cmd.sh | 2 +- doc/testing/test_doc-builder_tests.sh | 2 +- doc/testing/testing.sh | 2 +- 3 files changed, 3 insertions(+), 3 deletions(-) diff --git a/doc/testing/compose_test_cmd.sh b/doc/testing/compose_test_cmd.sh index 3d010f166d..2b2fd3cf67 100755 --- a/doc/testing/compose_test_cmd.sh +++ b/doc/testing/compose_test_cmd.sh @@ -10,4 +10,4 @@ else echo "${msg} (container: ${cli_tool})" fi -echo cmd \ No newline at end of file +echo cmd diff --git a/doc/testing/test_doc-builder_tests.sh b/doc/testing/test_doc-builder_tests.sh index 00d576c298..62c8759587 100755 --- a/doc/testing/test_doc-builder_tests.sh +++ b/doc/testing/test_doc-builder_tests.sh @@ -9,4 +9,4 @@ cd ../doc-builder/test set -x conda run --no-capture-output -n ctsm_pylib make test -exit 0 \ No newline at end of file +exit 0 diff --git a/doc/testing/testing.sh b/doc/testing/testing.sh index 689eacbc65..bd1c1ca530 100755 --- a/doc/testing/testing.sh +++ b/doc/testing/testing.sh @@ -30,4 +30,4 @@ if [[ "${GITHUB_ACTIONS}" == "" ]]; then ./test_doc-builder_tests.sh fi -exit 0 \ No newline at end of file +exit 0 From 4a3040988b113147d2b0a08f756040df854bb322 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Tue, 17 Jun 2025 13:40:10 -0600 Subject: [PATCH 35/97] Add initial unit tests for subset_data point. Failing. 
--- python/ctsm/test/test_sys_subset_data.py | 89 +++++++++++++++++++ ...P_amazon_hist_16pfts_CMIP6_2000_c250617.nc | 3 + .../test_subset_data_pt_amazon_type360 | 1 + 3 files changed, 93 insertions(+) create mode 100644 python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_amazon_type180/surfdata_TMP_amazon_hist_16pfts_CMIP6_2000_c250617.nc create mode 120000 python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_amazon_type360 diff --git a/python/ctsm/test/test_sys_subset_data.py b/python/ctsm/test/test_sys_subset_data.py index bc73c8c41d..7010e9d405 100644 --- a/python/ctsm/test/test_sys_subset_data.py +++ b/python/ctsm/test/test_sys_subset_data.py @@ -185,6 +185,95 @@ def test_subset_data_reg_infile_detect180_error(self): ): subset_data.main() + def test_subset_data_pt_amazon_type360(self): + """ + Test subset_data for Amazon point with longitude type 360 + """ + cfg_file = os.path.join( + self.inputdata_dir, + "ctsm", + "test", + "testinputs", + "subset_data_amazon.cfg", + ) + print(cfg_file) + sys.argv = [ + "subset_data", + "point", + "--lat", + "-12", + "--lon", + "291", + "--site", + "TMP", + "--create-domain", + "--create-surface", + "--surf-year", + "2000", + "--create-user-mods", + "--outdir", + self.temp_dir_out.name, + "--user-mods-dir", + self.temp_dir_umd.name, + "--inputdata-dir", + self.inputdata_dir, + "--cfg-file", + cfg_file, + "--overwrite", + ] + subset_data.main() + + # Loop through all the output files, making sure they match what we expect. 
+ daystr = "[0-9][0-9][0-9][0-9][0-9][0-9]" # 6-digit day code, yymmdd + expected_output_files = [ + f"surfdata_TMP_amazon_hist_16pfts_CMIP6_2000_c{daystr}.nc", + ] + self.assertTrue(self._check_result_file_matches_expected(expected_output_files)) + + def test_subset_data_pt_amazon_type180(self): + """ + Test subset_data for Amazon point with longitude type 180 + """ + cfg_file = os.path.join( + self.inputdata_dir, + "ctsm", + "test", + "testinputs", + "subset_data_amazon.cfg", + ) + print(cfg_file) + sys.argv = [ + "subset_data", + "point", + "--lat", + "-12", + "--lon", + "-69", + "--site", + "TMP", + "--create-surface", + "--surf-year", + "2000", + "--create-user-mods", + "--outdir", + self.temp_dir_out.name, + "--user-mods-dir", + self.temp_dir_umd.name, + "--inputdata-dir", + self.inputdata_dir, + "--cfg-file", + cfg_file, + "--overwrite", + ] + subset_data.main() + + # Loop through all the output files, making sure they match what we expect. + daystr = "[0-9][0-9][0-9][0-9][0-9][0-9]" # 6-digit day code, yymmdd + expected_output_files = [ + f"surfdata_TMP_amazon_hist_16pfts_CMIP6_2000_c{daystr}.nc", + ] + self.assertTrue(self._check_result_file_matches_expected(expected_output_files)) + if __name__ == "__main__": unit_testing.setup_for_tests() diff --git a/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_amazon_type180/surfdata_TMP_amazon_hist_16pfts_CMIP6_2000_c250617.nc b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_amazon_type180/surfdata_TMP_amazon_hist_16pfts_CMIP6_2000_c250617.nc new file mode 100644 index 0000000000..6e742560d0 --- /dev/null +++ b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_amazon_type180/surfdata_TMP_amazon_hist_16pfts_CMIP6_2000_c250617.nc @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:e694ca46925fbe07270b5468fe3899ead98dcc7d41353a6551dcc1ec92a9f9e0 +size 27740 diff --git 
a/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_amazon_type360 b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_amazon_type360 new file mode 120000 index 0000000000..997afa40bb --- /dev/null +++ b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_amazon_type360 @@ -0,0 +1 @@ +test_subset_data_pt_amazon_type180 \ No newline at end of file From 367317ecbb1e3e92dc948797503fe233046d8045 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Tue, 17 Jun 2025 14:56:02 -0600 Subject: [PATCH 36/97] subset_data point: Fix --create-surface Longitude TypeError. --- .../site_and_regional/single_point_case.py | 30 +++++++++++++++++-- 1 file changed, 27 insertions(+), 3 deletions(-) diff --git a/python/ctsm/site_and_regional/single_point_case.py b/python/ctsm/site_and_regional/single_point_case.py index bd16bae226..1bd5581310 100644 --- a/python/ctsm/site_and_regional/single_point_case.py +++ b/python/ctsm/site_and_regional/single_point_case.py @@ -15,6 +15,7 @@ # -- import local classes for this script from ctsm.site_and_regional.base_case import BaseCase, USRDAT_DIR, DatmFiles from ctsm.utils import add_tag_to_filename, ensure_iterable +from ctsm.longitude import _detect_lon_type logger = logging.getLogger(__name__) @@ -151,6 +152,26 @@ def __init__( # self.check_nonveg() self.check_pct_pft() + def convert_plon_to_filetype_if_needed(self, input_ds): + """ + Check that point and input file longitude types are equal. If not, convert point to match + file. 
+ """ + plon_in = self.plon + f_lon_type = _detect_lon_type(input_ds["lsmlon"]) + plon_type = plon_in.lon_type() + if f_lon_type == plon_type: + plon_out = plon_in.get(plon_type) + else: + plon_orig = plon_in.get(plon_type) + plon_out = plon_in.get(f_lon_type) + if plon_orig != plon_out: + print( + f"Converted plon from type {plon_type} (value {plon_orig}) " + f"to type {f_lon_type} (value {plon_out})" + ) + return plon_out + def create_tag(self): """ Create a tag for single point which is the site name @@ -498,8 +519,11 @@ def create_surfdata_at_point(self, indir, file, user_mods_dir, specify_fsurf_out # create 1d coordinate variables to enable sel() method f_in = self.create_1d_coord(fsurf_in, "LONGXY", "LATIXY", "lsmlon", "lsmlat") + # get point longitude, converting to match file type if needed + plon_converted = self.convert_plon_to_filetype_if_needed(f_in) + # extract gridcell closest to plon/plat - f_tmp = f_in.sel(lsmlon=self.plon, lsmlat=self.plat, method="nearest") + f_tmp = f_in.sel(lsmlon=plon_converted, lsmlat=self.plat, method="nearest") # expand dimensions f_tmp = f_tmp.expand_dims(["lsmlat", "lsmlon"]).copy(deep=True) @@ -525,10 +549,10 @@ def create_surfdata_at_point(self, indir, file, user_mods_dir, specify_fsurf_out # update lsmlat and lsmlon to match site specific instead of the nearest point # we do this so that if we create user_mods the PTS_LON and PTS_LAT in CIME match # the surface data coordinates - which is required - f_out["lsmlon"] = np.atleast_1d(self.plon) + f_out["lsmlon"] = np.atleast_1d(plon_converted) f_out["lsmlat"] = np.atleast_1d(self.plat) f_out["LATIXY"][:, :] = self.plat - f_out["LONGXY"][:, :] = self.plon + f_out["LONGXY"][:, :] = plon_converted # update attributes self.update_metadata(f_out) From daa218c092919970c4ae05ff0bac005d2ed14f33 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Tue, 17 Jun 2025 15:21:41 -0600 Subject: [PATCH 37/97] subset_data point: Fix --create-landuse Longitude TypeError. 
--- python/ctsm/site_and_regional/single_point_case.py | 5 ++++- 1 file changed, 4 insertions(+), 1 deletion(-) diff --git a/python/ctsm/site_and_regional/single_point_case.py b/python/ctsm/site_and_regional/single_point_case.py index 1bd5581310..5eefc1bd89 100644 --- a/python/ctsm/site_and_regional/single_point_case.py +++ b/python/ctsm/site_and_regional/single_point_case.py @@ -384,8 +384,11 @@ def create_landuse_at_point(self, indir, file, user_mods_dir): # create 1d coordinate variables to enable sel() method f_in = self.create_1d_coord(fluse_in, "LONGXY", "LATIXY", "lsmlon", "lsmlat") + # get point longitude, converting to match file type if needed + plon_converted = self.convert_plon_to_filetype_if_needed(f_in) + # extract gridcell closest to plon/plat - f_out = f_in.sel(lsmlon=self.plon, lsmlat=self.plat, method="nearest") + f_out = f_in.sel(lsmlon=plon_converted, lsmlat=self.plat, method="nearest") # expand dimensions f_out = f_out.expand_dims(["lsmlat", "lsmlon"]) From 3f5d157acf70b0901cb4c01d2951152be8e0d1d7 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Tue, 17 Jun 2025 15:26:23 -0600 Subject: [PATCH 38/97] subset_data point: Fix --create-datm Longitude TypeError. --- python/ctsm/site_and_regional/single_point_case.py | 13 ++++++++----- 1 file changed, 8 insertions(+), 5 deletions(-) diff --git a/python/ctsm/site_and_regional/single_point_case.py b/python/ctsm/site_and_regional/single_point_case.py index 5eefc1bd89..e97b1f3baa 100644 --- a/python/ctsm/site_and_regional/single_point_case.py +++ b/python/ctsm/site_and_regional/single_point_case.py @@ -152,13 +152,13 @@ def __init__( # self.check_nonveg() self.check_pct_pft() - def convert_plon_to_filetype_if_needed(self, input_ds): + def convert_plon_to_filetype_if_needed(self, lon_da): """ Check that point and input file longitude types are equal. If not, convert point to match file. 
""" plon_in = self.plon - f_lon_type = _detect_lon_type(input_ds["lsmlon"]) + f_lon_type = _detect_lon_type(lon_da) plon_type = plon_in.lon_type() if f_lon_type == plon_type: plon_out = plon_in.get(plon_type) @@ -385,7 +385,7 @@ def create_landuse_at_point(self, indir, file, user_mods_dir): f_in = self.create_1d_coord(fluse_in, "LONGXY", "LATIXY", "lsmlon", "lsmlat") # get point longitude, converting to match file type if needed - plon_converted = self.convert_plon_to_filetype_if_needed(f_in) + plon_converted = self.convert_plon_to_filetype_if_needed(f_in["lsmlon"]) # extract gridcell closest to plon/plat f_out = f_in.sel(lsmlon=plon_converted, lsmlat=self.plat, method="nearest") @@ -523,7 +523,7 @@ def create_surfdata_at_point(self, indir, file, user_mods_dir, specify_fsurf_out f_in = self.create_1d_coord(fsurf_in, "LONGXY", "LATIXY", "lsmlon", "lsmlat") # get point longitude, converting to match file type if needed - plon_converted = self.convert_plon_to_filetype_if_needed(f_in) + plon_converted = self.convert_plon_to_filetype_if_needed(f_in["lsmlon"]) # extract gridcell closest to plon/plat f_tmp = f_in.sel(lsmlon=plon_converted, lsmlat=self.plat, method="nearest") @@ -595,8 +595,11 @@ def create_datmdomain_at_point(self, datm_tuple: DatmFiles): # create 1d coordinate variables to enable sel() method f_in = self.create_1d_coord(fdatmdomain_in, "xc", "yc", "ni", "nj") + # get point longitude, converting to match file type if needed + plon_converted = self.convert_plon_to_filetype_if_needed(f_in["lon"]) + # extract gridcell closest to plon/plat - f_out = f_in.sel(ni=self.plon, nj=self.plat, method="nearest") + f_out = f_in.sel(ni=plon_converted, nj=self.plat, method="nearest") # expand dimensions f_out = f_out.expand_dims(["nj", "ni"]) From 8e2df09282cb6ba6c9a9b8eaf652eeffef167aeb Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Tue, 17 Jun 2025 16:02:09 -0600 Subject: [PATCH 39/97] subset_data point: Fix filenames for --create-datm. Resolves ESCOMP/CTSM#3260. 
--- .../site_and_regional/single_point_case.py | 83 +++++++++---------- 1 file changed, 38 insertions(+), 45 deletions(-) diff --git a/python/ctsm/site_and_regional/single_point_case.py b/python/ctsm/site_and_regional/single_point_case.py index e97b1f3baa..ac0d5c1456 100644 --- a/python/ctsm/site_and_regional/single_point_case.py +++ b/python/ctsm/site_and_regional/single_point_case.py @@ -23,10 +23,6 @@ NUM_PFT = 17 # for runs with generic crops MAX_PFT = 78 # for runs with explicit crops -# -- constants to represent months of year -FIRST_MONTH = 1 -LAST_MONTH = 12 - class SinglePointCase(BaseCase): """ @@ -621,14 +617,17 @@ def extract_datm_at(self, file_in, file_out): # create 1d coordinate variables to enable sel() method f_in = self.create_1d_coord(file_in, "LONGXY", "LATIXY", "lon", "lat") + # get point longitude, converting to match file type if needed + plon_converted = self.convert_plon_to_filetype_if_needed(f_in["lon"]) + # extract gridcell closest to plon/plat - f_out = f_in.sel(lon=self.plon, lat=self.plat, method="nearest") + f_out = f_in.sel(lon=plon_converted, lat=self.plat, method="nearest") # expand dimensions f_out = f_out.expand_dims(["lat", "lon"]) # specify dimension order - f_out = f_out.transpose("scalar", "time", "lat", "lon") + f_out = f_out.transpose("time", "lat", "lon") # update attributes self.update_metadata(f_out) @@ -683,46 +682,40 @@ def create_datm_at_point(self, datm_tuple: DatmFiles, datm_syr, datm_eyr, datm_s tpqwfiles = [] for year in range(datm_syr, datm_eyr + 1): ystr = str(year) - for month in range(FIRST_MONTH, LAST_MONTH + 1): - mstr = str(month) - if month < 10: - mstr = "0" + mstr - dtag = ystr + "-" + mstr - - fsolar = os.path.join( - datm_tuple.indir, - datm_tuple.dir_solar, - "{}{}.nc".format(datm_tuple.tag_solar, dtag), - ) - fsolar2 = "{}{}.{}.nc".format(datm_tuple.tag_solar, self.tag, dtag) - fprecip = os.path.join( - datm_tuple.indir, - datm_tuple.dir_prec, - "{}{}.nc".format(datm_tuple.tag_prec, dtag), - ) - 
fprecip2 = "{}{}.{}.nc".format(datm_tuple.tag_prec, self.tag, dtag) - ftpqw = os.path.join( - datm_tuple.indir, - datm_tuple.dir_tpqw, - "{}{}.nc".format(datm_tuple.tag_tpqw, dtag), - ) - ftpqw2 = "{}{}.{}.nc".format(datm_tuple.tag_tpqw, self.tag, dtag) - - outdir = os.path.join(self.out_dir, datm_tuple.outdir) - infile += [fsolar, fprecip, ftpqw] - outfile += [ - os.path.join(outdir, fsolar2), - os.path.join(outdir, fprecip2), - os.path.join(outdir, ftpqw2), - ] - solarfiles.append( - os.path.join("${}".format(USRDAT_DIR), datm_tuple.outdir, fsolar2) - ) - precfiles.append( - os.path.join("${}".format(USRDAT_DIR), datm_tuple.outdir, fprecip2) - ) - tpqwfiles.append(os.path.join("${}".format(USRDAT_DIR), datm_tuple.outdir, ftpqw2)) + fsolar = os.path.join( + datm_tuple.indir, + datm_tuple.dir_solar, + "{}{}.nc".format(datm_tuple.tag_solar, ystr), + ) + fsolar2 = "{}{}.{}.nc".format(datm_tuple.tag_solar, self.tag, ystr) + fprecip = os.path.join( + datm_tuple.indir, + datm_tuple.dir_prec, + "{}{}.nc".format(datm_tuple.tag_prec, ystr), + ) + fprecip2 = "{}{}.{}.nc".format(datm_tuple.tag_prec, self.tag, ystr) + ftpqw = os.path.join( + datm_tuple.indir, + datm_tuple.dir_tpqw, + "{}{}.nc".format(datm_tuple.tag_tpqw, ystr), + ) + ftpqw2 = "{}{}.{}.nc".format(datm_tuple.tag_tpqw, self.tag, ystr) + + outdir = os.path.join(self.out_dir, datm_tuple.outdir) + infile += [fsolar, fprecip, ftpqw] + outfile += [ + os.path.join(outdir, fsolar2), + os.path.join(outdir, fprecip2), + os.path.join(outdir, ftpqw2), + ] + solarfiles.append( + os.path.join("${}".format(USRDAT_DIR), datm_tuple.outdir, fsolar2) + ) + precfiles.append( + os.path.join("${}".format(USRDAT_DIR), datm_tuple.outdir, fprecip2) + ) + tpqwfiles.append(os.path.join("${}".format(USRDAT_DIR), datm_tuple.outdir, ftpqw2)) for idx, out_f in enumerate(outfile): logger.debug(out_f) From 7d651f0929a2db6da4c3bc3e12ff72e3ebd5c181 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Tue, 17 Jun 2025 16:54:45 -0600 Subject: [PATCH 
40/97] Add Python system test for subset_data point --create-datm. --- python/ctsm/test/test_sys_subset_data.py | 61 ++++++++++++++++++++++-- 1 file changed, 57 insertions(+), 4 deletions(-) diff --git a/python/ctsm/test/test_sys_subset_data.py b/python/ctsm/test/test_sys_subset_data.py index 7010e9d405..3812cf5168 100644 --- a/python/ctsm/test/test_sys_subset_data.py +++ b/python/ctsm/test/test_sys_subset_data.py @@ -185,9 +185,9 @@ def test_subset_data_reg_infile_detect180_error(self): ): subset_data.main() - def test_subset_data_pt_amazon_type360(self): + def test_subset_data_pt_amazon_type360_surface(self): """ - Test subset_data for Amazon point with longitude type 360 + Test subset_data --create-surface for Amazon point with longitude type 360 """ cfg_file = os.path.join( self.inputdata_dir, @@ -230,9 +230,9 @@ def test_subset_data_pt_amazon_type360(self): ] self.assertTrue(self._check_result_file_matches_expected(expected_output_files)) - def test_subset_data_pt_amazon_type180(self): + def test_subset_data_pt_amazon_type180_surface(self): """ - Test subset_data for Amazon point with longitude type 180 + Test subset_data --create-surface for Amazon point with longitude type 180 """ cfg_file = os.path.join( self.inputdata_dir, @@ -274,6 +274,59 @@ def test_subset_data_pt_amazon_type180(self): ] self.assertTrue(self._check_result_file_matches_expected(expected_output_files)) + def test_subset_data_pt_amazon_type360_datm(self): + """ + Test subset_data --create-datm for Amazon point with longitude type 360 + FOR NOW CAN ONLY BE RUN ON DERECHO/CASPER + """ + start_year = 1986 + end_year = 1988 + sitename = "TMP" + # outdir = self.temp_dir_out.name + outdir = "/glade/work/samrabin/ctsm/python/abc456" + sys.argv = [ + "subset_data", + "point", + "--lat", + "-12", + "--lon", + "291", + "--site", + sitename, + "--create-datm", + "--datm-syr", + str(start_year), + "--datm-eyr", + str(end_year), + "--create-user-mods", + "--outdir", + outdir, + "--user-mods-dir", + 
self.temp_dir_umd.name, + "--overwrite", + ] + subset_data.main() + + # Loop through all the output files, making sure they exist. + daystr = "[0-9][0-9][0-9][0-9][0-9][0-9]" # 6-digit day code, yymmdd + expected_output_files = [ + f"domain.crujra_v2.3_0.5x0.5_{sitename}_c{daystr}.nc", + ] + for year in list(range(start_year, end_year + 1)): + for forcing in ["Solr", "Prec", "TPQWL"]: + expected_output_files.append( + f"clmforc.CRUJRAv2.5_0.5x0.5.{forcing}.{sitename}.{year}.nc" + ) + for file_basename in expected_output_files: + file_path = os.path.join(outdir, "datmdata", file_basename) + # The below will error if exactly one matching file isn't found + try: + find_one_file_matching_pattern(file_path) + except FileNotFoundError as e: + raise AssertionError(str(e)) from e + except RuntimeError as e: + raise AssertionError(str(e)) from e + if __name__ == "__main__": unit_testing.setup_for_tests() From 4ad46f46de7dde753b4653c15f05326f55116b73 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Tue, 17 Jun 2025 16:55:27 -0600 Subject: [PATCH 41/97] Reformat with black. 
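The filename handling that PATCH 39 switches from monthly (`YYYY-MM`) to yearly (`YYYY`) forcing files follows this shape — a sketch with illustrative argument names, not the exact `create_datm_at_point` code, which builds three streams (solar, precipitation, TPQW) at once:

```python
import os


def yearly_stream_files(indir, subdir, tag, site_tag, syr, eyr):
    """Build (input path, output basename) pairs, one per year.

    tag is the stream's filename prefix (including any trailing dot);
    site_tag is inserted before the year in the output basename.
    """
    pairs = []
    for year in range(syr, eyr + 1):
        ystr = str(year)
        fin = os.path.join(indir, subdir, "{}{}.nc".format(tag, ystr))
        fout = "{}{}.{}.nc".format(tag, site_tag, ystr)
        pairs.append((fin, fout))
    return pairs
```

Dropping the inner month loop is what shortens the patch's `create_datm_at_point` body: one file per year per stream instead of twelve.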
--- python/ctsm/site_and_regional/single_point_case.py | 8 ++------ 1 file changed, 2 insertions(+), 6 deletions(-) diff --git a/python/ctsm/site_and_regional/single_point_case.py b/python/ctsm/site_and_regional/single_point_case.py index ac0d5c1456..ed8b4b5562 100644 --- a/python/ctsm/site_and_regional/single_point_case.py +++ b/python/ctsm/site_and_regional/single_point_case.py @@ -709,12 +709,8 @@ def create_datm_at_point(self, datm_tuple: DatmFiles, datm_syr, datm_eyr, datm_s os.path.join(outdir, fprecip2), os.path.join(outdir, ftpqw2), ] - solarfiles.append( - os.path.join("${}".format(USRDAT_DIR), datm_tuple.outdir, fsolar2) - ) - precfiles.append( - os.path.join("${}".format(USRDAT_DIR), datm_tuple.outdir, fprecip2) - ) + solarfiles.append(os.path.join("${}".format(USRDAT_DIR), datm_tuple.outdir, fsolar2)) + precfiles.append(os.path.join("${}".format(USRDAT_DIR), datm_tuple.outdir, fprecip2)) tpqwfiles.append(os.path.join("${}".format(USRDAT_DIR), datm_tuple.outdir, ftpqw2)) for idx, out_f in enumerate(outfile): From 4400458dbce9682be3e6323d7470084f0bec4438 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Tue, 17 Jun 2025 16:56:00 -0600 Subject: [PATCH 42/97] Add previous commit to .git-blame-ignore-revs. --- .git-blame-ignore-revs | 1 + 1 file changed, 1 insertion(+) diff --git a/.git-blame-ignore-revs b/.git-blame-ignore-revs index 7ea285a6bc..6cffe9dd35 100644 --- a/.git-blame-ignore-revs +++ b/.git-blame-ignore-revs @@ -67,3 +67,4 @@ cdf40d265cc82775607a1bf25f5f527bacc97405 3b7a2876933263f8986e4069f5d23bd45635756f 3dd489af7ebe06566e2c6a1c7ade18550f1eb4ba 742cfa606039ab89602fde5fef46458516f56fd4 +4ad46f46de7dde753b4653c15f05326f55116b73 From ad40e112671489b4317ecbc614d7b5c54877ede7 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Tue, 17 Jun 2025 17:07:15 -0600 Subject: [PATCH 43/97] Fix test_sys_subset_data.py. 
--- python/ctsm/test/test_sys_subset_data.py | 3 +-- .../surfdata_TMP_amazon_hist_16pfts_CMIP6_2000_c250617.nc | 0 .../expected_result_files/test_subset_data_pt_amazon_type360 | 1 - .../test_subset_data_pt_amazon_type360_surface | 1 + 4 files changed, 2 insertions(+), 3 deletions(-) rename python/ctsm/test/testinputs/expected_result_files/{test_subset_data_pt_amazon_type180 => test_subset_data_pt_amazon_type180_surface}/surfdata_TMP_amazon_hist_16pfts_CMIP6_2000_c250617.nc (100%) delete mode 120000 python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_amazon_type360 create mode 120000 python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_amazon_type360_surface diff --git a/python/ctsm/test/test_sys_subset_data.py b/python/ctsm/test/test_sys_subset_data.py index 3812cf5168..f12e7ad45e 100644 --- a/python/ctsm/test/test_sys_subset_data.py +++ b/python/ctsm/test/test_sys_subset_data.py @@ -282,8 +282,7 @@ def test_subset_data_pt_amazon_type360_datm(self): start_year = 1986 end_year = 1988 sitename = "TMP" - # outdir = self.temp_dir_out.name - outdir = "/glade/work/samrabin/ctsm/python/abc456" + outdir = self.temp_dir_out.name sys.argv = [ "subset_data", "point", diff --git a/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_amazon_type180/surfdata_TMP_amazon_hist_16pfts_CMIP6_2000_c250617.nc b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_amazon_type180_surface/surfdata_TMP_amazon_hist_16pfts_CMIP6_2000_c250617.nc similarity index 100% rename from python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_amazon_type180/surfdata_TMP_amazon_hist_16pfts_CMIP6_2000_c250617.nc rename to python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_amazon_type180_surface/surfdata_TMP_amazon_hist_16pfts_CMIP6_2000_c250617.nc diff --git a/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_amazon_type360 
b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_amazon_type360 deleted file mode 120000 index 997afa40bb..0000000000 --- a/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_amazon_type360 +++ /dev/null @@ -1 +0,0 @@ -test_subset_data_pt_amazon_type180 \ No newline at end of file diff --git a/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_amazon_type360_surface b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_amazon_type360_surface new file mode 120000 index 0000000000..7bcdf69458 --- /dev/null +++ b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_amazon_type360_surface @@ -0,0 +1 @@ +test_subset_data_pt_amazon_type180_surface \ No newline at end of file From 944bf273914e9bf473a0cd885fb7a08f0c6e4c61 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Tue, 17 Jun 2025 17:09:00 -0600 Subject: [PATCH 44/97] Add test_subset_data_pt_amazon_type180_datm --- python/ctsm/test/test_sys_subset_data.py | 52 ++++++++++++++++++++++++ 1 file changed, 52 insertions(+) diff --git a/python/ctsm/test/test_sys_subset_data.py b/python/ctsm/test/test_sys_subset_data.py index f12e7ad45e..d33a00e80f 100644 --- a/python/ctsm/test/test_sys_subset_data.py +++ b/python/ctsm/test/test_sys_subset_data.py @@ -326,6 +326,58 @@ def test_subset_data_pt_amazon_type360_datm(self): except RuntimeError as e: raise AssertionError(str(e)) from e + def test_subset_data_pt_amazon_type180_datm(self): + """ + Test subset_data --create-datm for Amazon point with longitude type 180 + FOR NOW CAN ONLY BE RUN ON DERECHO/CASPER + """ + start_year = 1986 + end_year = 1988 + sitename = "TMP" + outdir = self.temp_dir_out.name + sys.argv = [ + "subset_data", + "point", + "--lat", + "-12", + "--lon", + "-69", + "--site", + sitename, + "--create-datm", + "--datm-syr", + str(start_year), + "--datm-eyr", + str(end_year), + "--create-user-mods", + "--outdir", + outdir, + "--user-mods-dir", + self.temp_dir_umd.name, + 
"--overwrite", + ] + subset_data.main() + + # Loop through all the output files, making sure they exist. + daystr = "[0-9][0-9][0-9][0-9][0-9][0-9]" # 6-digit day code, yymmdd + expected_output_files = [ + f"domain.crujra_v2.3_0.5x0.5_{sitename}_c{daystr}.nc", + ] + for year in list(range(start_year, end_year + 1)): + for forcing in ["Solr", "Prec", "TPQWL"]: + expected_output_files.append( + f"clmforc.CRUJRAv2.5_0.5x0.5.{forcing}.{sitename}.{year}.nc" + ) + for file_basename in expected_output_files: + file_path = os.path.join(outdir, "datmdata", file_basename) + # The below will error if exactly one matching file isn't found + try: + find_one_file_matching_pattern(file_path) + except FileNotFoundError as e: + raise AssertionError(str(e)) from e + except RuntimeError as e: + raise AssertionError(str(e)) from e + if __name__ == "__main__": unit_testing.setup_for_tests() From 51c87410cb55fdd6c6fdc909b36f3808e217d22a Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Tue, 17 Jun 2025 17:19:56 -0600 Subject: [PATCH 45/97] Refactor test_sys_subset_data.py. 
--- python/ctsm/test/test_sys_subset_data.py | 125 ++++-------------- ...test_subset_data_pt_amazon_type360_surface | 1 - ...P_amazon_hist_16pfts_CMIP6_2000_c250617.nc | 0 ...test_subset_data_pt_surface_amazon_type360 | 1 + 4 files changed, 28 insertions(+), 99 deletions(-) delete mode 120000 python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_amazon_type360_surface rename python/ctsm/test/testinputs/expected_result_files/{test_subset_data_pt_amazon_type180_surface => test_subset_data_pt_surface_amazon_type180}/surfdata_TMP_amazon_hist_16pfts_CMIP6_2000_c250617.nc (100%) create mode 120000 python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_surface_amazon_type360 diff --git a/python/ctsm/test/test_sys_subset_data.py b/python/ctsm/test/test_sys_subset_data.py index d33a00e80f..49e2c66f4e 100644 --- a/python/ctsm/test/test_sys_subset_data.py +++ b/python/ctsm/test/test_sys_subset_data.py @@ -37,7 +37,7 @@ def tearDown(self): self.temp_dir_out.cleanup() self.temp_dir_umd.cleanup() - def _check_result_file_matches_expected(self, expected_output_files): + def _check_result_file_matches_expected(self, expected_output_files, caller_n): """ Loop through a list of output files, making sure they match what we expect. 
""" @@ -49,7 +49,7 @@ def _check_result_file_matches_expected(self, expected_output_files): os.path.dirname(__file__), "testinputs", "expected_result_files", - inspect.stack()[1][3], # Name of calling function (i.e., test name) + inspect.stack()[caller_n][3], # Name of calling function (i.e., test name) basename, ) expected_file = find_one_file_matching_pattern(expected_file) @@ -112,7 +112,7 @@ def test_subset_data_reg_amazon(self): f"domain.lnd.5x5pt-amazon_navy_TMP_c{daystr}.nc", f"surfdata_TMP_amazon_hist_16pfts_CMIP6_2000_c{daystr}.nc", ] - self.assertTrue(self._check_result_file_matches_expected(expected_output_files)) + self.assertTrue(self._check_result_file_matches_expected(expected_output_files, 1)) def test_subset_data_reg_infile_detect360(self): """ @@ -185,9 +185,9 @@ def test_subset_data_reg_infile_detect180_error(self): ): subset_data.main() - def test_subset_data_pt_amazon_type360_surface(self): + def _do_test_subset_data_pt_surface(self, lon): """ - Test subset_data --create-surface for Amazon point with longitude type 360 + Given a longitude, test subset_data point --create-surface """ cfg_file = os.path.join( self.inputdata_dir, @@ -203,7 +203,7 @@ def test_subset_data_pt_amazon_type360_surface(self): "--lat", "-12", "--lon", - "291", + str(lon), "--site", "TMP", "--create-domain", @@ -228,56 +228,23 @@ def test_subset_data_pt_amazon_type360_surface(self): expected_output_files = [ f"surfdata_TMP_amazon_hist_16pfts_CMIP6_2000_c{daystr}.nc", ] - self.assertTrue(self._check_result_file_matches_expected(expected_output_files)) + self.assertTrue(self._check_result_file_matches_expected(expected_output_files, 2)) - def test_subset_data_pt_amazon_type180_surface(self): + def test_subset_data_pt_surface_amazon_type360(self): """ - Test subset_data --create-surface for Amazon point with longitude type 180 + Test subset_data --create-surface for Amazon point with longitude type 360 """ - cfg_file = os.path.join( - self.inputdata_dir, - "ctsm", - "test", - 
"testinputs", - "subset_data_amazon.cfg", - ) - print(cfg_file) - sys.argv = [ - "subset_data", - "point", - "--lat", - "-12", - "--lon", - "-69", - "--site", - "TMP", - "--create-surface", - "--surf-year", - "2000", - "--create-user-mods", - "--outdir", - self.temp_dir_out.name, - "--user-mods-dir", - self.temp_dir_umd.name, - "--inputdata-dir", - self.inputdata_dir, - "--cfg-file", - cfg_file, - "--overwrite", - ] - subset_data.main() + self._do_test_subset_data_pt_surface(291) - # Loop through all the output files, making sure they match what we expect. - daystr = "[0-9][0-9][0-9][0-9][0-9][0-9]" # 6-digit day code, yymmdd - expected_output_files = [ - f"surfdata_TMP_amazon_hist_16pfts_CMIP6_2000_c{daystr}.nc", - ] - self.assertTrue(self._check_result_file_matches_expected(expected_output_files)) + def test_subset_data_pt_surface_amazon_type180(self): + """ + Test subset_data --create-surface for Amazon point with longitude type 180 + """ + self._do_test_subset_data_pt_surface(-69) - def test_subset_data_pt_amazon_type360_datm(self): + def _do_test_subset_data_pt_datm(self, lon): """ - Test subset_data --create-datm for Amazon point with longitude type 360 - FOR NOW CAN ONLY BE RUN ON DERECHO/CASPER + Given a longitude, test subset_data point --create-datm """ start_year = 1986 end_year = 1988 @@ -289,7 +256,7 @@ def test_subset_data_pt_amazon_type360_datm(self): "--lat", "-12", "--lon", - "291", + str(lon), "--site", sitename, "--create-datm", @@ -326,57 +293,19 @@ def test_subset_data_pt_amazon_type360_datm(self): except RuntimeError as e: raise AssertionError(str(e)) from e - def test_subset_data_pt_amazon_type180_datm(self): + def test_subset_data_pt_datm_amazon_type360(self): """ - Test subset_data --create-datm for Amazon point with longitude type 180 + Test subset_data --create-datm for Amazon point with longitude type 360 FOR NOW CAN ONLY BE RUN ON DERECHO/CASPER """ - start_year = 1986 - end_year = 1988 - sitename = "TMP" - outdir = 
self.temp_dir_out.name - sys.argv = [ - "subset_data", - "point", - "--lat", - "-12", - "--lon", - "-69", - "--site", - sitename, - "--create-datm", - "--datm-syr", - str(start_year), - "--datm-eyr", - str(end_year), - "--create-user-mods", - "--outdir", - outdir, - "--user-mods-dir", - self.temp_dir_umd.name, - "--overwrite", - ] - subset_data.main() + self._do_test_subset_data_pt_datm(291) - # Loop through all the output files, making sure they exist. - daystr = "[0-9][0-9][0-9][0-9][0-9][0-9]" # 6-digit day code, yymmdd - expected_output_files = [ - f"domain.crujra_v2.3_0.5x0.5_{sitename}_c{daystr}.nc", - ] - for year in list(range(start_year, end_year + 1)): - for forcing in ["Solr", "Prec", "TPQWL"]: - expected_output_files.append( - f"clmforc.CRUJRAv2.5_0.5x0.5.{forcing}.{sitename}.{year}.nc" - ) - for file_basename in expected_output_files: - file_path = os.path.join(outdir, "datmdata", file_basename) - # The below will error if exactly one matching file isn't found - try: - find_one_file_matching_pattern(file_path) - except FileNotFoundError as e: - raise AssertionError(str(e)) from e - except RuntimeError as e: - raise AssertionError(str(e)) from e + def test_subset_data_pt_datm_amazon_type180(self): + """ + Test subset_data --create-datm for Amazon point with longitude type 180 + FOR NOW CAN ONLY BE RUN ON DERECHO/CASPER + """ + self._do_test_subset_data_pt_datm(-69) if __name__ == "__main__": diff --git a/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_amazon_type360_surface b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_amazon_type360_surface deleted file mode 120000 index 7bcdf69458..0000000000 --- a/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_amazon_type360_surface +++ /dev/null @@ -1 +0,0 @@ -test_subset_data_pt_amazon_type180_surface \ No newline at end of file diff --git 
a/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_amazon_type180_surface/surfdata_TMP_amazon_hist_16pfts_CMIP6_2000_c250617.nc b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_surface_amazon_type180/surfdata_TMP_amazon_hist_16pfts_CMIP6_2000_c250617.nc similarity index 100% rename from python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_amazon_type180_surface/surfdata_TMP_amazon_hist_16pfts_CMIP6_2000_c250617.nc rename to python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_surface_amazon_type180/surfdata_TMP_amazon_hist_16pfts_CMIP6_2000_c250617.nc diff --git a/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_surface_amazon_type360 b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_surface_amazon_type360 new file mode 120000 index 0000000000..3a7bc5efe3 --- /dev/null +++ b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_surface_amazon_type360 @@ -0,0 +1 @@ +test_subset_data_pt_surface_amazon_type180 \ No newline at end of file From cceb4d0f6c345430e9ab4da4480a3eaae00af96c Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Tue, 17 Jun 2025 17:27:23 -0600 Subject: [PATCH 46/97] Make error handling in _do_test_subset_data_pt_datm() more robust. 
--- python/ctsm/test/test_sys_subset_data.py | 4 +--- 1 file changed, 1 insertion(+), 3 deletions(-) diff --git a/python/ctsm/test/test_sys_subset_data.py b/python/ctsm/test/test_sys_subset_data.py index 49e2c66f4e..501a7869fd 100644 --- a/python/ctsm/test/test_sys_subset_data.py +++ b/python/ctsm/test/test_sys_subset_data.py @@ -288,9 +288,7 @@ def _do_test_subset_data_pt_datm(self, lon): # The below will error if exactly one matching file isn't found try: find_one_file_matching_pattern(file_path) - except FileNotFoundError as e: - raise AssertionError(str(e)) from e - except RuntimeError as e: + except Exception as e: raise AssertionError(str(e)) from e def test_subset_data_pt_datm_amazon_type360(self): From fd8c549d96bf27b59fd6bc76c784216aef56f49f Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Wed, 18 Jun 2025 11:11:44 -0600 Subject: [PATCH 47/97] Docs docs: Minor improvements. --- .../building-docs-multiple-versions.rst | 2 +- .../working-with-documentation/building-docs-original-wiki.md | 2 +- .../working-with-documentation/building-docs-prereqs-windows.md | 2 ++ 3 files changed, 4 insertions(+), 2 deletions(-) diff --git a/doc/source/users_guide/working-with-documentation/building-docs-multiple-versions.rst b/doc/source/users_guide/working-with-documentation/building-docs-multiple-versions.rst index 96803d8254..dc53a2bc78 100644 --- a/doc/source/users_guide/working-with-documentation/building-docs-multiple-versions.rst +++ b/doc/source/users_guide/working-with-documentation/building-docs-multiple-versions.rst @@ -10,7 +10,7 @@ Note that this is not necessary in order for you to contribute an update to the .. literalinclude:: ../../../testing.sh :start-at: ./build_docs_to_publish :end-before: VERSION LINKS WILL NOT RESOLVE - :append: open _publish/index.html + :append: CMD _publish/index.html # where CMD is open for Mac or wslview for Windows (Ubuntu VM) **Note:** This is not yet supported with Podman on Linux (including Ubuntu VM on Windows). 
See `doc-builder Issue #27: build_docs_to_publish fails on Linux (maybe just Ubuntu?) with Podman `_. It does work with Docker on Linux, though. diff --git a/doc/source/users_guide/working-with-documentation/building-docs-original-wiki.md b/doc/source/users_guide/working-with-documentation/building-docs-original-wiki.md index 251622b6f0..63acab53a7 100644 --- a/doc/source/users_guide/working-with-documentation/building-docs-original-wiki.md +++ b/doc/source/users_guide/working-with-documentation/building-docs-original-wiki.md @@ -2,7 +2,7 @@ # ⚠️ Original docs documentation from the GitHub Wiki -.. todo:: +.. warning:: ⚠️⚠️⚠️WARNING⚠️⚠️⚠️ The linked page contains documentation that (a) is more complicated than you probably require and (b) has not been fully checked for accuracy with the latest documentation setup. Unless you have a very good reason, you should probably go to :ref:`docs-intro-and-recommended`. diff --git a/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md b/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md index aa4b537e58..ceb701b5cf 100644 --- a/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md +++ b/doc/source/users_guide/working-with-documentation/building-docs-prereqs-windows.md @@ -123,6 +123,8 @@ If that's not feasible or doesn't solve the problem, you may need to remind Linu chown -R $USER:$USER $HOME ``` +If that also gives a permission error, you may need to put `sudo` at the start of the command. + ### "The host 'wsl$' was not found in the list of allowed hosts" You may see this warning in a dialog box after trying to open a file with `wslview`, `explorer.exe`, or something else. Check "Permanently allow host 'wsl$'" and then press "Allow". From 761bb47b2a717c4086a3300201cd4a8a342bcfab Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Wed, 18 Jun 2025 11:35:37 -0600 Subject: [PATCH 48/97] Add subset_data Python system test for --create-landuse. 
--- python/ctsm/test/test_sys_subset_data.py | 59 +++++++++++++++++++ ...MP_amazon_hist_1850-1853_78pfts_c250618.nc | 3 + ...ata_TMP_amazon_hist_1850_78pfts_c250618.nc | 3 + ...test_subset_data_pt_landuse_amazon_type360 | 1 + ...x5_amazon_hist_1850-1853_78pfts_c250617.nc | 3 + .../testinputs/subset_data_amazon_1850.cfg | 14 +++++ ...ata_5x5_amazon_hist_1850_78pfts_c250617.nc | 3 + 7 files changed, 86 insertions(+) create mode 100644 python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_landuse_amazon_type180/landuse.timeseries_TMP_amazon_hist_1850-1853_78pfts_c250618.nc create mode 100644 python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_landuse_amazon_type180/surfdata_TMP_amazon_hist_1850_78pfts_c250618.nc create mode 120000 python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_landuse_amazon_type360 create mode 100644 python/ctsm/test/testinputs/landuse.timeseries_5x5_amazon_hist_1850-1853_78pfts_c250617.nc create mode 100644 python/ctsm/test/testinputs/subset_data_amazon_1850.cfg create mode 100644 python/ctsm/test/testinputs/surfdata_5x5_amazon_hist_1850_78pfts_c250617.nc diff --git a/python/ctsm/test/test_sys_subset_data.py b/python/ctsm/test/test_sys_subset_data.py index 501a7869fd..42377f5b00 100644 --- a/python/ctsm/test/test_sys_subset_data.py +++ b/python/ctsm/test/test_sys_subset_data.py @@ -242,6 +242,65 @@ def test_subset_data_pt_surface_amazon_type180(self): """ self._do_test_subset_data_pt_surface(-69) + def _do_test_subset_data_pt_landuse(self, lon): + """ + Given a longitude, test subset_data point --create-landuse + """ + cfg_file = os.path.join( + self.inputdata_dir, + "ctsm", + "test", + "testinputs", + "subset_data_amazon_1850.cfg", + ) + print(cfg_file) + sys.argv = [ + "subset_data", + "point", + "--lat", + "-12", + "--lon", + str(lon), + "--site", + "TMP", + "--create-domain", + "--create-surface", + "--surf-year", + "1850", + "--create-landuse", + "--create-user-mods", + "--outdir", + 
self.temp_dir_out.name, + "--user-mods-dir", + self.temp_dir_umd.name, + "--inputdata-dir", + self.inputdata_dir, + "--cfg-file", + cfg_file, + "--overwrite", + ] + subset_data.main() + + # Loop through all the output files, making sure they match what we expect. + daystr = "[0-9][0-9][0-9][0-9][0-9][0-9]" # 6-digit day code, yymmdd + expected_output_files = [ + f"surfdata_TMP_amazon_hist_1850_78pfts_c{daystr}.nc", + f"landuse.timeseries_TMP_amazon_hist_1850-1853_78pfts_c{daystr}.nc", + ] + self.assertTrue(self._check_result_file_matches_expected(expected_output_files, 2)) + + def test_subset_data_pt_landuse_amazon_type360(self): + """ + Test subset_data --create-landuse for Amazon point with longitude type 360 + """ + self._do_test_subset_data_pt_landuse(291) + + def test_subset_data_pt_landuse_amazon_type180(self): + """ + Test subset_data --create-landuse for Amazon point with longitude type 180 + """ + self._do_test_subset_data_pt_landuse(-69) + def _do_test_subset_data_pt_datm(self, lon): """ Given a longitude, test subset_data point --create-datm diff --git a/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_landuse_amazon_type180/landuse.timeseries_TMP_amazon_hist_1850-1853_78pfts_c250618.nc b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_landuse_amazon_type180/landuse.timeseries_TMP_amazon_hist_1850-1853_78pfts_c250618.nc new file mode 100644 index 0000000000..d34fdf3acf --- /dev/null +++ b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_landuse_amazon_type180/landuse.timeseries_TMP_amazon_hist_1850-1853_78pfts_c250618.nc @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:b063aeb04ed3a0a613608ecf88ac47efb39de7ba74bf6e33a490925540bf47fb +size 18176 diff --git a/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_landuse_amazon_type180/surfdata_TMP_amazon_hist_1850_78pfts_c250618.nc 
b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_landuse_amazon_type180/surfdata_TMP_amazon_hist_1850_78pfts_c250618.nc new file mode 100644 index 0000000000..02999b6b00 --- /dev/null +++ b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_landuse_amazon_type180/surfdata_TMP_amazon_hist_1850_78pfts_c250618.nc @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:efbf02729f8741bfdfbd51d748cce31c2d90b0c9ef2f00d841d2940dea5bc144 +size 53256 diff --git a/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_landuse_amazon_type360 b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_landuse_amazon_type360 new file mode 120000 index 0000000000..ad4f251586 --- /dev/null +++ b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_landuse_amazon_type360 @@ -0,0 +1 @@ +test_subset_data_pt_landuse_amazon_type180 \ No newline at end of file diff --git a/python/ctsm/test/testinputs/landuse.timeseries_5x5_amazon_hist_1850-1853_78pfts_c250617.nc b/python/ctsm/test/testinputs/landuse.timeseries_5x5_amazon_hist_1850-1853_78pfts_c250617.nc new file mode 100644 index 0000000000..9e81ad351c --- /dev/null +++ b/python/ctsm/test/testinputs/landuse.timeseries_5x5_amazon_hist_1850-1853_78pfts_c250617.nc @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:83b34be6da2047bb9a099346f7f5472b932ead7033fe8ab817540b99ff3117b8 +size 215248 diff --git a/python/ctsm/test/testinputs/subset_data_amazon_1850.cfg b/python/ctsm/test/testinputs/subset_data_amazon_1850.cfg new file mode 100644 index 0000000000..6b16160f48 --- /dev/null +++ b/python/ctsm/test/testinputs/subset_data_amazon_1850.cfg @@ -0,0 +1,14 @@ +[surfdat] +dir = ctsm/test/testinputs +surfdat_16pft = surfdata_5x5_amazon_hist_1850_78pfts_c250617.nc +surfdat_78pft = surfdata_5x5_amazon_hist_1850_78pfts_c250617.nc +mesh_dir = ctsm/test/testinputs +mesh_surf = ESMF_mesh_5x5pt_amazon_from_domain_c230308.nc + 
+[landuse] +dir = ctsm/test/testinputs +landuse_16pft = landuse.timeseries_5x5_amazon_hist_1850-1853_78pfts_c250617.nc +landuse_78pft = landuse.timeseries_5x5_amazon_hist_1850-1853_78pfts_c250617.nc + +[domain] +file = ctsm/test/testinputs/domain.lnd.5x5pt-amazon_navy.090715.nc diff --git a/python/ctsm/test/testinputs/surfdata_5x5_amazon_hist_1850_78pfts_c250617.nc b/python/ctsm/test/testinputs/surfdata_5x5_amazon_hist_1850_78pfts_c250617.nc new file mode 100644 index 0000000000..747c33a2b0 --- /dev/null +++ b/python/ctsm/test/testinputs/surfdata_5x5_amazon_hist_1850_78pfts_c250617.nc @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:f0795d84b3e07a9437c7e9869810b74002210f7c55349f57983c36db9990db4a +size 893512 From 4b186f10f10c25ad4199e482083176948f2fd397 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Wed, 18 Jun 2025 14:49:22 -0600 Subject: [PATCH 49/97] plumber2 scripts: Find csv file from anywhere. --- .../ctsm/site_and_regional/plumber2_shared.py | 25 +++++++++++++++++++ .../plumber2_surf_wrapper.py | 10 ++++++-- .../site_and_regional/plumber2_usermods.py | 10 ++++++-- 3 files changed, 41 insertions(+), 4 deletions(-) create mode 100644 python/ctsm/site_and_regional/plumber2_shared.py diff --git a/python/ctsm/site_and_regional/plumber2_shared.py b/python/ctsm/site_and_regional/plumber2_shared.py new file mode 100644 index 0000000000..70ca2dd800 --- /dev/null +++ b/python/ctsm/site_and_regional/plumber2_shared.py @@ -0,0 +1,25 @@ +""" +Things shared between plumber2 scripts +""" + +import os +import pandas as pd + +PLUMBER2_SITES_CSV = os.path.realpath( + os.path.join( + os.path.dirname(__file__), + os.pardir, + os.pardir, + os.pardir, + "tools", + "site_and_regional", + "PLUMBER2_sites.csv", + ) +) + + +def read_plumber2_sites_csv(): + """ + Read PLUMBER2_sites.csv using pandas + """ + return pd.read_csv(PLUMBER2_SITES_CSV, skiprows=4) diff --git a/python/ctsm/site_and_regional/plumber2_surf_wrapper.py 
b/python/ctsm/site_and_regional/plumber2_surf_wrapper.py index 022914d17e..b10068def8 100755 --- a/python/ctsm/site_and_regional/plumber2_surf_wrapper.py +++ b/python/ctsm/site_and_regional/plumber2_surf_wrapper.py @@ -23,10 +23,16 @@ import argparse import logging import os +import sys import subprocess import tqdm -import pandas as pd +# Get the ctsm tools +_CTSM_PYTHON = os.path.abspath(os.path.join(os.path.dirname(__file__), "..", "..", "python")) +sys.path.insert(1, _CTSM_PYTHON) + +# pylint:disable=wrong-import-position +from ctsm.site_and_regional.plumber2_shared import read_plumber2_sites_csv def get_parser(): @@ -90,7 +96,7 @@ def main(): if args.verbose: logging.basicConfig(level=logging.DEBUG) - plumber2_sites = pd.read_csv("PLUMBER2_sites.csv", skiprows=4) + plumber2_sites = read_plumber2_sites_csv() for _, row in tqdm.tqdm(plumber2_sites.iterrows()): lat = row["Lat"] diff --git a/python/ctsm/site_and_regional/plumber2_usermods.py b/python/ctsm/site_and_regional/plumber2_usermods.py index 7b7f294a24..6fcd4a6224 100644 --- a/python/ctsm/site_and_regional/plumber2_usermods.py +++ b/python/ctsm/site_and_regional/plumber2_usermods.py @@ -11,9 +11,15 @@ from __future__ import print_function import os +import sys import tqdm -import pandas as pd +# Get the ctsm tools +_CTSM_PYTHON = os.path.abspath(os.path.join(os.path.dirname(__file__), "..", "..", "python")) +sys.path.insert(1, _CTSM_PYTHON) + +# pylint:disable=wrong-import-position +from ctsm.site_and_regional.plumber2_shared import read_plumber2_sites_csv # Big ugly function to create usermod_dirs for each site @@ -155,7 +161,7 @@ def main(): """ # For now we can just run the 'main' program as a loop - plumber2_sites = pd.read_csv("PLUMBER2_sites.csv", skiprows=4) + plumber2_sites = read_plumber2_sites_csv() for _, row in tqdm.tqdm(plumber2_sites.iterrows()): lat = row["Lat"] From 0bc1cce4e4145f1c5fdfdf205d586ea452ba8fb2 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Wed, 18 Jun 2025 14:57:24 -0600 
Subject: [PATCH 50/97] plumber2_surf_wrapper: Call subset_data directly. --- python/ctsm/site_and_regional/plumber2_surf_wrapper.py | 10 ++++------ 1 file changed, 4 insertions(+), 6 deletions(-) diff --git a/python/ctsm/site_and_regional/plumber2_surf_wrapper.py b/python/ctsm/site_and_regional/plumber2_surf_wrapper.py index b10068def8..d2520a265f 100755 --- a/python/ctsm/site_and_regional/plumber2_surf_wrapper.py +++ b/python/ctsm/site_and_regional/plumber2_surf_wrapper.py @@ -24,7 +24,6 @@ import logging import os import sys -import subprocess import tqdm # Get the ctsm tools @@ -33,6 +32,7 @@ # pylint:disable=wrong-import-position from ctsm.site_and_regional.plumber2_shared import read_plumber2_sites_csv +from ctsm import subset_data def get_parser(): @@ -77,12 +77,10 @@ def execute(command): print("\n", " >> ", *command, "\n") try: - subprocess.check_call(command, stdout=open(os.devnull, "w"), stderr=subprocess.STDOUT) + sys.argv = command + subset_data.main() - except subprocess.CalledProcessError as err: - # raise RuntimeError("command '{}' return with error - # (code {}): {}".format(e.cmd, e.returncode, e.output)) - # print (e.ouput) + except Exception as err: # pylint: disable=broad-exception-caught print(err) From 55b97e6838cd96f405b1c19029208b8e7932450e Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Wed, 18 Jun 2025 14:57:44 -0600 Subject: [PATCH 51/97] plumber2_surf_wrapper: Specify --lon-type 180. 
--- python/ctsm/site_and_regional/plumber2_shared.py | 2 +- python/ctsm/site_and_regional/plumber2_surf_wrapper.py | 4 ++++ tools/site_and_regional/PLUMBER2_sites.csv | 1 + 3 files changed, 6 insertions(+), 1 deletion(-) diff --git a/python/ctsm/site_and_regional/plumber2_shared.py b/python/ctsm/site_and_regional/plumber2_shared.py index 70ca2dd800..d2232d0b2d 100644 --- a/python/ctsm/site_and_regional/plumber2_shared.py +++ b/python/ctsm/site_and_regional/plumber2_shared.py @@ -22,4 +22,4 @@ def read_plumber2_sites_csv(): """ Read PLUMBER2_sites.csv using pandas """ - return pd.read_csv(PLUMBER2_SITES_CSV, skiprows=4) + return pd.read_csv(PLUMBER2_SITES_CSV, skiprows=5) diff --git a/python/ctsm/site_and_regional/plumber2_surf_wrapper.py b/python/ctsm/site_and_regional/plumber2_surf_wrapper.py index d2520a265f..6b04b70a66 100755 --- a/python/ctsm/site_and_regional/plumber2_surf_wrapper.py +++ b/python/ctsm/site_and_regional/plumber2_surf_wrapper.py @@ -152,6 +152,8 @@ def main(): "--cap-saturation", "--verbose", "--overwrite", + "--lon-type", + "180", ] else: # use surface dataset with 78 pfts, and overwrite to 100% 1 dominant PFT @@ -179,6 +181,8 @@ def main(): "--cap-saturation", "--verbose", "--overwrite", + "--lon-type", + "180", ] execute(subset_command) diff --git a/tools/site_and_regional/PLUMBER2_sites.csv b/tools/site_and_regional/PLUMBER2_sites.csv index f252fa1d61..8ace57ad7c 100644 --- a/tools/site_and_regional/PLUMBER2_sites.csv +++ b/tools/site_and_regional/PLUMBER2_sites.csv @@ -2,6 +2,7 @@ #start_year and end_year will be used to define DATM_YR_ALIGH, DATM_YR_START and DATM_YR_END, and STOP_N in units of nyears. #RUN_STARTDATE and START_TOD are specified because we are starting at GMT corresponding to local midnight. #ATM_NCPL is specified so that the time step of the model matches the time interval specified by the atm forcing data. 
+#longitudes must be in the range [-180,180] ,Site,Lat,Lon,pft1,pft1-%,pft1-cth,pft1-cbh,pft2,pft2-%,pft2-cth,pft2-cbh,start_year,end_year,RUN_STARTDATE,START_TOD,ATM_NCPL 1,AR-SLu,-33.464802,-66.459808,5,50.00, 4.50, 0.13,7,50.00, 4.50, 2.59,2010,2010,2010-01-01,10800,48 2,AT-Neu,47.116669,11.317500,13,100.00, 0.50, 0.01,-999,-999.00,-999.00,-999.00,2002,2012,2001-12-31,82800,48 From 44d461d6b13751feb7e724ac0f8c7b177d362b41 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Wed, 18 Jun 2025 15:03:05 -0600 Subject: [PATCH 52/97] plumber2_surf_wrapper: Respect --verbose. --- python/ctsm/site_and_regional/plumber2_surf_wrapper.py | 7 ++++--- 1 file changed, 4 insertions(+), 3 deletions(-) diff --git a/python/ctsm/site_and_regional/plumber2_surf_wrapper.py b/python/ctsm/site_and_regional/plumber2_surf_wrapper.py index 6b04b70a66..c494654334 100755 --- a/python/ctsm/site_and_regional/plumber2_surf_wrapper.py +++ b/python/ctsm/site_and_regional/plumber2_surf_wrapper.py @@ -51,7 +51,6 @@ def get_parser(): help="Verbose mode will print more information. ", action="store_true", dest="verbose", - default=False, ) parser.add_argument( @@ -150,7 +149,6 @@ def main(): "--create-surface", "--uniform-snowpack", "--cap-saturation", - "--verbose", "--overwrite", "--lon-type", "180", @@ -179,11 +177,14 @@ def main(): "--create-surface", "--uniform-snowpack", "--cap-saturation", - "--verbose", "--overwrite", "--lon-type", "180", ] + + if args.verbose: + subset_command += ["--verbose"] + execute(subset_command) From 6d96107b1a768abdb435d87739515676d6c24962 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Wed, 18 Jun 2025 15:04:34 -0600 Subject: [PATCH 53/97] plumber2_surf_wrapper: Stop on errors. 
--- python/ctsm/site_and_regional/plumber2_surf_wrapper.py | 8 ++------ 1 file changed, 2 insertions(+), 6 deletions(-) diff --git a/python/ctsm/site_and_regional/plumber2_surf_wrapper.py b/python/ctsm/site_and_regional/plumber2_surf_wrapper.py index c494654334..9352eedb60 100755 --- a/python/ctsm/site_and_regional/plumber2_surf_wrapper.py +++ b/python/ctsm/site_and_regional/plumber2_surf_wrapper.py @@ -75,12 +75,8 @@ def execute(command): """ print("\n", " >> ", *command, "\n") - try: - sys.argv = command - subset_data.main() - - except Exception as err: # pylint: disable=broad-exception-caught - print(err) + sys.argv = command + subset_data.main() def main(): From 8781571ba1e33902edf424598155f7cbe0044e38 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Wed, 18 Jun 2025 15:10:25 -0600 Subject: [PATCH 54/97] plumber2_surf_wrapper: Add optional --plumber2-sites-csv argument. --- python/ctsm/site_and_regional/plumber2_shared.py | 4 ++-- python/ctsm/site_and_regional/plumber2_surf_wrapper.py | 10 ++++++++-- 2 files changed, 10 insertions(+), 4 deletions(-) diff --git a/python/ctsm/site_and_regional/plumber2_shared.py b/python/ctsm/site_and_regional/plumber2_shared.py index d2232d0b2d..491b35e7d2 100644 --- a/python/ctsm/site_and_regional/plumber2_shared.py +++ b/python/ctsm/site_and_regional/plumber2_shared.py @@ -18,8 +18,8 @@ ) -def read_plumber2_sites_csv(): +def read_plumber2_sites_csv(file=PLUMBER2_SITES_CSV): """ Read PLUMBER2_sites.csv using pandas """ - return pd.read_csv(PLUMBER2_SITES_CSV, skiprows=5) + return pd.read_csv(file, skiprows=5) diff --git a/python/ctsm/site_and_regional/plumber2_surf_wrapper.py b/python/ctsm/site_and_regional/plumber2_surf_wrapper.py index 9352eedb60..c0c287b356 100755 --- a/python/ctsm/site_and_regional/plumber2_surf_wrapper.py +++ b/python/ctsm/site_and_regional/plumber2_surf_wrapper.py @@ -31,7 +31,7 @@ sys.path.insert(1, _CTSM_PYTHON) # pylint:disable=wrong-import-position -from ctsm.site_and_regional.plumber2_shared import 
read_plumber2_sites_csv +from ctsm.site_and_regional.plumber2_shared import PLUMBER2_SITES_CSV, read_plumber2_sites_csv from ctsm import subset_data @@ -61,6 +61,12 @@ def get_parser(): default=True, ) + parser.add_argument( + "--plumber2-sites-csv", + help=f"Comma-separated value (CSV) file with Plumber2 sites. Default: {PLUMBER2_SITES_CSV}", + default=PLUMBER2_SITES_CSV, + ) + return parser @@ -89,7 +95,7 @@ def main(): if args.verbose: logging.basicConfig(level=logging.DEBUG) - plumber2_sites = read_plumber2_sites_csv() + plumber2_sites = read_plumber2_sites_csv(args.plumber2_sites_csv) for _, row in tqdm.tqdm(plumber2_sites.iterrows()): lat = row["Lat"] From 6943fe4055a9f021f8059f0f6ca7d2fcb6eb470d Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Wed, 18 Jun 2025 16:04:17 -0600 Subject: [PATCH 55/97] plumber2_surf_wrapper: Fix handling of one-PFT sites. Resolves ESCOMP/CTSM#3262. --- .../plumber2_surf_wrapper.py | 134 +++++++++--------- .../site_and_regional/single_point_case.py | 3 +- tools/site_and_regional/PLUMBER2_sites.csv | 10 +- 3 files changed, 77 insertions(+), 70 deletions(-) diff --git a/python/ctsm/site_and_regional/plumber2_surf_wrapper.py b/python/ctsm/site_and_regional/plumber2_surf_wrapper.py index c0c287b356..c97f6772ab 100755 --- a/python/ctsm/site_and_regional/plumber2_surf_wrapper.py +++ b/python/ctsm/site_and_regional/plumber2_surf_wrapper.py @@ -85,6 +85,13 @@ def execute(command): subset_data.main() +def is_valid_pft(pft_num): + """ + Given a number, check whether it represents a valid PFT + """ + return pft_num >= 1 + + def main(): """ Read plumber2_sites from csv, iterate through sites, and add dominant PFT @@ -101,89 +108,88 @@ def main(): lat = row["Lat"] lon = row["Lon"] site = row["Site"] + + clmsite = "1x1_PLUMBER2_" + site + print("Now processing site :", site) + + # Set up part of subset_data command that is shared among all options + subset_command = [ + "./subset_data", + "point", + "--lat", + str(lat), + "--lon", + str(lon), + 
"--site", + clmsite, + "--create-surface", + "--uniform-snowpack", + "--cap-saturation", + "--overwrite", + "--lon-type", + "180", + ] + + # Read info for first PFT pft1 = row["pft1"] + if not is_valid_pft(pft1): + raise RuntimeError(f"pft1 must be a valid PFT; got {pft1}") pctpft1 = row["pft1-%"] cth1 = row["pft1-cth"] cbh1 = row["pft1-cbh"] - pft2 = row["pft2"] - pctpft2 = row["pft2-%"] - cth2 = row["pft2-cth"] - cbh2 = row["pft2-cbh"] - # overwrite missing values from .csv file - if pft1 == -999: - pft1 = 0 - pctpft1 = 0 - cth1 = 0 - cbh1 = 0 - if pft2 == -999: - pft2 = 0 - pctpft2 = 0 - cth2 = 0 - cbh2 = 0 - clmsite = "1x1_PLUMBER2_" + site - print("Now processing site :", site) - if args.pft_16: - # use surface dataset with 16 pfts, but overwrite to 100% 1 dominant PFT - # don't set crop flag - # set dominant pft - subset_command = [ - "./subset_data", - "point", - "--lat", - str(lat), - "--lon", - str(lon), - "--site", - clmsite, + # Read info for second PFT, if a valid one is given in the .csv file + pft2 = row["pft2"] + if is_valid_pft(pft2): + pctpft2 = row["pft2-%"] + cth2 = row["pft2-cth"] + cbh2 = row["pft2-cbh"] + + # Set dominant PFT(s) + if is_valid_pft(pft2): + subset_command += [ "--dompft", str(pft1), str(pft2), "--pctpft", str(pctpft1), str(pctpft2), - "--cth", - str(cth1), - str(cth2), - "--cbh", - str(cbh1), - str(cbh2), - "--create-surface", - "--uniform-snowpack", - "--cap-saturation", - "--overwrite", - "--lon-type", - "180", ] else: - # use surface dataset with 78 pfts, and overwrite to 100% 1 dominant PFT - # NOTE: FATES will currently not run with a 78-PFT surface dataset - # set crop flag - # set dominant pft - subset_command = [ - "./subset_data", - "point", - "--lat", - str(lat), - "--lon", - str(lon), - "--site", - clmsite, - "--crop", + subset_command += [ "--dompft", str(pft1), - str(pft2), "--pctpft", str(pctpft1), - str(pctpft2), - "--create-surface", - "--uniform-snowpack", - "--cap-saturation", - "--overwrite", - "--lon-type", - 
"180", ] + if args.pft_16: + # use surface dataset with 16 pfts, but overwrite to 100% 1 dominant PFT + # don't set crop flag + # set canopy top and bottom heights + if is_valid_pft(pft2): + subset_command += [ + "--cth", + str(cth1), + str(cth2), + "--cbh", + str(cbh1), + str(cbh2), + ] + else: + subset_command += [ + "--cth", + str(cth1), + "--cbh", + str(cbh1), + ] + else: + # use surface dataset with 78 pfts, and overwrite to 100% 1 dominant PFT + # NOTE: FATES will currently not run with a 78-PFT surface dataset + # set crop flag + subset_command += ["--crop"] + # don't set canopy top and bottom heights + if args.verbose: subset_command += ["--verbose"] diff --git a/python/ctsm/site_and_regional/single_point_case.py b/python/ctsm/site_and_regional/single_point_case.py index ed8b4b5562..55813a423a 100644 --- a/python/ctsm/site_and_regional/single_point_case.py +++ b/python/ctsm/site_and_regional/single_point_case.py @@ -237,7 +237,8 @@ def check_dom_pft(self): if min_dom_pft < NAT_PFT <= max_dom_pft: err_msg = ( "You are subsetting using mixed land units that have both " - "natural pfts and crop cfts. Check your surface dataset. " + "natural pfts and crop cfts. 
Check your surface dataset.\n" + f"{min_dom_pft} < {NAT_PFT} <= {max_dom_pft}\n" ) raise argparse.ArgumentTypeError(err_msg) diff --git a/tools/site_and_regional/PLUMBER2_sites.csv b/tools/site_and_regional/PLUMBER2_sites.csv index 8ace57ad7c..1097568051 100644 --- a/tools/site_and_regional/PLUMBER2_sites.csv +++ b/tools/site_and_regional/PLUMBER2_sites.csv @@ -74,7 +74,7 @@ 68,DK-Sor,55.485870,11.644640,7,100.00,25.00,14.37,-999,-999.00,-999.00,-999.00,1997,2014,1996-12-31,82800,48 69,DK-ZaH,74.473282,-20.550293,12,100.00, 0.47, 0.01,-999,-999.00,-999.00,-999.00,2000,2013,2000-01-01,0,48 70,ES-ES1,39.345970,-0.318817,1,100.00, 7.50, 3.75,-999,-999.00,-999.00,-999.00,1999,2006,1998-12-31,82800,48 -71,ES-ES2,39.275558,-0.315277,-999,-999.00,-999.00,-999.00,16,100.00, 0.50, 0.01,2005,2006,2004-12-31,82800,48 +71,ES-ES2,39.275558,-0.315277,16,100.00, 0.50, 0.01,-999,-999.00,-999.00,-999.00,2005,2006,2004-12-31,82800,48 72,ES-LgS,37.097935,-2.965820,10,30.00, 0.20, 0.04,13,70.00, 0.50, 0.01,2007,2007,2006-12-31,82800,48 73,ES-LMa,39.941502,-5.773346,7,30.00, 8.00, 4.60,14,70.00, 0.50, 0.01,2004,2006,2003-12-31,82800,48 74,ES-VDA,42.152180, 1.448500,7,30.00, 0.50, 0.29,13,70.00, 0.50, 0.01,2004,2004,2003-12-31,82800,48 @@ -95,7 +95,7 @@ 89,IE-Ca1,52.858791,-6.918152,15,100.00, 0.50, 0.01,-999,-999.00,-999.00,-999.00,2004,2006,2004-01-01,0,48 90,IE-Dri,51.986691,-8.751801,13,100.00, 0.50, 0.01,-999,-999.00,-999.00,-999.00,2003,2005,2003-01-01,0,48 91,IT-Amp,41.904099,13.605160,13,100.00, 0.50, 0.01,-999,-999.00,-999.00,-999.00,2003,2006,2002-12-31,82800,48 -92,IT-BCi,40.523800,14.957440,-999,-999.00,-999.00,-999.00,16,100.00, 0.50, 0.01,2005,2010,2004-12-31,82800,48 +92,IT-BCi,40.523800,14.957440,16,100.00, 0.50, 0.01,-999,-999.00,-999.00,-999.00,2005,2010,2004-12-31,82800,48 93,IT-CA1,42.380409,12.026560,7,100.00, 5.50, 3.16,-999,-999.00,-999.00,-999.00,2012,2013,2011-12-31,82800,48 94,IT-CA2,42.377220,12.026040,15,100.00, 0.50, 
0.01,-999,-999.00,-999.00,-999.00,2012,2013,2011-12-31,82800,48 95,IT-CA3,42.380001,12.022200,7,100.00, 3.50, 2.01,-999,-999.00,-999.00,-999.00,2012,2013,2011-12-31,82800,48 @@ -152,8 +152,8 @@ 146,US-MMS,39.323200,-86.413086,7,100.00,27.00,15.52,-999,-999.00,-999.00,-999.00,1999,2014,1999-01-01,18000,24 147,US-MOz,38.744110,-92.200012,7,100.00,24.00,13.80,-999,-999.00,-999.00,-999.00,2005,2006,2005-01-01,21600,48 148,US-Myb,38.049801,-121.765106,13,100.00, 0.50, 0.01,-999,-999.00,-999.00,-999.00,2011,2014,2011-01-01,28800,48 -149,US-Ne1,41.165100,-96.476593,-999,-999.00,-999.00,-999.00,16,100.00, 0.50, 0.01,2002,2012,2002-01-01,21600,24 -150,US-Ne2,41.164902,-96.470093,-999,-999.00,-999.00,-999.00,16,100.00, 0.50, 0.01,2002,2012,2002-01-01,21600,24 +149,US-Ne1,41.165100,-96.476593,16,100.00, 0.50, 0.01,-999,-999.00,-999.00,-999.00,2002,2012,2002-01-01,21600,24 +150,US-Ne2,41.164902,-96.470093,16,100.00, 0.50, 0.01,-999,-999.00,-999.00,-999.00,2002,2012,2002-01-01,21600,24 151,US-Ne3,41.179699,-96.439697,15,100.00, 0.50, 0.01,-999,-999.00,-999.00,-999.00,2002,2012,2002-01-01,21600,24 152,US-NR1,40.032902,-105.546402,1,100.00,12.00, 6.00,-999,-999.00,-999.00,-999.00,1999,2014,1999-01-01,25200,48 153,US-PFa,45.945900,-90.272308,1, 8.18,30.00,15.00,7,91.82,30.00,17.25,1995,2014,1995-01-01,21600,24 @@ -166,7 +166,7 @@ 160,US-Syv,46.242001,-89.347717,1, 4.91,27.00,13.50,7,95.09,27.00,15.53,2002,2008,2002-01-01,21600,48 161,US-Ton,38.431599,-120.966003,7,70.00, 7.10, 4.08,14,30.00, 0.50, 0.01,2001,2014,2001-01-01,28800,48 162,US-Tw4,38.103001,-121.641403,13,100.00, 0.50, 0.01,-999,-999.00,-999.00,-999.00,2014,2014,2014-01-01,28800,48 -163,US-Twt,38.108700,-121.653107,-999,-999.00,-999.00,-999.00,16,100.00, 0.50, 0.01,2010,2014,2010-01-01,28800,48 +163,US-Twt,38.108700,-121.653107,16,100.00, 0.50, 0.01,-999,-999.00,-999.00,-999.00,2010,2014,2010-01-01,28800,48 164,US-UMB,45.559799,-84.713806,7,100.00,20.00,11.50,-999,-999.00,-999.00,-999.00,2000,2014,2000-01-01,18000,24 
165,US-Var,38.413300,-120.950729,14,100.00, 0.50, 0.01,-999,-999.00,-999.00,-999.00,2001,2014,2001-01-01,28800,48 166,US-WCr,45.805901,-90.079895,7,100.00,24.00,13.80,-999,-999.00,-999.00,-999.00,1999,2006,1999-01-01,21600,48 From d2c711825570bf2c8fcb546b6684badbab2b8f13 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Wed, 18 Jun 2025 16:43:07 -0600 Subject: [PATCH 56/97] Add Python system test for plumber2_surf_wrapper. --- .../test/test_sys_plumber2_surf_wrapper.py | 70 +++++++++++++++++++ 1 file changed, 70 insertions(+) create mode 100755 python/ctsm/test/test_sys_plumber2_surf_wrapper.py diff --git a/python/ctsm/test/test_sys_plumber2_surf_wrapper.py b/python/ctsm/test/test_sys_plumber2_surf_wrapper.py new file mode 100755 index 0000000000..4bd2a6763a --- /dev/null +++ b/python/ctsm/test/test_sys_plumber2_surf_wrapper.py @@ -0,0 +1,70 @@ +#!/usr/bin/env python3 + +"""System tests for plumber2_surf_wrapper""" + +import glob +import os +import unittest +import tempfile +import shutil +import sys + +from ctsm import unit_testing +from ctsm.site_and_regional.plumber2_surf_wrapper import main +from ctsm.site_and_regional.plumber2_shared import read_plumber2_sites_csv +from ctsm.path_utils import path_to_ctsm_root + +# Allow test names that pylint doesn't like; otherwise hard to make them +# readable +# pylint: disable=invalid-name + + +class TestSysPlumber2SurfWrapper(unittest.TestCase): + """ + System tests for plumber2_surf_wrapper + """ + + def setUp(self): + """ + Make tempdir for use by these tests. 
+ """ + self._previous_dir = os.getcwd() + self._tempdir = tempfile.mkdtemp() + os.chdir(self._tempdir) # cd to tempdir + + def tearDown(self): + """ + Remove temporary directory + """ + os.chdir(self._previous_dir) + shutil.rmtree(self._tempdir, ignore_errors=True) + + def test_plumber2_surf_wrapper(self): + """ + Run the entire tool + """ + + tool_path = os.path.join( + path_to_ctsm_root(), + "tools", + "site_and_regional", + "plumber2_surf_wrapper", + ) + sys.argv = [tool_path] + main() + + # How many files do we expect? + plumber2_csv = read_plumber2_sites_csv() + n_files_expected = len(plumber2_csv) + + # How many files did we get? + file_list = os.listdir("subset_data_single_point") + n_files = len(file_list) + + # Check + self.assertEqual(n_files_expected, n_files) + + +if __name__ == "__main__": + unit_testing.setup_for_tests() + unittest.main() From 132d6f8c452255bb31c987c97dde7ef235f50ef2 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Thu, 19 Jun 2025 15:28:58 -0600 Subject: [PATCH 57/97] Remove an unused import. --- python/ctsm/test/test_sys_plumber2_surf_wrapper.py | 1 - 1 file changed, 1 deletion(-) diff --git a/python/ctsm/test/test_sys_plumber2_surf_wrapper.py b/python/ctsm/test/test_sys_plumber2_surf_wrapper.py index 4bd2a6763a..a35af29ad2 100755 --- a/python/ctsm/test/test_sys_plumber2_surf_wrapper.py +++ b/python/ctsm/test/test_sys_plumber2_surf_wrapper.py @@ -2,7 +2,6 @@ """System tests for plumber2_surf_wrapper""" -import glob import os import unittest import tempfile From 194ce4397797fa15d937b09925a2c1007121155e Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Thu, 19 Jun 2025 15:46:04 -0600 Subject: [PATCH 58/97] Test plumber2_surf_wrapper invalid-PFT error. 
--- .../test/test_sys_plumber2_surf_wrapper.py | 34 +++++++++++++++---- .../PLUMBER2_sites_invalid_pft.csv | 8 +++++ 2 files changed, 35 insertions(+), 7 deletions(-) create mode 100644 python/ctsm/test/testinputs/plumber2_surf_wrapper/PLUMBER2_sites_invalid_pft.csv diff --git a/python/ctsm/test/test_sys_plumber2_surf_wrapper.py b/python/ctsm/test/test_sys_plumber2_surf_wrapper.py index a35af29ad2..4cb34aa89a 100755 --- a/python/ctsm/test/test_sys_plumber2_surf_wrapper.py +++ b/python/ctsm/test/test_sys_plumber2_surf_wrapper.py @@ -31,6 +31,19 @@ def setUp(self): self._tempdir = tempfile.mkdtemp() os.chdir(self._tempdir) # cd to tempdir + # Path to script + self.tool_path = os.path.join( + path_to_ctsm_root(), + "tools", + "site_and_regional", + "plumber2_surf_wrapper", + ) + + # Path to test inputs directory + self.test_inputs = os.path.join( + os.path.dirname(__file__), "testinputs", "plumber2_surf_wrapper" + ) + def tearDown(self): """ Remove temporary directory @@ -43,13 +56,7 @@ def test_plumber2_surf_wrapper(self): Run the entire tool """ - tool_path = os.path.join( - path_to_ctsm_root(), - "tools", - "site_and_regional", - "plumber2_surf_wrapper", - ) - sys.argv = [tool_path] + sys.argv = [self.tool_path] main() # How many files do we expect? 
@@ -63,6 +70,19 @@ def test_plumber2_surf_wrapper(self): # Check self.assertEqual(n_files_expected, n_files) + def test_plumber2_surf_wrapper_invalid_pft(self): + """ + plumber2_surf_wrapper should error if invalid PFT is given + """ + + sys.argv = [ + self.tool_path, + "--plumber2-sites-csv", + os.path.join(self.test_inputs, "PLUMBER2_sites_invalid_pft.csv"), + ] + with self.assertRaisesRegex(RuntimeError, "must be a valid PFT"): + main() + if __name__ == "__main__": unit_testing.setup_for_tests() diff --git a/python/ctsm/test/testinputs/plumber2_surf_wrapper/PLUMBER2_sites_invalid_pft.csv b/python/ctsm/test/testinputs/plumber2_surf_wrapper/PLUMBER2_sites_invalid_pft.csv new file mode 100644 index 0000000000..2d4b7dcb57 --- /dev/null +++ b/python/ctsm/test/testinputs/plumber2_surf_wrapper/PLUMBER2_sites_invalid_pft.csv @@ -0,0 +1,8 @@ +#pftX-cth and pftX-cbh are the site-specific canopy top and bottom heights +#start_year and end_year will be used to define DATM_YR_ALIGN, DATM_YR_START and DATM_YR_END, and STOP_N in units of nyears. +#RUN_STARTDATE and START_TOD are specified because we are starting at GMT corresponding to local midnight. +#ATM_NCPL is specified so that the time step of the model matches the time interval specified by the atm forcing data. +#longitudes must be in the range [-180,180] +,Site,Lat,Lon,pft1,pft1-%,pft1-cth,pft1-cbh,pft2,pft2-%,pft2-cth,pft2-cbh,start_year,end_year,RUN_STARTDATE,START_TOD,ATM_NCPL +26,Invalid-Pft,51.309166, 4.520560,0,19.22,21.00,10.50,7,80.78,21.00,12.08,2004,2014,2003-12-31,82800,48 +27,BE-Lon,50.551590, 4.746130,15,100.00, 0.50, 0.01,-999,-999.00,-999.00,-999.00,2005,2014,2004-12-31,82800,48 From b9ed339e14e70a98e2efa80d2a2c0fa5257125a3 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Thu, 19 Jun 2025 16:10:54 -0600 Subject: [PATCH 59/97] Replace plumber2_surf_wrapper args test with useful ones. Failing.
--- .../plumber2_surf_wrapper.py | 8 +-- .../test/test_unit_plumber2_surf_wrapper.py | 56 +++++++++++++++++-- 2 files changed, 56 insertions(+), 8 deletions(-) diff --git a/python/ctsm/site_and_regional/plumber2_surf_wrapper.py b/python/ctsm/site_and_regional/plumber2_surf_wrapper.py index c97f6772ab..46cdacf3ef 100755 --- a/python/ctsm/site_and_regional/plumber2_surf_wrapper.py +++ b/python/ctsm/site_and_regional/plumber2_surf_wrapper.py @@ -35,9 +35,9 @@ from ctsm import subset_data -def get_parser(): +def get_args(): """ - Get parser object for this script. + Get arguments for this script. """ parser = argparse.ArgumentParser( description=__doc__, formatter_class=argparse.RawDescriptionHelpFormatter @@ -67,7 +67,7 @@ def get_parser(): default=PLUMBER2_SITES_CSV, ) - return parser + return parser.parse_args() def execute(command): @@ -97,7 +97,7 @@ def main(): Read plumber2_sites from csv, iterate through sites, and add dominant PFT """ - args = get_parser().parse_args() + args = get_args() if args.verbose: logging.basicConfig(level=logging.DEBUG) diff --git a/python/ctsm/test/test_unit_plumber2_surf_wrapper.py b/python/ctsm/test/test_unit_plumber2_surf_wrapper.py index 66f5578caa..4e23fdd209 100755 --- a/python/ctsm/test/test_unit_plumber2_surf_wrapper.py +++ b/python/ctsm/test/test_unit_plumber2_surf_wrapper.py @@ -16,7 +16,7 @@ # pylint: disable=wrong-import-position from ctsm import unit_testing -from ctsm.site_and_regional.plumber2_surf_wrapper import get_parser +from ctsm.site_and_regional.plumber2_surf_wrapper import get_args # pylint: disable=invalid-name @@ -26,12 +26,60 @@ class TestPlumber2SurfWrapper(unittest.TestCase): Basic class for testing plumber2_surf_wrapper.py. 
""" - def test_parser(self): + def setUp(self): + sys.argv = ["subset_data"] # Could actually be anything + + def test_parser_default_csv_exists(self): + """ + Test that default PLUMBER2 sites CSV file exists + """ + + args = get_args() + self.assertTrue(os.path.exists(args.plumber2_sites_csv)) + + def test_parser_custom_csv(self): + """ + Test that script accepts custom CSV file path + """ + + custom_path = "path/to/custom.csv" + sys.argv += ["--plumber2-sites-csv", custom_path] + args = get_args() + self.assertEqual(args.plumber2_sites_csv, custom_path) + + def test_parser_verbose_false_default(self): + """ + Test that script is not verbose by default + """ + + args = get_args() + self.assertFalse(args.verbose) + + def test_parser_verbose_true(self): + """ + Test that --verbose sets verbose to True + """ + + sys.argv += ["--verbose"] + args = get_args() + self.assertTrue(args.verbose) + + def test_parser_16pft_false_default(self): + """ + Test that script does not use 16pft mode by default + """ + + args = get_args() + self.assertFalse(args.pft_16) + + def test_parser_16pft_true(self): """ - Test that parser has same defaults as expected + Test that --16pft sets pft_16 to True """ - self.assertEqual(get_parser().argument_default, None, "Parser not working as expected") + sys.argv += ["--16pft"] + args = get_args() + self.assertTrue(args.pft_16) if __name__ == "__main__": From 3019c4eba0969dd7880de32e625a774cbf7fdfef Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Thu, 19 Jun 2025 16:12:41 -0600 Subject: [PATCH 60/97] plumber2_surf_wrapper: Respect user not saying --16pft. 
--- python/ctsm/site_and_regional/plumber2_surf_wrapper.py | 1 - 1 file changed, 1 deletion(-) diff --git a/python/ctsm/site_and_regional/plumber2_surf_wrapper.py b/python/ctsm/site_and_regional/plumber2_surf_wrapper.py index 46cdacf3ef..4a93859afe 100755 --- a/python/ctsm/site_and_regional/plumber2_surf_wrapper.py +++ b/python/ctsm/site_and_regional/plumber2_surf_wrapper.py @@ -58,7 +58,6 @@ def get_args(): help="Create and/or modify 16-PFT surface datasets (e.g. for a FATES run) ", action="store_true", dest="pft_16", - default=True, ) parser.add_argument( From 4707843eb321f83c87806d1e4d427da1ef694e57 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Thu, 19 Jun 2025 16:16:34 -0600 Subject: [PATCH 61/97] plumber2_surf_wrapper: Test full run with --16pft. --- .../test/test_sys_plumber2_surf_wrapper.py | 23 ++++++++++++++++++- 1 file changed, 22 insertions(+), 1 deletion(-) diff --git a/python/ctsm/test/test_sys_plumber2_surf_wrapper.py b/python/ctsm/test/test_sys_plumber2_surf_wrapper.py index 4cb34aa89a..e405d487be 100755 --- a/python/ctsm/test/test_sys_plumber2_surf_wrapper.py +++ b/python/ctsm/test/test_sys_plumber2_surf_wrapper.py @@ -53,7 +53,8 @@ def tearDown(self): def test_plumber2_surf_wrapper(self): """ - Run the entire tool + Run the entire tool with default settings. + CAN ONLY RUN ON SYSTEMS WITH INPUTDATA """ sys.argv = [self.tool_path] @@ -70,6 +71,26 @@ def test_plumber2_surf_wrapper(self): # Check self.assertEqual(n_files_expected, n_files) + def test_plumber2_surf_wrapper_16pft(self): + """ + Run the entire tool with --16pft. + CAN ONLY RUN ON SYSTEMS WITH INPUTDATA + """ + + sys.argv = [self.tool_path, "--16pft"] + main() + + # How many files do we expect? + plumber2_csv = read_plumber2_sites_csv() + n_files_expected = len(plumber2_csv) + + # How many files did we get? 
+ file_list = os.listdir("subset_data_single_point") + n_files = len(file_list) + + # Check + self.assertEqual(n_files_expected, n_files) + def test_plumber2_surf_wrapper_invalid_pft(self): """ plumber2_surf_wrapper should error if invalid PFT is given From 30f455806020d5e085bbff743ecc2d20f442f4e8 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Thu, 19 Jun 2025 16:36:21 -0600 Subject: [PATCH 62/97] plumber2_surf_wrapper: Add --overwrite option. --- .../plumber2_surf_wrapper.py | 9 ++++- .../test/test_sys_plumber2_surf_wrapper.py | 37 +++++++++++++++++++ .../PLUMBER2_site_valid.csv | 7 ++++ 3 files changed, 52 insertions(+), 1 deletion(-) create mode 100644 python/ctsm/test/testinputs/plumber2_surf_wrapper/PLUMBER2_site_valid.csv diff --git a/python/ctsm/site_and_regional/plumber2_surf_wrapper.py b/python/ctsm/site_and_regional/plumber2_surf_wrapper.py index 4a93859afe..4fd8436982 100755 --- a/python/ctsm/site_and_regional/plumber2_surf_wrapper.py +++ b/python/ctsm/site_and_regional/plumber2_surf_wrapper.py @@ -60,6 +60,12 @@ def get_args(): dest="pft_16", ) + parser.add_argument( + "--overwrite", + help="Overwrite any existing files", + action="store_true", + ) + parser.add_argument( "--plumber2-sites-csv", help=f"Comma-separated value (CSV) file with Plumber2 sites. 
Default: {PLUMBER2_SITES_CSV}", @@ -124,7 +130,6 @@ def main(): "--create-surface", "--uniform-snowpack", "--cap-saturation", - "--overwrite", "--lon-type", "180", ] @@ -191,6 +196,8 @@ def main(): if args.verbose: subset_command += ["--verbose"] + if args.overwrite: + subset_command += ["--overwrite"] execute(subset_command) diff --git a/python/ctsm/test/test_sys_plumber2_surf_wrapper.py b/python/ctsm/test/test_sys_plumber2_surf_wrapper.py index e405d487be..de20fd11c3 100755 --- a/python/ctsm/test/test_sys_plumber2_surf_wrapper.py +++ b/python/ctsm/test/test_sys_plumber2_surf_wrapper.py @@ -104,6 +104,43 @@ def test_plumber2_surf_wrapper_invalid_pft(self): with self.assertRaisesRegex(RuntimeError, "must be a valid PFT"): main() + def test_plumber2_surf_wrapper_existing_no_overwrite_fails(self): + """ + plumber2_surf_wrapper should fail if file exists but --overwrite isn't given + """ + + sys_argv_shared = [ + self.tool_path, + "--plumber2-sites-csv", + os.path.join(self.test_inputs, "PLUMBER2_site_valid.csv"), + ] + + # Run twice, expecting second to fail + sys.argv = sys_argv_shared + main() + sys.argv = sys_argv_shared + with self.assertRaisesRegex(SystemExit, "exists"): + main() + + def test_plumber2_surf_wrapper_existing_overwrite_passes(self): + """ + plumber2_surf_wrapper should pass if file exists and --overwrite is given + """ + + sys_argv_shared = [ + self.tool_path, + "--plumber2-sites-csv", + os.path.join(self.test_inputs, "PLUMBER2_site_valid.csv"), + ] + + # Run once to generate the files + sys.argv = sys_argv_shared + main() + + # Run again with --overwrite, expecting pass + sys.argv = sys_argv_shared + ["--overwrite"] + main() + if __name__ == "__main__": unit_testing.setup_for_tests() diff --git a/python/ctsm/test/testinputs/plumber2_surf_wrapper/PLUMBER2_site_valid.csv b/python/ctsm/test/testinputs/plumber2_surf_wrapper/PLUMBER2_site_valid.csv new file mode 100644 index 0000000000..2c1580bc03 --- /dev/null +++ 
b/python/ctsm/test/testinputs/plumber2_surf_wrapper/PLUMBER2_site_valid.csv @@ -0,0 +1,7 @@ +#pftX-cth and pftX-cbh are the site-specific canopy top and bottom heights +#start_year and end_year will be used to define DATM_YR_ALIGN, DATM_YR_START and DATM_YR_END, and STOP_N in units of nyears. +#RUN_STARTDATE and START_TOD are specified because we are starting at GMT corresponding to local midnight. +#ATM_NCPL is specified so that the time step of the model matches the time interval specified by the atm forcing data. +#longitudes must be in the range [-180,180] +,Site,Lat,Lon,pft1,pft1-%,pft1-cth,pft1-cbh,pft2,pft2-%,pft2-cth,pft2-cbh,start_year,end_year,RUN_STARTDATE,START_TOD,ATM_NCPL +27,BE-Lon,50.551590, 4.746130,15,100.00, 0.50, 0.01,-999,-999.00,-999.00,-999.00,2005,2014,2004-12-31,82800,48 From 1c5c1e72944f0195790c1b3932869848c0bcec98 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Thu, 19 Jun 2025 16:38:38 -0600 Subject: [PATCH 63/97] plumber2_surf_wrapper: Improve execute() comments. --- python/ctsm/site_and_regional/plumber2_surf_wrapper.py | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/python/ctsm/site_and_regional/plumber2_surf_wrapper.py b/python/ctsm/site_and_regional/plumber2_surf_wrapper.py index 4fd8436982..2162a3830f 100755 --- a/python/ctsm/site_and_regional/plumber2_surf_wrapper.py +++ b/python/ctsm/site_and_regional/plumber2_surf_wrapper.py @@ -77,12 +77,12 @@ def get_args(): def execute(command): """ - Function for running a command on shell. + Runs subset_data with given arguments. Args: - command (str): - command that we want to run. + command (list): + list of args for command that we want to run. Raises: - Error with the return code from shell. + Whatever error subset_data gives, if any.
""" print("\n", " >> ", *command, "\n") From 63d61f43157710e7376ce72c4217a3c51313c087 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Thu, 19 Jun 2025 17:00:42 -0600 Subject: [PATCH 64/97] plumber2_surf_wrapper: Switch --16pft to --78pft to preserve previous default. --- .../site_and_regional/plumber2_surf_wrapper.py | 10 +++++----- python/ctsm/test/test_sys_plumber2_surf_wrapper.py | 6 +++--- .../ctsm/test/test_unit_plumber2_surf_wrapper.py | 14 +++++++------- 3 files changed, 15 insertions(+), 15 deletions(-) diff --git a/python/ctsm/site_and_regional/plumber2_surf_wrapper.py b/python/ctsm/site_and_regional/plumber2_surf_wrapper.py index 2162a3830f..367fa97a77 100755 --- a/python/ctsm/site_and_regional/plumber2_surf_wrapper.py +++ b/python/ctsm/site_and_regional/plumber2_surf_wrapper.py @@ -54,10 +54,10 @@ def get_args(): ) parser.add_argument( - "--16pft", - help="Create and/or modify 16-PFT surface datasets (e.g. for a FATES run) ", + "--78pft", + help="Create and/or modify 78-PFT surface datasets (e.g. for a non-FATES run) ", action="store_true", - dest="pft_16", + dest="pft_78", ) parser.add_argument( @@ -167,8 +167,8 @@ def main(): str(pctpft1), ] - if args.pft_16: - # use surface dataset with 16 pfts, but overwrite to 100% 1 dominant PFT + if not args.pft_78: + # use surface dataset with 78 pfts, but overwrite to 100% 1 dominant PFT # don't set crop flag # set canopy top and bottom heights if is_valid_pft(pft2): diff --git a/python/ctsm/test/test_sys_plumber2_surf_wrapper.py b/python/ctsm/test/test_sys_plumber2_surf_wrapper.py index de20fd11c3..a7dcf12821 100755 --- a/python/ctsm/test/test_sys_plumber2_surf_wrapper.py +++ b/python/ctsm/test/test_sys_plumber2_surf_wrapper.py @@ -71,13 +71,13 @@ def test_plumber2_surf_wrapper(self): # Check self.assertEqual(n_files_expected, n_files) - def test_plumber2_surf_wrapper_16pft(self): + def test_plumber2_surf_wrapper_78pft(self): """ - Run the entire tool with --16pft. + Run the entire tool with --78pft. 
CAN ONLY RUN ON SYSTEMS WITH INPUTDATA """ - sys.argv = [self.tool_path, "--16pft"] + sys.argv = [self.tool_path, "--78pft"] main() # How many files do we expect? diff --git a/python/ctsm/test/test_unit_plumber2_surf_wrapper.py b/python/ctsm/test/test_unit_plumber2_surf_wrapper.py index 4e23fdd209..e3e677a8a6 100755 --- a/python/ctsm/test/test_unit_plumber2_surf_wrapper.py +++ b/python/ctsm/test/test_unit_plumber2_surf_wrapper.py @@ -64,22 +64,22 @@ def test_parser_verbose_true(self): args = get_args() self.assertTrue(args.verbose) - def test_parser_16pft_false_default(self): + def test_parser_78pft_false_default(self): """ - Test that script does not use 16pft mode by default + Test that script does not use 78pft mode by default """ args = get_args() - self.assertFalse(args.pft_16) + self.assertFalse(args.pft_78) - def test_parser_16pft_true(self): + def test_parser_78pft_true(self): """ - Test that --16pft sets pft_16 to True + Test that --78pft sets pft_78 to True """ - sys.argv += ["--16pft"] + sys.argv += ["--78pft"] args = get_args() - self.assertTrue(args.pft_16) + self.assertTrue(args.pft_78) if __name__ == "__main__": From 7245f21ec83a26b0201809f86c02b4fc71b181b1 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Fri, 20 Jun 2025 15:09:04 -0600 Subject: [PATCH 65/97] test_sys_subset_data: pt datm tests now compare vs. expected. 
--- python/ctsm/test/test_sys_subset_data.py | 11 +++-------- .../clmforc.CRUJRAv2.5_0.5x0.5.Prec.TMP.1986.nc | 3 +++ .../clmforc.CRUJRAv2.5_0.5x0.5.Prec.TMP.1987.nc | 3 +++ .../clmforc.CRUJRAv2.5_0.5x0.5.Prec.TMP.1988.nc | 3 +++ .../clmforc.CRUJRAv2.5_0.5x0.5.Solr.TMP.1986.nc | 3 +++ .../clmforc.CRUJRAv2.5_0.5x0.5.Solr.TMP.1987.nc | 3 +++ .../clmforc.CRUJRAv2.5_0.5x0.5.Solr.TMP.1988.nc | 3 +++ .../clmforc.CRUJRAv2.5_0.5x0.5.TPQWL.TMP.1986.nc | 3 +++ .../clmforc.CRUJRAv2.5_0.5x0.5.TPQWL.TMP.1987.nc | 3 +++ .../clmforc.CRUJRAv2.5_0.5x0.5.TPQWL.TMP.1988.nc | 3 +++ .../domain.crujra_v2.3_0.5x0.5_TMP_c250620.nc | 3 +++ .../test_subset_data_pt_datm_amazon_type360 | 1 + 12 files changed, 34 insertions(+), 8 deletions(-) create mode 100644 python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.Prec.TMP.1986.nc create mode 100644 python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.Prec.TMP.1987.nc create mode 100644 python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.Prec.TMP.1988.nc create mode 100644 python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.Solr.TMP.1986.nc create mode 100644 python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.Solr.TMP.1987.nc create mode 100644 python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.Solr.TMP.1988.nc create mode 100644 python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.TPQWL.TMP.1986.nc create mode 100644 
python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.TPQWL.TMP.1987.nc create mode 100644 python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.TPQWL.TMP.1988.nc create mode 100644 python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/domain.crujra_v2.3_0.5x0.5_TMP_c250620.nc create mode 120000 python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type360 diff --git a/python/ctsm/test/test_sys_subset_data.py b/python/ctsm/test/test_sys_subset_data.py index 42377f5b00..2c47919dae 100644 --- a/python/ctsm/test/test_sys_subset_data.py +++ b/python/ctsm/test/test_sys_subset_data.py @@ -332,7 +332,7 @@ def _do_test_subset_data_pt_datm(self, lon): ] subset_data.main() - # Loop through all the output files, making sure they exist. + # Loop through all the output files, making sure they match what we expect. 
daystr = "[0-9][0-9][0-9][0-9][0-9][0-9]" # 6-digit day code, yymmdd expected_output_files = [ f"domain.crujra_v2.3_0.5x0.5_{sitename}_c{daystr}.nc", @@ -342,13 +342,8 @@ def _do_test_subset_data_pt_datm(self, lon): expected_output_files.append( f"clmforc.CRUJRAv2.5_0.5x0.5.{forcing}.{sitename}.{year}.nc" ) - for file_basename in expected_output_files: - file_path = os.path.join(outdir, "datmdata", file_basename) - # The below will error if exactly one matching file isn't found - try: - find_one_file_matching_pattern(file_path) - except Exception as e: - raise AssertionError(str(e)) from e + expected_output_files = [os.path.join("datmdata", x) for x in expected_output_files] + self.assertTrue(self._check_result_file_matches_expected(expected_output_files, 2)) def test_subset_data_pt_datm_amazon_type360(self): """ diff --git a/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.Prec.TMP.1986.nc b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.Prec.TMP.1986.nc new file mode 100644 index 0000000000..84da04d260 --- /dev/null +++ b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.Prec.TMP.1986.nc @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:8e1075a199de0d85b974bd9dbd09216e460eda035b3a6652cbfc59b75829e3ee +size 13136 diff --git a/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.Prec.TMP.1987.nc b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.Prec.TMP.1987.nc new file mode 100644 index 0000000000..f05b8eb442 --- /dev/null +++ 
b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.Prec.TMP.1987.nc @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:21fb1ae2b2e75336e409770988dabd80e9ee69d990e5aa63dc7008c9145a455f +size 13136 diff --git a/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.Prec.TMP.1988.nc b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.Prec.TMP.1988.nc new file mode 100644 index 0000000000..3d521c66f4 --- /dev/null +++ b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.Prec.TMP.1988.nc @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:f8e2624c686c86d5d1071ed618b564e4589731555836083cad0a1e8259b7962e +size 13136 diff --git a/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.Solr.TMP.1986.nc b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.Solr.TMP.1986.nc new file mode 100644 index 0000000000..1d551867f0 --- /dev/null +++ b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.Solr.TMP.1986.nc @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:8b4164da71cf6bdf351143b936d2ad84da0c943378cb54534ec46f03513e2d17 +size 13144 diff --git a/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.Solr.TMP.1987.nc b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.Solr.TMP.1987.nc new file mode 100644 index 0000000000..b752309969 --- /dev/null +++ 
b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.Solr.TMP.1987.nc @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:93e9ab5686acc5fb7ddaf775e7f561d572a4fbecab28088b643868432e3d1ed3 +size 13144 diff --git a/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.Solr.TMP.1988.nc b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.Solr.TMP.1988.nc new file mode 100644 index 0000000000..c3c47b61be --- /dev/null +++ b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.Solr.TMP.1988.nc @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:fbb6d1679040959e540928b7df056a848e9a385441d725f5f84271a07c64889c +size 13144 diff --git a/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.TPQWL.TMP.1986.nc b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.TPQWL.TMP.1986.nc new file mode 100644 index 0000000000..9be8249601 --- /dev/null +++ b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.TPQWL.TMP.1986.nc @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:7145768c96bdf8b3cbab234b2a09c4506916dbbc8db9fbc73282d643251ed318 +size 37324 diff --git a/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.TPQWL.TMP.1987.nc b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.TPQWL.TMP.1987.nc new file mode 100644 index 0000000000..068a7ff28e --- /dev/null +++ 
b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.TPQWL.TMP.1987.nc @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:d1e38846646d2514671bd340daa0954bf1981aa328d4923cb42044097bb77f38 +size 37324 diff --git a/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.TPQWL.TMP.1988.nc b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.TPQWL.TMP.1988.nc new file mode 100644 index 0000000000..1b7094dbee --- /dev/null +++ b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/clmforc.CRUJRAv2.5_0.5x0.5.TPQWL.TMP.1988.nc @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:395aa495fd3b926521cd355fd2a012cdcd07d19b7a00467fdc49dafbf80751a1 +size 37324 diff --git a/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/domain.crujra_v2.3_0.5x0.5_TMP_c250620.nc b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/domain.crujra_v2.3_0.5x0.5_TMP_c250620.nc new file mode 100644 index 0000000000..c9b19f474b --- /dev/null +++ b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type180/datmdata/domain.crujra_v2.3_0.5x0.5_TMP_c250620.nc @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:206ba64ca50dbd3b34e93f498eb1f526689e3a6900762f12e30c3af9b75ccb5c +size 2000 diff --git a/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type360 b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type360 new file mode 120000 index 0000000000..88385bbff2 --- /dev/null +++ b/python/ctsm/test/testinputs/expected_result_files/test_subset_data_pt_datm_amazon_type360 @@ -0,0 +1 @@ 
+test_subset_data_pt_datm_amazon_type180 \ No newline at end of file From 4f92b9db1bc19b2a5eae1ea804200a42b8d292c9 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Fri, 20 Jun 2025 15:11:36 -0600 Subject: [PATCH 66/97] _detect_lon_type() is public, so remove leading _. --- python/ctsm/longitude.py | 2 +- .../ctsm/site_and_regional/regional_case.py | 4 +-- .../site_and_regional/single_point_case.py | 4 +-- python/ctsm/subset_data.py | 8 +++--- python/ctsm/test/test_unit_longitude.py | 26 +++++++++---------- 5 files changed, 22 insertions(+), 22 deletions(-) diff --git a/python/ctsm/longitude.py b/python/ctsm/longitude.py index 8afa731131..fb5998524d 100644 --- a/python/ctsm/longitude.py +++ b/python/ctsm/longitude.py @@ -58,7 +58,7 @@ def _convert_lon_type_180_to_360(lon_in): return lon_out -def _detect_lon_type(lon_in): +def detect_lon_type(lon_in): """ Detect longitude type of a given numeric. If lon_in contains more than one number (as in a list or Numpy array), this function will assume all members are of the same type if (a) there is at diff --git a/python/ctsm/site_and_regional/regional_case.py b/python/ctsm/site_and_regional/regional_case.py index 94f6011569..ed91f3d474 100644 --- a/python/ctsm/site_and_regional/regional_case.py +++ b/python/ctsm/site_and_regional/regional_case.py @@ -19,7 +19,7 @@ from ctsm.utils import add_tag_to_filename from ctsm.utils import abort from ctsm.config_utils import check_lon1_lt_lon2 -from ctsm.longitude import Longitude, _detect_lon_type +from ctsm.longitude import Longitude, detect_lon_type logger = logging.getLogger(__name__) @@ -142,7 +142,7 @@ def _subset_lon_lat(self, x_dim, y_dim, f_in): # Detect longitude type (180 or 360) of input file, throwing a helpful error if it can't be # determined. 
- f_lon_type = _detect_lon_type(lon) + f_lon_type = detect_lon_type(lon) lon1_type = self.lon1.lon_type() lon2_type = self.lon2.lon_type() if lon1_type != lon2_type: diff --git a/python/ctsm/site_and_regional/single_point_case.py b/python/ctsm/site_and_regional/single_point_case.py index ed8b4b5562..d9a3c233aa 100644 --- a/python/ctsm/site_and_regional/single_point_case.py +++ b/python/ctsm/site_and_regional/single_point_case.py @@ -15,7 +15,7 @@ # -- import local classes for this script from ctsm.site_and_regional.base_case import BaseCase, USRDAT_DIR, DatmFiles from ctsm.utils import add_tag_to_filename, ensure_iterable -from ctsm.longitude import _detect_lon_type +from ctsm.longitude import detect_lon_type logger = logging.getLogger(__name__) @@ -154,7 +154,7 @@ def convert_plon_to_filetype_if_needed(self, lon_da): file. """ plon_in = self.plon - f_lon_type = _detect_lon_type(lon_da) + f_lon_type = detect_lon_type(lon_da) plon_type = plon_in.lon_type() if f_lon_type == plon_type: plon_out = plon_in.get(plon_type) diff --git a/python/ctsm/subset_data.py b/python/ctsm/subset_data.py index 81f1f703f3..b44b3fddbe 100644 --- a/python/ctsm/subset_data.py +++ b/python/ctsm/subset_data.py @@ -69,7 +69,7 @@ from ctsm.path_utils import path_to_ctsm_root from ctsm.utils import abort from ctsm.config_utils import check_lon1_lt_lon2 -from ctsm.longitude import Longitude, _detect_lon_type +from ctsm.longitude import Longitude, detect_lon_type # -- import ctsm logging flags from ctsm.ctsm_logging import ( @@ -833,10 +833,10 @@ def process_args(args): if any(lon_arg_values): if args.lon_type is None: if hasattr(args, "plon"): - args.lon_type = _detect_lon_type(args.plon) + args.lon_type = detect_lon_type(args.plon) else: - lon1_type = _detect_lon_type(args.lon1) - lon2_type = _detect_lon_type(args.lon2) + lon1_type = detect_lon_type(args.lon1) + lon2_type = detect_lon_type(args.lon2) if lon1_type != lon2_type: raise argparse.ArgumentTypeError( "--lon1 and --lon2 seem to be of 
different types" diff --git a/python/ctsm/test/test_unit_longitude.py b/python/ctsm/test/test_unit_longitude.py index 6bf7ec53e2..6766f90764 100644 --- a/python/ctsm/test/test_unit_longitude.py +++ b/python/ctsm/test/test_unit_longitude.py @@ -10,7 +10,7 @@ from ctsm.longitude import Longitude from ctsm.longitude import _convert_lon_type_180_to_360, _convert_lon_type_360_to_180 from ctsm.longitude import _check_lon_type_180, _check_lon_type_360 -from ctsm.longitude import _detect_lon_type +from ctsm.longitude import detect_lon_type # Allow test names that pylint doesn't like; otherwise hard to make them # readable @@ -369,57 +369,57 @@ def test_lon_compare_notlon_error(self): def test_detect_lon_type_mid_180(self): """test that detect_lon_type works for an unambiguously 180 value""" - self.assertEqual(_detect_lon_type(-150), 180) + self.assertEqual(detect_lon_type(-150), 180) def test_detect_lon_type_min_180(self): """test that detect_lon_type works at -180""" - self.assertEqual(_detect_lon_type(-180), 180) + self.assertEqual(detect_lon_type(-180), 180) def test_detect_lon_type_mid_360(self): """test that detect_lon_type works for an unambiguously 360 value""" - self.assertEqual(_detect_lon_type(355), 360) + self.assertEqual(detect_lon_type(355), 360) def test_detect_lon_type_max_360(self): """test that detect_lon_type works at 360""" - self.assertEqual(_detect_lon_type(360), 360) + self.assertEqual(detect_lon_type(360), 360) def test_detect_lon_type_list_180(self): """test that detect_lon_type works for a list with just one unambiguously 180 value""" - self.assertEqual(_detect_lon_type([-150, 150]), 180) + self.assertEqual(detect_lon_type([-150, 150]), 180) def test_detect_lon_type_list_360(self): """test that detect_lon_type works for a list with just one unambiguously 360 value""" - self.assertEqual(_detect_lon_type([256, 150]), 360) + self.assertEqual(detect_lon_type([256, 150]), 360) def test_detect_lon_type_ambig(self): """test that detect_lon_type fails if 
ambiguous""" with self.assertRaisesRegex(ArgumentTypeError, r"Longitude\(s\) ambiguous"): - _detect_lon_type(150) + detect_lon_type(150) def test_detect_lon_type_list_ambig(self): """test that detect_lon_type fails for an ambiguous list""" with self.assertRaisesRegex(ArgumentTypeError, r"Longitude\(s\) ambiguous"): - _detect_lon_type([150, 170]) + detect_lon_type([150, 170]) def test_detect_lon_type_list_both(self): """test that detect_lon_type fails for a list with unambiguous members of both types""" with self.assertRaisesRegex(RuntimeError, r"Longitude array contains values of both types"): - _detect_lon_type([-150, 270]) + detect_lon_type([-150, 270]) def test_detect_lon_type_ambig0(self): """test that detect_lon_type fails at 0""" with self.assertRaisesRegex(ArgumentTypeError, r"Longitude\(s\) ambiguous"): - _detect_lon_type(0) + detect_lon_type(0) def test_detect_lon_type_oob_low(self): """test that detect_lon_type fails if out of bounds below min""" with self.assertRaisesRegex(ValueError, r"\(Minimum\) longitude < -180"): - _detect_lon_type(-300) + detect_lon_type(-300) def test_detect_lon_type_oob_high(self): """test that detect_lon_type fails if out of bounds above max""" with self.assertRaisesRegex(ValueError, r"\(Maximum\) longitude > 360"): - _detect_lon_type(500) + detect_lon_type(500) def test_list_as_lon(self): """ From ea011eda8d2616e0901db31c316389e326f60e23 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Fri, 20 Jun 2025 15:13:45 -0600 Subject: [PATCH 67/97] Rename plon_converted to plon_float. 
--- .../site_and_regional/single_point_case.py | 20 +++++++++---------- 1 file changed, 10 insertions(+), 10 deletions(-) diff --git a/python/ctsm/site_and_regional/single_point_case.py b/python/ctsm/site_and_regional/single_point_case.py index d9a3c233aa..e122b2c251 100644 --- a/python/ctsm/site_and_regional/single_point_case.py +++ b/python/ctsm/site_and_regional/single_point_case.py @@ -381,10 +381,10 @@ def create_landuse_at_point(self, indir, file, user_mods_dir): f_in = self.create_1d_coord(fluse_in, "LONGXY", "LATIXY", "lsmlon", "lsmlat") # get point longitude, converting to match file type if needed - plon_converted = self.convert_plon_to_filetype_if_needed(f_in["lsmlon"]) + plon_float = self.convert_plon_to_filetype_if_needed(f_in["lsmlon"]) # extract gridcell closest to plon/plat - f_out = f_in.sel(lsmlon=plon_converted, lsmlat=self.plat, method="nearest") + f_out = f_in.sel(lsmlon=plon_float, lsmlat=self.plat, method="nearest") # expand dimensions f_out = f_out.expand_dims(["lsmlat", "lsmlon"]) @@ -519,10 +519,10 @@ def create_surfdata_at_point(self, indir, file, user_mods_dir, specify_fsurf_out f_in = self.create_1d_coord(fsurf_in, "LONGXY", "LATIXY", "lsmlon", "lsmlat") # get point longitude, converting to match file type if needed - plon_converted = self.convert_plon_to_filetype_if_needed(f_in["lsmlon"]) + plon_float = self.convert_plon_to_filetype_if_needed(f_in["lsmlon"]) # extract gridcell closest to plon/plat - f_tmp = f_in.sel(lsmlon=plon_converted, lsmlat=self.plat, method="nearest") + f_tmp = f_in.sel(lsmlon=plon_float, lsmlat=self.plat, method="nearest") # expand dimensions f_tmp = f_tmp.expand_dims(["lsmlat", "lsmlon"]).copy(deep=True) @@ -548,10 +548,10 @@ def create_surfdata_at_point(self, indir, file, user_mods_dir, specify_fsurf_out # update lsmlat and lsmlon to match site specific instead of the nearest point # we do this so that if we create user_mods the PTS_LON and PTS_LAT in CIME match # the surface data coordinates - which is 
required - f_out["lsmlon"] = np.atleast_1d(plon_converted) + f_out["lsmlon"] = np.atleast_1d(plon_float) f_out["lsmlat"] = np.atleast_1d(self.plat) f_out["LATIXY"][:, :] = self.plat - f_out["LONGXY"][:, :] = plon_converted + f_out["LONGXY"][:, :] = plon_float # update attributes self.update_metadata(f_out) @@ -592,10 +592,10 @@ def create_datmdomain_at_point(self, datm_tuple: DatmFiles): f_in = self.create_1d_coord(fdatmdomain_in, "xc", "yc", "ni", "nj") # get point longitude, converting to match file type if needed - plon_converted = self.convert_plon_to_filetype_if_needed(f_in["lon"]) + plon_float = self.convert_plon_to_filetype_if_needed(f_in["lon"]) # extract gridcell closest to plon/plat - f_out = f_in.sel(ni=plon_converted, nj=self.plat, method="nearest") + f_out = f_in.sel(ni=plon_float, nj=self.plat, method="nearest") # expand dimensions f_out = f_out.expand_dims(["nj", "ni"]) @@ -618,10 +618,10 @@ def extract_datm_at(self, file_in, file_out): f_in = self.create_1d_coord(file_in, "LONGXY", "LATIXY", "lon", "lat") # get point longitude, converting to match file type if needed - plon_converted = self.convert_plon_to_filetype_if_needed(f_in["lon"]) + plon_float = self.convert_plon_to_filetype_if_needed(f_in["lon"]) # extract gridcell closest to plon/plat - f_out = f_in.sel(lon=plon_converted, lat=self.plat, method="nearest") + f_out = f_in.sel(lon=plon_float, lat=self.plat, method="nearest") # expand dimensions f_out = f_out.expand_dims(["lat", "lon"]) From 1244d9eaea8cb6c27ca5abdafa2ef77fc263a087 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Fri, 20 Jun 2025 16:02:29 -0600 Subject: [PATCH 68/97] subset_data: Disallow --create-datm with GSWP3 data. 
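A minimal sketch of the guard this patch adds: read the forcing tags from the `[datm]` section of a defaults `.cfg` and refuse `--create-datm` if any of them mentions GSWP3. The section and option names mirror the patch; the config content below is invented for illustration.

```python
import configparser

# Invented example of a datm section whose forcing tags point at GSWP3.
GSWP3_CFG = """
[datm]
solartag = clmforc.GSWP3.Solr.
prectag = clmforc.GSWP3.Prec.
tpqwtag = clmforc.GSWP3.TPQWL.
"""

def check_datm_not_gswp3(defaults, section="datm"):
    # Mirrors the patch: scan the three forcing tags for "gswp3".
    for setting in ["solartag", "prectag", "tpqwtag"]:
        if "gswp3" in defaults.get(section, setting).lower():
            raise NotImplementedError(
                "--create-datm is no longer supported for GSWP3 data; "
                "see https://github.com/ESCOMP/CTSM/issues/3269"
            )

defaults = configparser.ConfigParser()
defaults.read_string(GSWP3_CFG)
try:
    check_datm_not_gswp3(defaults)
    gswp3_rejected = False
except NotImplementedError:
    gswp3_rejected = True
```

Checking the tag values rather than a hardwired `datm_type` variable lets the guard work with whatever the `.cfg` files specify.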
- See [Issue #3269: subset_data --create-datm errors with GSWP3 data](https://github.com/ESCOMP/CTSM/issues/3269) - Resolves ESCOMP/CTSM#2960 --- .../generic-single-point-regional.rst | 2 +- python/ctsm/subset_data.py | 39 ++++++++++++------- python/ctsm/test/test_unit_subset_data.py | 33 +++++++++++++++- python/ctsm/test/testinputs/default_data.cfg | 15 +------ .../test/testinputs/default_data_gswp3.cfg | 30 ++++++++++++++ tools/site_and_regional/default_data_1850.cfg | 15 +------ tools/site_and_regional/default_data_2000.cfg | 15 +------ 7 files changed, 91 insertions(+), 58 deletions(-) create mode 100644 python/ctsm/test/testinputs/default_data_gswp3.cfg diff --git a/doc/source/users_guide/running-single-points/generic-single-point-regional.rst b/doc/source/users_guide/running-single-points/generic-single-point-regional.rst index 3d418b00fb..7e0b1e72fd 100644 --- a/doc/source/users_guide/running-single-points/generic-single-point-regional.rst +++ b/doc/source/users_guide/running-single-points/generic-single-point-regional.rst @@ -41,7 +41,7 @@ You can also have the script subset land-use data. See the help (``tools/site_an .. note:: This script defaults to subsetting specific surface data, land-use timeseries, and the CRUJRA2024 DATM data. It can currently only be run as-is on Derecho. If you're not on Derecho, use ``--inputdata-dir`` to specify where the top level of your CESM input data is. - Also, to subset GSWP3 instead of CRUJRA2024 DATM data, you currently need to hardwire ``datm_type = "datm_gswp3"`` (instead of the default ``"datm_crujra"``) in ``python/ctsm/subset_data.py``. + Using ``--create-datm`` with GSWP3 data is no longer supported; see `CTSM issue #3269 <https://github.com/ESCOMP/CTSM/issues/3269>`_.
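The guard that patch 68 adds to ``setup_files`` can be sketched in isolation as follows. This is a minimal illustration, not the patch itself: the ``check_not_gswp3`` helper name and the inline config text are invented for the example, while the setting names, ``[datm]`` section, and error message follow the diff.

```python
import configparser


def check_not_gswp3(defaults, datm_cfg_section="datm"):
    """Raise if any DATM forcing tag refers to GSWP3 (see CTSM issue #3269)."""
    for setting in ["solartag", "prectag", "tpqwtag"]:
        value = defaults.get(datm_cfg_section, setting)
        if "gswp3" in value.lower():
            raise NotImplementedError(
                "--create-datm is no longer supported for GSWP3 data; "
                "see https://github.com/ESCOMP/CTSM/issues/3269"
            )


# A GSWP3-style config like python/ctsm/test/testinputs/default_data_gswp3.cfg
defaults = configparser.ConfigParser()
defaults.read_string(
    """
[datm]
solartag = clmforc.GSWP3.c2011.0.5x0.5.Solr.
prectag = clmforc.GSWP3.c2011.0.5x0.5.Prec.
tpqwtag = clmforc.GSWP3.c2011.0.5x0.5.TPQWL.
"""
)

try:
    check_not_gswp3(defaults)
except NotImplementedError as err:
    print("rejected:", err)
```

In the patch, the equivalent loop runs inside ``setup_files()`` before any DATM directories are created, so the failure happens before any output is written.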
diff --git a/python/ctsm/subset_data.py b/python/ctsm/subset_data.py index b44b3fddbe..96e2aa379c 100644 --- a/python/ctsm/subset_data.py +++ b/python/ctsm/subset_data.py @@ -629,15 +629,26 @@ def setup_files(args, defaults, cesmroot): file_dict = {"main_dir": clmforcingindir} # DATM data - # TODO Issue #2960: Make datm_type a user option at the command - # line. For reference, this option affects three .cfg files: + # For reference, this option affects three .cfg files: # tools/site_and_regional/default_data_1850.cfg # tools/site_and_regional/default_data_2000.cfg # python/ctsm/test/testinputs/default_data.cfg if args.create_datm: - datm_type = "datm_crujra" # also available: datm_type = "datm_gswp3" + datm_cfg_section = "datm" + + # Issue #3269: Changes in PR #3259 mean that --create-datm won't work with GSWP3 + settings_to_check_for_gswp3 = ["solartag", "prectag", "tpqwtag"] + for setting in settings_to_check_for_gswp3: + value = defaults.get(datm_cfg_section, setting) + if "gswp3" in value.lower(): + msg = ( + "--create-datm is no longer supported for GSWP3 data; " + "see https://github.com/ESCOMP/CTSM/issues/3269" + ) + raise NotImplementedError(msg) + dir_output_datm = "datmdata" - dir_input_datm = os.path.join(clmforcingindir, defaults.get(datm_type, "dir")) + dir_input_datm = os.path.join(clmforcingindir, defaults.get(datm_cfg_section, "dir")) if not os.path.isdir(os.path.join(args.out_dir, dir_output_datm)): os.mkdir(os.path.join(args.out_dir, dir_output_datm)) logger.info("dir_input_datm : %s", dir_input_datm) @@ -645,16 +656,16 @@ def setup_files(args, defaults, cesmroot): file_dict["datm_tuple"] = DatmFiles( dir_input_datm, dir_output_datm, - defaults.get(datm_type, "domain"), - defaults.get(datm_type, "solardir"), - defaults.get(datm_type, "precdir"), - defaults.get(datm_type, "tpqwdir"), - defaults.get(datm_type, "solartag"), - defaults.get(datm_type, "prectag"), - defaults.get(datm_type, "tpqwtag"), - defaults.get(datm_type, "solarname"), - 
defaults.get(datm_type, "precname"), - defaults.get(datm_type, "tpqwname"), + defaults.get(datm_cfg_section, "domain"), + defaults.get(datm_cfg_section, "solardir"), + defaults.get(datm_cfg_section, "precdir"), + defaults.get(datm_cfg_section, "tpqwdir"), + defaults.get(datm_cfg_section, "solartag"), + defaults.get(datm_cfg_section, "prectag"), + defaults.get(datm_cfg_section, "tpqwtag"), + defaults.get(datm_cfg_section, "solarname"), + defaults.get(datm_cfg_section, "precname"), + defaults.get(datm_cfg_section, "tpqwname"), ) # if the crop flag is on - we need to use a different land use and surface data file diff --git a/python/ctsm/test/test_unit_subset_data.py b/python/ctsm/test/test_unit_subset_data.py index eeb0a9a38a..c4ce21e959 100755 --- a/python/ctsm/test/test_unit_subset_data.py +++ b/python/ctsm/test/test_unit_subset_data.py @@ -7,6 +7,8 @@ """ import unittest +import tempfile +import shutil import configparser import argparse import os @@ -85,6 +87,18 @@ def setUp(self): self.defaults = configparser.ConfigParser() self.defaults.read(os.path.join(self.cesmroot, "tools/site_and_regional", DEFAULTS_FILE)) + # Work in temporary directory + self._previous_dir = os.getcwd() + self._tempdir = tempfile.mkdtemp() + os.chdir(self._tempdir) # cd to tempdir + + def tearDown(self): + """ + Remove temporary directory + """ + os.chdir(self._previous_dir) + shutil.rmtree(self._tempdir, ignore_errors=True) + def test_inputdata_setup_files_basic(self): """ Test @@ -116,6 +130,23 @@ def test_inputdata_setup_files_inputdata_dne(self): with self.assertRaisesRegex(SystemExit, "inputdata directory does not exist"): setup_files(self.args, self.defaults, self.cesmroot) + def test_inputdata_setup_files_gswp3_error(self): + """ + Test that error is thrown if user tries to --create-datm GSWP3 + """ + cfg_file = os.path.join( + _CTSM_PYTHON, "ctsm", "test", "testinputs", "default_data_gswp3.cfg" + ) + sys.argv = ["subset_data", "point", "--create-datm", "--cfg-file", cfg_file] + 
self.args = self.parser.parse_args() + self.defaults = configparser.ConfigParser() + self.defaults.read(self.args.config_file) + + with self.assertRaisesRegex( + NotImplementedError, "https://github.com/ESCOMP/CTSM/issues/3269" + ): + setup_files(self.args, self.defaults, self.cesmroot) + def test_check_args_nooutput(self): """ Test that check args aborts when no-output is asked for @@ -229,7 +260,7 @@ def test_check_args_outsurfdat_fails_without_overwrite(self): for an existing dataset without the overwrite option """ outfile = os.path.join( - os.getcwd(), + _CTSM_PYTHON, "ctsm/test/testinputs/", "surfdata_1x1_mexicocityMEX_hist_16pfts_CMIP6_2000_c231103.nc", ) diff --git a/python/ctsm/test/testinputs/default_data.cfg b/python/ctsm/test/testinputs/default_data.cfg index a832d810cc..60c012561c 100644 --- a/python/ctsm/test/testinputs/default_data.cfg +++ b/python/ctsm/test/testinputs/default_data.cfg @@ -1,7 +1,7 @@ [main] clmforcingindir = /glade/campaign/cesm/cesmdata/cseg/inputdata -[datm_crujra] +[datm] dir = atm/datm7/atm_forcing.datm7.CRUJRA.0.5d.c20241231/three_stream domain = domain.crujra_v2.3_0.5x0.5.c220801.nc solardir = . @@ -14,19 +14,6 @@ solarname = CLMCRUJRA2024.Solar precname = CLMCRUJRA2024.Precip tpqwname = CLMCRUJRA2024.TPQW -[datm_gswp3] -dir = atm/datm7/atm_forcing.datm7.GSWP3.0.5d.v1.c170516 -domain = domain.lnd.360x720_gswp3.0v1.c170606.nc -solardir = Solar -precdir = Precip -tpqwdir = TPHWL -solartag = clmforc.GSWP3.c2011.0.5x0.5.Solr. -prectag = clmforc.GSWP3.c2011.0.5x0.5.Prec. -tpqwtag = clmforc.GSWP3.c2011.0.5x0.5.TPQWL. 
-solarname = CLMGSWP3v1.Solar -precname = CLMGSWP3v1.Precip -tpqwname = CLMGSWP3v1.TPQW - [surfdat] dir = lnd/clm2/surfdata_esmf/ctsm5.3.0 surfdat_16pft = surfdata_0.9x1.25_hist_2000_16pfts_c240908.nc diff --git a/python/ctsm/test/testinputs/default_data_gswp3.cfg b/python/ctsm/test/testinputs/default_data_gswp3.cfg new file mode 100644 index 0000000000..09e1463eb2 --- /dev/null +++ b/python/ctsm/test/testinputs/default_data_gswp3.cfg @@ -0,0 +1,30 @@ +[main] +clmforcingindir = /glade/campaign/cesm/cesmdata/cseg/inputdata + +[datm] +dir = atm/datm7/atm_forcing.datm7.GSWP3.0.5d.v1.c170516 +domain = domain.lnd.360x720_gswp3.0v1.c170606.nc +solardir = Solar +precdir = Precip +tpqwdir = TPHWL +solartag = clmforc.GSWP3.c2011.0.5x0.5.Solr. +prectag = clmforc.GSWP3.c2011.0.5x0.5.Prec. +tpqwtag = clmforc.GSWP3.c2011.0.5x0.5.TPQWL. +solarname = CLMGSWP3v1.Solar +precname = CLMGSWP3v1.Precip +tpqwname = CLMGSWP3v1.TPQW + +[surfdat] +dir = lnd/clm2/surfdata_esmf/ctsm5.3.0 +surfdat_16pft = surfdata_0.9x1.25_hist_2000_16pfts_c240908.nc +surfdat_78pft = surfdata_0.9x1.25_hist_2000_78pfts_c240908.nc +mesh_dir = share/meshes/ +mesh_surf = fv0.9x1.25_141008_ESMFmesh.nc + +[landuse] +dir = lnd/clm2/surfdata_esmf/ctsm5.3.0 +landuse_16pft = landuse.timeseries_0.9x1.25_SSP2-4.5_1850-2100_78pfts_c240908.nc +landuse_78pft = landuse.timeseries_0.9x1.25_SSP2-4.5_1850-2100_78pfts_c240908.nc + +[domain] +file = share/domains/domain.lnd.fv0.9x1.25_gx1v7.151020.nc diff --git a/tools/site_and_regional/default_data_1850.cfg b/tools/site_and_regional/default_data_1850.cfg index 3c9f28c0a2..ce68b1debf 100644 --- a/tools/site_and_regional/default_data_1850.cfg +++ b/tools/site_and_regional/default_data_1850.cfg @@ -1,7 +1,7 @@ [main] clmforcingindir = /glade/campaign/cesm/cesmdata/inputdata -[datm_crujra] +[datm] dir = atm/datm7/atm_forcing.datm7.CRUJRA.0.5d.c20241231/three_stream domain = domain.crujra_v2.3_0.5x0.5.c220801.nc solardir = . 
@@ -14,19 +14,6 @@ solarname = CLMCRUJRA2024.Solar precname = CLMCRUJRA2024.Precip tpqwname = CLMCRUJRA2024.TPQW -[datm_gswp3] -dir = atm/datm7/atm_forcing.datm7.GSWP3.0.5d.v1.c170516 -domain = domain.lnd.360x720_gswp3.0v1.c170606.nc -solardir = Solar -precdir = Precip -tpqwdir = TPHWL -solartag = clmforc.GSWP3.c2011.0.5x0.5.Solr. -prectag = clmforc.GSWP3.c2011.0.5x0.5.Prec. -tpqwtag = clmforc.GSWP3.c2011.0.5x0.5.TPQWL. -solarname = CLMGSWP3v1.Solar -precname = CLMGSWP3v1.Precip -tpqwname = CLMGSWP3v1.TPQW - [surfdat] dir = lnd/clm2/surfdata_esmf/ctsm5.3.0 surfdat_78pft = surfdata_0.9x1.25_hist_1850_78pfts_c240908.nc diff --git a/tools/site_and_regional/default_data_2000.cfg b/tools/site_and_regional/default_data_2000.cfg index a832d810cc..60c012561c 100644 --- a/tools/site_and_regional/default_data_2000.cfg +++ b/tools/site_and_regional/default_data_2000.cfg @@ -1,7 +1,7 @@ [main] clmforcingindir = /glade/campaign/cesm/cesmdata/cseg/inputdata -[datm_crujra] +[datm] dir = atm/datm7/atm_forcing.datm7.CRUJRA.0.5d.c20241231/three_stream domain = domain.crujra_v2.3_0.5x0.5.c220801.nc solardir = . @@ -14,19 +14,6 @@ solarname = CLMCRUJRA2024.Solar precname = CLMCRUJRA2024.Precip tpqwname = CLMCRUJRA2024.TPQW -[datm_gswp3] -dir = atm/datm7/atm_forcing.datm7.GSWP3.0.5d.v1.c170516 -domain = domain.lnd.360x720_gswp3.0v1.c170606.nc -solardir = Solar -precdir = Precip -tpqwdir = TPHWL -solartag = clmforc.GSWP3.c2011.0.5x0.5.Solr. -prectag = clmforc.GSWP3.c2011.0.5x0.5.Prec. -tpqwtag = clmforc.GSWP3.c2011.0.5x0.5.TPQWL. -solarname = CLMGSWP3v1.Solar -precname = CLMGSWP3v1.Precip -tpqwname = CLMGSWP3v1.TPQW - [surfdat] dir = lnd/clm2/surfdata_esmf/ctsm5.3.0 surfdat_16pft = surfdata_0.9x1.25_hist_2000_16pfts_c240908.nc From 05a2c28ba7ca1cffbd1d210736492277cf27bd7d Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Fri, 20 Jun 2025 16:23:28 -0600 Subject: [PATCH 69/97] subset_data: Generalize a comment. 
--- python/ctsm/subset_data.py | 6 ++---- 1 file changed, 2 insertions(+), 4 deletions(-) diff --git a/python/ctsm/subset_data.py b/python/ctsm/subset_data.py index 96e2aa379c..820391ff8b 100644 --- a/python/ctsm/subset_data.py +++ b/python/ctsm/subset_data.py @@ -629,10 +629,8 @@ def setup_files(args, defaults, cesmroot): file_dict = {"main_dir": clmforcingindir} # DATM data - # For reference, this option affects three .cfg files: - # tools/site_and_regional/default_data_1850.cfg - # tools/site_and_regional/default_data_2000.cfg - # python/ctsm/test/testinputs/default_data.cfg + # To find the affected files, from the top level of ctsm, do: + # grep "\[datm\]" $(find . -type f -name "*cfg") if args.create_datm: datm_cfg_section = "datm" From ecbe0fe13722283542aefbc0cf0146cbc8bd023c Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Fri, 20 Jun 2025 16:26:12 -0600 Subject: [PATCH 70/97] test_sys_subset_data.py: Explain caller_n. --- python/ctsm/test/test_sys_subset_data.py | 4 ++++ 1 file changed, 4 insertions(+) diff --git a/python/ctsm/test/test_sys_subset_data.py b/python/ctsm/test/test_sys_subset_data.py index 2c47919dae..39d448cccd 100644 --- a/python/ctsm/test/test_sys_subset_data.py +++ b/python/ctsm/test/test_sys_subset_data.py @@ -40,6 +40,10 @@ def tearDown(self): def _check_result_file_matches_expected(self, expected_output_files, caller_n): """ Loop through a list of output files, making sure they match what we expect. + + caller_n should be an integer giving the number of levels above this function you need to + traverse before you hit the actual test name. If the test is calling this function directly, + caller_n = 1. If the test is calling a function that calls this function, caller_n = 2. Etc. 
""" all_files_present_and_match = True for basename in expected_output_files: From 56a36f1adec07c4478d7e9b3de7c72838cb36661 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Fri, 20 Jun 2025 16:38:08 -0600 Subject: [PATCH 71/97] plumber2 scripts: Remove addition of CTSM python dir to path. --- python/ctsm/site_and_regional/plumber2_surf_wrapper.py | 5 ----- python/ctsm/site_and_regional/plumber2_usermods.py | 5 ----- 2 files changed, 10 deletions(-) diff --git a/python/ctsm/site_and_regional/plumber2_surf_wrapper.py b/python/ctsm/site_and_regional/plumber2_surf_wrapper.py index 367fa97a77..86234aae9c 100755 --- a/python/ctsm/site_and_regional/plumber2_surf_wrapper.py +++ b/python/ctsm/site_and_regional/plumber2_surf_wrapper.py @@ -22,14 +22,9 @@ import argparse import logging -import os import sys import tqdm -# Get the ctsm tools -_CTSM_PYTHON = os.path.abspath(os.path.join(os.path.dirname(__file__), "..", "..", "python")) -sys.path.insert(1, _CTSM_PYTHON) - # pylint:disable=wrong-import-position from ctsm.site_and_regional.plumber2_shared import PLUMBER2_SITES_CSV, read_plumber2_sites_csv from ctsm import subset_data diff --git a/python/ctsm/site_and_regional/plumber2_usermods.py b/python/ctsm/site_and_regional/plumber2_usermods.py index 6fcd4a6224..7c8f37b1b5 100644 --- a/python/ctsm/site_and_regional/plumber2_usermods.py +++ b/python/ctsm/site_and_regional/plumber2_usermods.py @@ -11,13 +11,8 @@ from __future__ import print_function import os -import sys import tqdm -# Get the ctsm tools -_CTSM_PYTHON = os.path.abspath(os.path.join(os.path.dirname(__file__), "..", "..", "python")) -sys.path.insert(1, _CTSM_PYTHON) - # pylint:disable=wrong-import-position from ctsm.site_and_regional.plumber2_shared import read_plumber2_sites_csv From cf86bfc0563d0cd089078b8be9cda464978afa24 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Fri, 20 Jun 2025 16:43:18 -0600 Subject: [PATCH 72/97] Simplify PLUMBER2_SITES_CSV path. 
--- python/ctsm/site_and_regional/plumber2_shared.py | 16 ++++++---------- 1 file changed, 6 insertions(+), 10 deletions(-) diff --git a/python/ctsm/site_and_regional/plumber2_shared.py b/python/ctsm/site_and_regional/plumber2_shared.py index 491b35e7d2..d4ab9d00b3 100644 --- a/python/ctsm/site_and_regional/plumber2_shared.py +++ b/python/ctsm/site_and_regional/plumber2_shared.py @@ -4,17 +4,13 @@ import os import pandas as pd +from ctsm.path_utils import path_to_ctsm_root -PLUMBER2_SITES_CSV = os.path.realpath( - os.path.join( - os.path.dirname(__file__), - os.pardir, - os.pardir, - os.pardir, - "tools", - "site_and_regional", - "PLUMBER2_sites.csv", - ) +PLUMBER2_SITES_CSV = os.path.join( + path_to_ctsm_root(), + "tools", + "site_and_regional", + "PLUMBER2_sites.csv", ) From 602fe61fbfcd36b27fa1bdddd5bfb88adc23dea9 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Fri, 20 Jun 2025 17:20:27 -0600 Subject: [PATCH 73/97] Move Python constants for PFT numbers to pft_utils.py. --- python/ctsm/pft_utils.py | 8 ++++++++ python/ctsm/site_and_regional/single_point_case.py | 5 +---- python/ctsm/test/test_unit_singlept_data.py | 3 ++- python/ctsm/test/test_unit_singlept_data_surfdata.py | 3 ++- python/ctsm/toolchain/gen_mksurfdata_namelist.py | 3 ++- 5 files changed, 15 insertions(+), 7 deletions(-) create mode 100644 python/ctsm/pft_utils.py diff --git a/python/ctsm/pft_utils.py b/python/ctsm/pft_utils.py new file mode 100644 index 0000000000..c7e36c9338 --- /dev/null +++ b/python/ctsm/pft_utils.py @@ -0,0 +1,8 @@ +""" +Constants and functions relating to PFTs +""" + +MIN_PFT = 0 # bare ground +NAT_PFT = 15 # natural pfts +NUM_PFT = 17 # for runs with generic crops +MAX_PFT = 78 # for runs with explicit crops diff --git a/python/ctsm/site_and_regional/single_point_case.py b/python/ctsm/site_and_regional/single_point_case.py index db33d875fc..d9e1a1513f 100644 --- a/python/ctsm/site_and_regional/single_point_case.py +++ b/python/ctsm/site_and_regional/single_point_case.py @@ 
-16,13 +16,10 @@ from ctsm.site_and_regional.base_case import BaseCase, USRDAT_DIR, DatmFiles from ctsm.utils import add_tag_to_filename, ensure_iterable from ctsm.longitude import detect_lon_type +from ctsm.pft_utils import NAT_PFT, NUM_PFT, MAX_PFT logger = logging.getLogger(__name__) -NAT_PFT = 15 # natural pfts -NUM_PFT = 17 # for runs with generic crops -MAX_PFT = 78 # for runs with explicit crops - class SinglePointCase(BaseCase): """ diff --git a/python/ctsm/test/test_unit_singlept_data.py b/python/ctsm/test/test_unit_singlept_data.py index 644af82588..bc9bd1adb3 100755 --- a/python/ctsm/test/test_unit_singlept_data.py +++ b/python/ctsm/test/test_unit_singlept_data.py @@ -18,6 +18,7 @@ # pylint: disable=wrong-import-position from ctsm import unit_testing from ctsm.site_and_regional.single_point_case import SinglePointCase +from ctsm.pft_utils import MAX_PFT # pylint: disable=invalid-name @@ -223,7 +224,7 @@ def test_check_dom_pft_mixed_range(self): overwrite=self.overwrite, ) single_point.dom_pft = [1, 5, 15] - single_point.num_pft = 78 + single_point.num_pft = MAX_PFT with self.assertRaisesRegex( argparse.ArgumentTypeError, "You are subsetting using mixed land*" ): diff --git a/python/ctsm/test/test_unit_singlept_data_surfdata.py b/python/ctsm/test/test_unit_singlept_data_surfdata.py index 2106799a4b..71312c9db6 100755 --- a/python/ctsm/test/test_unit_singlept_data_surfdata.py +++ b/python/ctsm/test/test_unit_singlept_data_surfdata.py @@ -23,6 +23,7 @@ # pylint: disable=wrong-import-position from ctsm import unit_testing from ctsm.site_and_regional.single_point_case import SinglePointCase +from ctsm.pft_utils import MAX_PFT # pylint: disable=invalid-name # pylint: disable=too-many-lines @@ -667,7 +668,7 @@ class TestSinglePointCaseSurfaceCrop(unittest.TestCase): dom_pft = [17] evenly_split_cropland = False pct_pft = None - num_pft = 78 + num_pft = MAX_PFT cth = 0.9 cbh = 0.1 include_nonveg = False diff --git 
a/python/ctsm/toolchain/gen_mksurfdata_namelist.py b/python/ctsm/toolchain/gen_mksurfdata_namelist.py index 31fcbfe8ff..45e17bd504 100755 --- a/python/ctsm/toolchain/gen_mksurfdata_namelist.py +++ b/python/ctsm/toolchain/gen_mksurfdata_namelist.py @@ -15,6 +15,7 @@ from ctsm.path_utils import path_to_ctsm_root, path_to_cime from ctsm.ctsm_logging import setup_logging_pre_config, add_logging_args, process_logging_args +from ctsm.pft_utils import MAX_PFT logger = logging.getLogger(__name__) @@ -308,7 +309,7 @@ def main(): if nocrop_flag: num_pft = "16" else: - num_pft = "78" + num_pft = str(MAX_PFT) logger.info("num_pft is %s", num_pft) # Write out if surface dataset will be created From 8ca6e4556068f25155230bfc663aaaf64c21a8ff Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Fri, 20 Jun 2025 17:28:14 -0600 Subject: [PATCH 74/97] Rename and replace some PFT number constants for clarity. - Replace NAT_PFT=15 with MAX_NAT_PFT=14 - Replace NUM_PFT=17 with MAX_PFT_GENERICCROPS=16 - Rename MAX_PFT to MAX_PFT_MANAGEDCROPS --- python/ctsm/pft_utils.py | 6 ++-- .../site_and_regional/single_point_case.py | 28 +++++++++---------- python/ctsm/test/test_unit_singlept_data.py | 4 +-- .../test/test_unit_singlept_data_surfdata.py | 4 +-- .../ctsm/toolchain/gen_mksurfdata_namelist.py | 4 +-- 5 files changed, 23 insertions(+), 23 deletions(-) diff --git a/python/ctsm/pft_utils.py b/python/ctsm/pft_utils.py index c7e36c9338..9588564c78 100644 --- a/python/ctsm/pft_utils.py +++ b/python/ctsm/pft_utils.py @@ -3,6 +3,6 @@ """ MIN_PFT = 0 # bare ground -NAT_PFT = 15 # natural pfts -NUM_PFT = 17 # for runs with generic crops -MAX_PFT = 78 # for runs with explicit crops +MAX_NAT_PFT = 14 # maximum natural pft +MAX_PFT_GENERICCROPS = 16 # for runs with generic crops +MAX_PFT_MANAGEDCROPS = 78 # for runs with explicit crops diff --git a/python/ctsm/site_and_regional/single_point_case.py b/python/ctsm/site_and_regional/single_point_case.py index d9e1a1513f..c5777093ab 100644 --- 
a/python/ctsm/site_and_regional/single_point_case.py +++ b/python/ctsm/site_and_regional/single_point_case.py @@ -16,7 +16,7 @@ from ctsm.site_and_regional.base_case import BaseCase, USRDAT_DIR, DatmFiles from ctsm.utils import add_tag_to_filename, ensure_iterable from ctsm.longitude import detect_lon_type -from ctsm.pft_utils import NAT_PFT, NUM_PFT, MAX_PFT +from ctsm.pft_utils import MAX_NAT_PFT, MAX_PFT_GENERICCROPS, MAX_PFT_MANAGEDCROPS logger = logging.getLogger(__name__) @@ -187,20 +187,20 @@ def check_dom_pft(self): same range. e.g. If users specified multiple dom_pft, they should be either in : - - 0 - NAT_PFT-1 range + - 0 - MAX_NAT_PFT range or - - NAT_PFT - MAX_PFT range + - MAX_NAT_PFT+1 - MAX_PFT_MANAGEDCROPS range - give an error: mixed land units not possible ------------- Raises: Error (ArgumentTypeError): - If any dom_pft is bigger than MAX_PFT. + If any dom_pft is bigger than MAX_PFT_MANAGEDCROPS. Error (ArgumentTypeError): If any dom_pft is less than 1. Error (ArgumentTypeError): If mixed land units are chosen. - dom_pft values are both in range of (0 - NAT_PFT-1) and (NAT_PFT - MAX_PFT). + dom_pft values are both in range of (0 - MAX_NAT_PFT) and (MAX_NAT_PFT+1 - MAX_PFT_MANAGEDCROPS). """ @@ -214,8 +214,8 @@ def check_dom_pft(self): min_dom_pft = min(self.dom_pft) max_dom_pft = max(self.dom_pft) - # -- check dom_pft values should be between 0-MAX_PFT - if min_dom_pft < 0 or max_dom_pft > MAX_PFT: + # -- check dom_pft values should be between 0-MAX_PFT_MANAGEDCROPS + if min_dom_pft < 0 or max_dom_pft > MAX_PFT_MANAGEDCROPS: err_msg = "values for --dompft should be between 1 and 78." 
raise argparse.ArgumentTypeError(err_msg) @@ -225,17 +225,17 @@ def check_dom_pft(self): raise argparse.ArgumentTypeError(err_msg) # -- check dom_pft vs MAX_pft - if self.num_pft - 1 < max_dom_pft < NUM_PFT: + if self.num_pft - 1 < max_dom_pft <= MAX_PFT_GENERICCROPS: logger.info( "WARNING, you trying to run with generic crops (16 PFT surface dataset)" ) # -- check if all dom_pft are in the same range: - if min_dom_pft < NAT_PFT <= max_dom_pft: + if min_dom_pft <= MAX_NAT_PFT < max_dom_pft: err_msg = ( "You are subsetting using mixed land units that have both " "natural pfts and crop cfts. Check your surface dataset.\n" - f"{min_dom_pft} < {NAT_PFT} <= {max_dom_pft}\n" + f"{min_dom_pft} <= {MAX_NAT_PFT} < {max_dom_pft}\n" ) raise argparse.ArgumentTypeError(err_msg) @@ -423,7 +423,7 @@ def modify_surfdata_atpoint(self, f_orig): if self.dom_pft is not None: max_dom_pft = max(self.dom_pft) # -- First initialize everything: - if max_dom_pft < NAT_PFT: + if max_dom_pft <= MAX_NAT_PFT : f_mod["PCT_NAT_PFT"][:, :, :] = 0 else: f_mod["PCT_CFT"][:, :, :] = 0 @@ -442,10 +442,10 @@ def modify_surfdata_atpoint(self, f_orig): if cth is not None: f_mod["MONTHLY_HEIGHT_TOP"][:, :, :, dom_pft] = cth f_mod["MONTHLY_HEIGHT_BOT"][:, :, :, dom_pft] = cbh - if dom_pft < NAT_PFT: + if dom_pft <= MAX_NAT_PFT: f_mod["PCT_NAT_PFT"][:, :, dom_pft] = pct_pft else: - dom_pft = dom_pft - NAT_PFT + dom_pft = dom_pft - (MAX_NAT_PFT + 1) f_mod["PCT_CFT"][:, :, dom_pft] = pct_pft # ------------------------------- @@ -463,7 +463,7 @@ def modify_surfdata_atpoint(self, f_orig): if self.dom_pft is not None: max_dom_pft = max(self.dom_pft) - if max_dom_pft < NAT_PFT: + if max_dom_pft <= MAX_NAT_PFT: f_mod["PCT_NATVEG"][:, :] = 100 f_mod["PCT_CROP"][:, :] = 0 else: diff --git a/python/ctsm/test/test_unit_singlept_data.py b/python/ctsm/test/test_unit_singlept_data.py index bc9bd1adb3..dc6d655408 100755 --- a/python/ctsm/test/test_unit_singlept_data.py +++ b/python/ctsm/test/test_unit_singlept_data.py @@ 
-18,7 +18,7 @@ # pylint: disable=wrong-import-position from ctsm import unit_testing from ctsm.site_and_regional.single_point_case import SinglePointCase -from ctsm.pft_utils import MAX_PFT +from ctsm.pft_utils import MAX_PFT_MANAGEDCROPS # pylint: disable=invalid-name @@ -224,7 +224,7 @@ def test_check_dom_pft_mixed_range(self): overwrite=self.overwrite, ) single_point.dom_pft = [1, 5, 15] - single_point.num_pft = MAX_PFT + single_point.num_pft = MAX_PFT_MANAGEDCROPS with self.assertRaisesRegex( argparse.ArgumentTypeError, "You are subsetting using mixed land*" ): diff --git a/python/ctsm/test/test_unit_singlept_data_surfdata.py b/python/ctsm/test/test_unit_singlept_data_surfdata.py index 71312c9db6..fb6cc15720 100755 --- a/python/ctsm/test/test_unit_singlept_data_surfdata.py +++ b/python/ctsm/test/test_unit_singlept_data_surfdata.py @@ -23,7 +23,7 @@ # pylint: disable=wrong-import-position from ctsm import unit_testing from ctsm.site_and_regional.single_point_case import SinglePointCase -from ctsm.pft_utils import MAX_PFT +from ctsm.pft_utils import MAX_PFT_MANAGEDCROPS # pylint: disable=invalid-name # pylint: disable=too-many-lines @@ -668,7 +668,7 @@ class TestSinglePointCaseSurfaceCrop(unittest.TestCase): dom_pft = [17] evenly_split_cropland = False pct_pft = None - num_pft = MAX_PFT + num_pft = MAX_PFT_MANAGEDCROPS cth = 0.9 cbh = 0.1 include_nonveg = False diff --git a/python/ctsm/toolchain/gen_mksurfdata_namelist.py b/python/ctsm/toolchain/gen_mksurfdata_namelist.py index 45e17bd504..09bbb9c268 100755 --- a/python/ctsm/toolchain/gen_mksurfdata_namelist.py +++ b/python/ctsm/toolchain/gen_mksurfdata_namelist.py @@ -15,7 +15,7 @@ from ctsm.path_utils import path_to_ctsm_root, path_to_cime from ctsm.ctsm_logging import setup_logging_pre_config, add_logging_args, process_logging_args -from ctsm.pft_utils import MAX_PFT +from ctsm.pft_utils import MAX_PFT_MANAGEDCROPS logger = logging.getLogger(__name__) @@ -309,7 +309,7 @@ def main(): if nocrop_flag: num_pft = 
"16" else: - num_pft = str(MAX_PFT) + num_pft = str(MAX_PFT_MANAGEDCROPS) logger.info("num_pft is %s", num_pft) # Write out if surface dataset will be created From f6a9afcd8b5624b14d61c93ff2b7688dcad1f2a4 Mon Sep 17 00:00:00 2001 From: Sam Rabin Date: Fri, 20 Jun 2025 17:40:44 -0600 Subject: [PATCH 75/97] Use MAX_PFT_ variables in more places. --- python/ctsm/site_and_regional/single_point_case.py | 7 ++++--- python/ctsm/subset_data.py | 5 +++-- python/ctsm/test/test_unit_singlept_data.py | 10 +++++----- python/ctsm/test/test_unit_singlept_data_surfdata.py | 4 ++-- python/ctsm/toolchain/gen_mksurfdata_namelist.py | 4 ++-- 5 files changed, 16 insertions(+), 14 deletions(-) diff --git a/python/ctsm/site_and_regional/single_point_case.py b/python/ctsm/site_and_regional/single_point_case.py index c5777093ab..8c30b4c9a3 100644 --- a/python/ctsm/site_and_regional/single_point_case.py +++ b/python/ctsm/site_and_regional/single_point_case.py @@ -216,18 +216,19 @@ def check_dom_pft(self): # -- check dom_pft values should be between 0-MAX_PFT_MANAGEDCROPS if min_dom_pft < 0 or max_dom_pft > MAX_PFT_MANAGEDCROPS: - err_msg = "values for --dompft should be between 1 and 78." + err_msg = f"values for --dompft should be between 1 and {MAX_PFT_MANAGEDCROPS}." raise argparse.ArgumentTypeError(err_msg) # -- check dom_pft vs num_pft if max_dom_pft > self.num_pft: - err_msg = "Please use --crop flag when --dompft is above 16." + err_msg = f"Please use --crop flag when --dompft is above {MAX_PFT_GENERICCROPS}." 
raise argparse.ArgumentTypeError(err_msg) # -- check dom_pft vs MAX_pft if self.num_pft - 1 < max_dom_pft <= MAX_PFT_GENERICCROPS: logger.info( - "WARNING, you trying to run with generic crops (16 PFT surface dataset)" + "WARNING, you are trying to run with generic crops (%s PFT surface dataset)", + MAX_PFT_GENERICCROPS, ) # -- check if all dom_pft are in the same range: diff --git a/python/ctsm/subset_data.py b/python/ctsm/subset_data.py index 820391ff8b..de4e51db9b 100644 --- a/python/ctsm/subset_data.py +++ b/python/ctsm/subset_data.py @@ -70,6 +70,7 @@ from ctsm.utils import abort from ctsm.config_utils import check_lon1_lt_lon2 from ctsm.longitude import Longitude, detect_lon_type +from ctsm.pft_utils import MAX_PFT_GENERICCROPS, MAX_PFT_MANAGEDCROPS # -- import ctsm logging flags from ctsm.ctsm_logging import ( @@ -597,9 +598,9 @@ def determine_num_pft(crop): num_pft (int) : number of pfts for surface dataset """ if crop: - num_pft = "78" + num_pft = str(MAX_PFT_MANAGEDCROPS) else: - num_pft = "16" + num_pft = str(MAX_PFT_GENERICCROPS) logger.debug("crop_flag = %s => num_pft = %s", str(crop), num_pft) return num_pft diff --git a/python/ctsm/test/test_unit_singlept_data.py b/python/ctsm/test/test_unit_singlept_data.py index dc6d655408..bf29ced331 100755 --- a/python/ctsm/test/test_unit_singlept_data.py +++ b/python/ctsm/test/test_unit_singlept_data.py @@ -18,7 +18,7 @@ # pylint: disable=wrong-import-position from ctsm import unit_testing from ctsm.site_and_regional.single_point_case import SinglePointCase -from ctsm.pft_utils import MAX_PFT_MANAGEDCROPS +from ctsm.pft_utils import MAX_PFT_GENERICCROPS, MAX_PFT_MANAGEDCROPS # pylint: disable=invalid-name @@ -39,7 +39,7 @@ class TestSinglePointCase(unittest.TestCase): dom_pft = [8] evenly_split_cropland = False pct_pft = None - num_pft = 16 + num_pft = MAX_PFT_GENERICCROPS cth = [0.9, 0.9, 0.9, 0.9, 0.9, 0.9, 0.9, 0.9, 0.9, 0.9, 0.9, 0.9] cbh = [0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1] 
     include_nonveg = False
@@ -132,7 +132,7 @@ def test_check_dom_pft_too_big(self):
             out_dir=self.out_dir,
             overwrite=self.overwrite,
         )
-        single_point.dom_pft = [16, 36, 79]
+        single_point.dom_pft = [MAX_PFT_GENERICCROPS, 36, 79]
         with self.assertRaisesRegex(argparse.ArgumentTypeError, "values for --dompft should*"):
             single_point.check_dom_pft()
 
@@ -162,7 +162,7 @@ def test_check_dom_pft_too_small(self):
             out_dir=self.out_dir,
             overwrite=self.overwrite,
         )
-        single_point.dom_pft = [16, 36, -1]
+        single_point.dom_pft = [MAX_PFT_GENERICCROPS, 36, -1]
         with self.assertRaisesRegex(argparse.ArgumentTypeError, "values for --dompft should*"):
             single_point.check_dom_pft()
 
@@ -193,7 +193,7 @@ def test_check_dom_pft_numpft(self):
             overwrite=self.overwrite,
         )
         single_point.dom_pft = [15, 53]
-        single_point.num_pft = 16
+        single_point.num_pft = MAX_PFT_GENERICCROPS
         with self.assertRaisesRegex(argparse.ArgumentTypeError, "Please use --crop*"):
             single_point.check_dom_pft()
 
diff --git a/python/ctsm/test/test_unit_singlept_data_surfdata.py b/python/ctsm/test/test_unit_singlept_data_surfdata.py
index fb6cc15720..d163c29e4f 100755
--- a/python/ctsm/test/test_unit_singlept_data_surfdata.py
+++ b/python/ctsm/test/test_unit_singlept_data_surfdata.py
@@ -23,7 +23,7 @@
 # pylint: disable=wrong-import-position
 from ctsm import unit_testing
 from ctsm.site_and_regional.single_point_case import SinglePointCase
-from ctsm.pft_utils import MAX_PFT_MANAGEDCROPS
+from ctsm.pft_utils import MAX_PFT_GENERICCROPS, MAX_PFT_MANAGEDCROPS
 
 # pylint: disable=invalid-name
 # pylint: disable=too-many-lines
@@ -47,7 +47,7 @@ class TestSinglePointCaseSurfaceNoCrop(unittest.TestCase):
     dom_pft = [8]
     evenly_split_cropland = False
     pct_pft = None
-    num_pft = 16
+    num_pft = MAX_PFT_GENERICCROPS
     cth = 0.9
     cbh = 0.1
     include_nonveg = False
diff --git a/python/ctsm/toolchain/gen_mksurfdata_namelist.py b/python/ctsm/toolchain/gen_mksurfdata_namelist.py
index 09bbb9c268..3a405bf5fa 100755
--- a/python/ctsm/toolchain/gen_mksurfdata_namelist.py
+++ b/python/ctsm/toolchain/gen_mksurfdata_namelist.py
@@ -15,7 +15,7 @@
 from ctsm.path_utils import path_to_ctsm_root, path_to_cime
 from ctsm.ctsm_logging import setup_logging_pre_config, add_logging_args, process_logging_args
-from ctsm.pft_utils import MAX_PFT_MANAGEDCROPS
+from ctsm.pft_utils import MAX_PFT_GENERICCROPS, MAX_PFT_MANAGEDCROPS
 
 logger = logging.getLogger(__name__)
 
@@ -307,7 +307,7 @@ def main():
 
     # Determine num_pft
     if nocrop_flag:
-        num_pft = "16"
+        num_pft = str(MAX_PFT_GENERICCROPS)
     else:
         num_pft = str(MAX_PFT_MANAGEDCROPS)
     logger.info("num_pft is %s", num_pft)

From ddca30ff7254b628937a4ced60f1d099e701bc1d Mon Sep 17 00:00:00 2001
From: Sam Rabin
Date: Fri, 20 Jun 2025 17:44:37 -0600
Subject: [PATCH 76/97] plumber2_surf_wrapper: Avoid mentioning 78.

---
 .../ctsm/site_and_regional/plumber2_surf_wrapper.py | 12 ++++++++----
 python/ctsm/test/test_sys_plumber2_surf_wrapper.py  |  4 ++--
 python/ctsm/test/test_unit_plumber2_surf_wrapper.py |  8 ++++----
 3 files changed, 14 insertions(+), 10 deletions(-)

diff --git a/python/ctsm/site_and_regional/plumber2_surf_wrapper.py b/python/ctsm/site_and_regional/plumber2_surf_wrapper.py
index 86234aae9c..117ddb4a29 100755
--- a/python/ctsm/site_and_regional/plumber2_surf_wrapper.py
+++ b/python/ctsm/site_and_regional/plumber2_surf_wrapper.py
@@ -28,6 +28,7 @@
 # pylint:disable=wrong-import-position
 from ctsm.site_and_regional.plumber2_shared import PLUMBER2_SITES_CSV, read_plumber2_sites_csv
 from ctsm import subset_data
+from ctsm import pft_utils
 
 
 def get_args():
@@ -49,10 +50,13 @@ def get_args():
     )
 
     parser.add_argument(
-        "--78pft",
-        help="Create and/or modify 78-PFT surface datasets (e.g. for a non-FATES run) ",
+        "--crop",
+        help=(
+            f"Create and/or modify {pft_utils.MAX_PFT_MANAGEDCROPS}-PFT ",
+            "surface datasets (e.g. for a non-FATES run)",
+        ),
         action="store_true",
-        dest="pft_78",
+        dest="use_managed_crops",
     )
 
     parser.add_argument(
@@ -162,7 +166,7 @@ def main():
                 str(pctpft1),
             ]
 
-        if not args.pft_78:
+        if not args.use_managed_crops:
             # use surface dataset with 78 pfts, but overwrite to 100% 1 dominant PFT
             # don't set crop flag
             # set canopy top and bottom heights
diff --git a/python/ctsm/test/test_sys_plumber2_surf_wrapper.py b/python/ctsm/test/test_sys_plumber2_surf_wrapper.py
index a7dcf12821..12ca561150 100755
--- a/python/ctsm/test/test_sys_plumber2_surf_wrapper.py
+++ b/python/ctsm/test/test_sys_plumber2_surf_wrapper.py
@@ -73,11 +73,11 @@ def test_plumber2_surf_wrapper(self):
 
     def test_plumber2_surf_wrapper_78pft(self):
         """
-        Run the entire tool with --78pft.
+        Run the entire tool with --crop.
 
         CAN ONLY RUN ON SYSTEMS WITH INPUTDATA
         """
-        sys.argv = [self.tool_path, "--78pft"]
+        sys.argv = [self.tool_path, "--crop"]
         main()
 
         # How many files do we expect?
diff --git a/python/ctsm/test/test_unit_plumber2_surf_wrapper.py b/python/ctsm/test/test_unit_plumber2_surf_wrapper.py
index e3e677a8a6..4b84752edb 100755
--- a/python/ctsm/test/test_unit_plumber2_surf_wrapper.py
+++ b/python/ctsm/test/test_unit_plumber2_surf_wrapper.py
@@ -70,16 +70,16 @@ def test_parser_78pft_false_default(self):
         """
 
         args = get_args()
-        self.assertFalse(args.pft_78)
+        self.assertFalse(args.use_managed_crops)
 
     def test_parser_78pft_true(self):
         """
-        Test that --78pft sets pft_78 to True
+        Test that --crop sets use_managed_crops to True
         """
-        sys.argv += ["--78pft"]
+        sys.argv += ["--crop"]
         args = get_args()
-        self.assertTrue(args.pft_78)
+        self.assertTrue(args.use_managed_crops)
 
 
 if __name__ == "__main__":

From b7f7af386b02a5d10ac14eb27e6e2fa32843891f Mon Sep 17 00:00:00 2001
From: Sam Rabin
Date: Fri, 20 Jun 2025 17:54:29 -0600
Subject: [PATCH 77/97] plumber2_surf_wrapper: Better is_valid_pft() check.
---
 python/ctsm/pft_utils.py                     | 15 +++++++++++-
 .../plumber2_surf_wrapper.py                 | 23 ++++++-------------
 .../PLUMBER2_sites_invalid_pft.csv           |  2 +-
 3 files changed, 22 insertions(+), 18 deletions(-)

diff --git a/python/ctsm/pft_utils.py b/python/ctsm/pft_utils.py
index 9588564c78..40ab8b9f23 100644
--- a/python/ctsm/pft_utils.py
+++ b/python/ctsm/pft_utils.py
@@ -2,7 +2,20 @@
 Constants and functions relating to PFTs
 """
 
-MIN_PFT = 0 # bare ground
+MIN_PFT = 0  # bare ground
+MIN_NAT_PFT = 1  # minimum natural pft (not including bare ground)
 MAX_NAT_PFT = 14  # maximum natural pft
 MAX_PFT_GENERICCROPS = 16  # for runs with generic crops
 MAX_PFT_MANAGEDCROPS = 78  # for runs with explicit crops
+
+
+def is_valid_pft(pft_num, managed_crops):
+    """
+    Given a number, check whether it represents a valid PFT (bare ground OK)
+    """
+    if managed_crops:
+        max_allowed_pft = MAX_PFT_MANAGEDCROPS
+    else:
+        max_allowed_pft = MAX_PFT_GENERICCROPS
+
+    return MIN_PFT <= pft_num <= max_allowed_pft
diff --git a/python/ctsm/site_and_regional/plumber2_surf_wrapper.py b/python/ctsm/site_and_regional/plumber2_surf_wrapper.py
index 117ddb4a29..cedc6b25e0 100755
--- a/python/ctsm/site_and_regional/plumber2_surf_wrapper.py
+++ b/python/ctsm/site_and_regional/plumber2_surf_wrapper.py
@@ -28,7 +28,7 @@
 # pylint:disable=wrong-import-position
 from ctsm.site_and_regional.plumber2_shared import PLUMBER2_SITES_CSV, read_plumber2_sites_csv
 from ctsm import subset_data
-from ctsm import pft_utils
+from ctsm.pft_utils import MAX_PFT_MANAGEDCROPS, is_valid_pft
 
 
 def get_args():
@@ -51,10 +51,8 @@ def get_args():
 
     parser.add_argument(
         "--crop",
-        help=(
-            f"Create and/or modify {pft_utils.MAX_PFT_MANAGEDCROPS}-PFT ",
-            "surface datasets (e.g. for a non-FATES run)",
-        ),
+        help=f"Create and/or modify {MAX_PFT_MANAGEDCROPS}-PFT "
+        "surface datasets (e.g. for a non-FATES run)",
         action="store_true",
         dest="use_managed_crops",
     )
@@ -89,13 +87,6 @@ def execute(command):
     subset_data.main()
 
 
-def is_valid_pft(pft_num):
-    """
-    Given a number, check whether it represents a valid PFT
-    """
-    return pft_num >= 1
-
-
 def main():
     """
     Read plumber2_sites from csv, iterate through sites, and add dominant PFT
@@ -135,7 +126,7 @@ def main():
         # Read info for first PFT
         pft1 = row["pft1"]
-        if not is_valid_pft(pft1):
+        if not is_valid_pft(pft1, args.use_managed_crops):
             raise RuntimeError(f"pft1 must be a valid PFT; got {pft1}")
         pctpft1 = row["pft1-%"]
         cth1 = row["pft1-cth"]
@@ -143,13 +134,13 @@
         # Read info for second PFT, if a valid one is given in the .csv file
         pft2 = row["pft2"]
-        if is_valid_pft(pft2):
+        if is_valid_pft(pft2, args.use_managed_crops):
             pctpft2 = row["pft2-%"]
             cth2 = row["pft2-cth"]
             cbh2 = row["pft2-cbh"]
 
         # Set dominant PFT(s)
-        if is_valid_pft(pft2):
+        if is_valid_pft(pft2, args.use_managed_crops):
             subset_command += [
                 "--dompft",
                 str(pft1),
@@ -170,7 +161,7 @@
             # use surface dataset with 78 pfts, but overwrite to 100% 1 dominant PFT
             # don't set crop flag
             # set canopy top and bottom heights
-            if is_valid_pft(pft2):
+            if is_valid_pft(pft2, args.use_managed_crops):
                 subset_command += [
                     "--cth",
                     str(cth1),
diff --git a/python/ctsm/test/testinputs/plumber2_surf_wrapper/PLUMBER2_sites_invalid_pft.csv b/python/ctsm/test/testinputs/plumber2_surf_wrapper/PLUMBER2_sites_invalid_pft.csv
index 2d4b7dcb57..e8f0eb8fbb 100644
--- a/python/ctsm/test/testinputs/plumber2_surf_wrapper/PLUMBER2_sites_invalid_pft.csv
+++ b/python/ctsm/test/testinputs/plumber2_surf_wrapper/PLUMBER2_sites_invalid_pft.csv
@@ -4,5 +4,5 @@
 #ATM_NCPL is specified so that the time step of the model matches the time interval specified by the atm forcing data.
 #longitudes must be in the range [-180,180]
 ,Site,Lat,Lon,pft1,pft1-%,pft1-cth,pft1-cbh,pft2,pft2-%,pft2-cth,pft2-cbh,start_year,end_year,RUN_STARTDATE,START_TOD,ATM_NCPL
-26,Invalid-Pft,51.309166, 4.520560,0,19.22,21.00,10.50,7,80.78,21.00,12.08,2004,2014,2003-12-31,82800,48
+26,Invalid-Pft,51.309166, 4.520560,-1,19.22,21.00,10.50,7,80.78,21.00,12.08,2004,2014,2003-12-31,82800,48
 27,BE-Lon,50.551590, 4.746130,15,100.00, 0.50, 0.01,-999,-999.00,-999.00,-999.00,2005,2014,2004-12-31,82800,48

From 75db098206b064b8b7b2a0604d3f0bf8fdb950cc Mon Sep 17 00:00:00 2001
From: Sam Rabin
Date: Fri, 20 Jun 2025 17:55:30 -0600
Subject: [PATCH 78/97] Reformat with black.

---
 python/ctsm/site_and_regional/single_point_case.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/python/ctsm/site_and_regional/single_point_case.py b/python/ctsm/site_and_regional/single_point_case.py
index 8c30b4c9a3..9f709be648 100644
--- a/python/ctsm/site_and_regional/single_point_case.py
+++ b/python/ctsm/site_and_regional/single_point_case.py
@@ -424,7 +424,7 @@ def modify_surfdata_atpoint(self, f_orig):
         if self.dom_pft is not None:
             max_dom_pft = max(self.dom_pft)
             # -- First initialize everything:
-            if max_dom_pft <= MAX_NAT_PFT :
+            if max_dom_pft <= MAX_NAT_PFT:
                 f_mod["PCT_NAT_PFT"][:, :, :] = 0
             else:
                 f_mod["PCT_CFT"][:, :, :] = 0

From 0eb376d61cc0ba3cc91af5c75a9cdbbc4314f0d1 Mon Sep 17 00:00:00 2001
From: Sam Rabin
Date: Fri, 20 Jun 2025 17:55:52 -0600
Subject: [PATCH 79/97] Add previous commit to .git-blame-ignore-revs.
---
 .git-blame-ignore-revs | 1 +
 1 file changed, 1 insertion(+)

diff --git a/.git-blame-ignore-revs b/.git-blame-ignore-revs
index 6cffe9dd35..518de3672c 100644
--- a/.git-blame-ignore-revs
+++ b/.git-blame-ignore-revs
@@ -68,3 +68,4 @@ cdf40d265cc82775607a1bf25f5f527bacc97405
 3dd489af7ebe06566e2c6a1c7ade18550f1eb4ba
 742cfa606039ab89602fde5fef46458516f56fd4
 4ad46f46de7dde753b4653c15f05326f55116b73
+75db098206b064b8b7b2a0604d3f0bf8fdb950cc

From c3258057f33952d1929a57e7baab803b460325a3 Mon Sep 17 00:00:00 2001
From: Sam Rabin
Date: Fri, 20 Jun 2025 17:57:08 -0600
Subject: [PATCH 80/97] Resolve pylint complaint.

---
 python/ctsm/site_and_regional/single_point_case.py | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/python/ctsm/site_and_regional/single_point_case.py b/python/ctsm/site_and_regional/single_point_case.py
index 9f709be648..d71d014f36 100644
--- a/python/ctsm/site_and_regional/single_point_case.py
+++ b/python/ctsm/site_and_regional/single_point_case.py
@@ -200,7 +200,8 @@ def check_dom_pft(self):
             If any dom_pft is less than 1.
         Error (ArgumentTypeError):
             If mixed land units are chosen.
-            dom_pft values are both in range of (0 - MAX_NAT_PFT) and (MAX_NAT_PFT+1 - MAX_PFT_MANAGEDCROPS).
+            dom_pft values are both in range of
+            (0 - MAX_NAT_PFT) and (MAX_NAT_PFT+1 - MAX_PFT_MANAGEDCROPS).
 
         """

From 7da3bfd00d31f18b5c4d3acda687f440f3766604 Mon Sep 17 00:00:00 2001
From: Samuel Levis
Date: Mon, 23 Jun 2025 12:15:17 -0600
Subject: [PATCH 81/97] Throw error if reseed_dead_plants = .true. in a branch simulation

---
 bld/CLMBuildNamelist.pm | 4 ++++
 1 file changed, 4 insertions(+)

diff --git a/bld/CLMBuildNamelist.pm b/bld/CLMBuildNamelist.pm
index 408ff9b573..f2d862fe7c 100755
--- a/bld/CLMBuildNamelist.pm
+++ b/bld/CLMBuildNamelist.pm
@@ -4478,6 +4478,10 @@ sub setup_logic_cngeneral {
                          "(eg. don't use these options with SP mode).");
       }
   }
+  if ( &value_is_true($nl->get_value('reseed_dead_plants')) &&
+       &remove_leading_and_trailing_quotes($nl_flags->{'clm_start_type'}) eq "branch") {
+     $log->fatal_error("reseed_dead_plants MUST be .false. in a branch run in order to match a corresponding continue simulation bit-for-bit");
+  }
 }
 
 #-------------------------------------------------------------------------------

From d63752e6f7898e3d2ffeeea66f099ffb53ad34f6 Mon Sep 17 00:00:00 2001
From: Samuel Levis
Date: Mon, 23 Jun 2025 12:58:00 -0600
Subject: [PATCH 82/97] Simplify the new error message

---
 bld/CLMBuildNamelist.pm | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/bld/CLMBuildNamelist.pm b/bld/CLMBuildNamelist.pm
index f2d862fe7c..346cc733a0 100755
--- a/bld/CLMBuildNamelist.pm
+++ b/bld/CLMBuildNamelist.pm
@@ -4480,7 +4480,7 @@ sub setup_logic_cngeneral {
   }
   if ( &value_is_true($nl->get_value('reseed_dead_plants')) &&
       &remove_leading_and_trailing_quotes($nl_flags->{'clm_start_type'}) eq "branch") {
-     $log->fatal_error("reseed_dead_plants MUST be .false. in a branch run in order to match a corresponding continue simulation bit-for-bit");
+     $log->fatal_error("reseed_dead_plants MUST be .false. in a branch run");
   }
 }

From c6b77a5a4612abb1ca0509b800873852984e22d4 Mon Sep 17 00:00:00 2001
From: Samuel Levis
Date: Mon, 23 Jun 2025 13:00:09 -0600
Subject: [PATCH 83/97] New test in build-namelist_test.pl confirming no reseed on branch

---
 bld/unit_testers/build-namelist_test.pl | 4 ++++
 1 file changed, 4 insertions(+)

diff --git a/bld/unit_testers/build-namelist_test.pl b/bld/unit_testers/build-namelist_test.pl
index ab7cb4edf9..9aa14e8f4a 100755
--- a/bld/unit_testers/build-namelist_test.pl
+++ b/bld/unit_testers/build-namelist_test.pl
@@ -678,6 +678,10 @@ sub cat_and_create_namelistinfile {
                      namelst=>"soil_decomp_method='None'",
                      phys=>"clm5_0",
                    },
+    "reseed with branch" =>{ options=>"-clm_start_type branch -envxml_dir .",
+                     namelst=>"reseed_dead_plants=.true.",
+                     phys=>"clm6_0",
+                   },
     "reseed without CN" =>{ options=>" -envxml_dir . -bgc sp",
                      namelst=>"reseed_dead_plants=.true.",
                      phys=>"clm5_0",

From 4fd500c39987c5657c3ec63727fce47b527b152f Mon Sep 17 00:00:00 2001
From: Sam Rabin
Date: Tue, 24 Jun 2025 13:09:22 -0600
Subject: [PATCH 84/97] Move a "don't run on forks" conditional.
---
 .github/workflows/docs-update-ctsm_pylib.yml        | 3 ---
 .github/workflows/docs-update-dependency-common.yml | 5 +++++
 .github/workflows/docs-update-doc-builder.yml       | 2 --
 3 files changed, 5 insertions(+), 5 deletions(-)

diff --git a/.github/workflows/docs-update-ctsm_pylib.yml b/.github/workflows/docs-update-ctsm_pylib.yml
index 64231ee515..fb3c3e054c 100644
--- a/.github/workflows/docs-update-ctsm_pylib.yml
+++ b/.github/workflows/docs-update-ctsm_pylib.yml
@@ -20,9 +20,6 @@ permissions:
   contents: read
 jobs:
   test-update-dependency:
-    # Don't run on forks
-    if: ${{ github.repository == 'ESCOMP/CTSM' }}
-
     name: Tests to run when either docs dependency is updated
     uses: ./.github/workflows/docs-update-dependency-common.yml
     with:
diff --git a/.github/workflows/docs-update-dependency-common.yml b/.github/workflows/docs-update-dependency-common.yml
index eee702c62f..522dcc973c 100644
--- a/.github/workflows/docs-update-dependency-common.yml
+++ b/.github/workflows/docs-update-dependency-common.yml
@@ -15,6 +15,11 @@ on:
 
 jobs:
   compare-docbuilder-vs-ctsmpylib:
+    # Don't run on forks, because test_container_eq_ctsm_pylib.sh uses
+    # build_docs_to_publish, which will look for branch(es) that forks
+    # may not have
+    if: ${{ github.repository == 'ESCOMP/CTSM' }}
+
     runs-on: ubuntu-latest
     steps:
       - uses: actions/checkout@v4
diff --git a/.github/workflows/docs-update-doc-builder.yml b/.github/workflows/docs-update-doc-builder.yml
index 74827aea11..ffe95f9e29 100644
--- a/.github/workflows/docs-update-doc-builder.yml
+++ b/.github/workflows/docs-update-doc-builder.yml
@@ -20,8 +20,6 @@ permissions:
   contents: read
 jobs:
   test-update-dependency:
-    # Don't run on forks
-    if: ${{ github.repository == 'ESCOMP/CTSM' }}
     name: Tests to run when either docs dependency is updated
     uses: ./.github/workflows/docs-update-dependency-common.yml

From eb8f83f5d1fc54e612e2babf6b2de05bfcec2e5a Mon Sep 17 00:00:00 2001
From: Sam Rabin
Date: Tue, 24 Jun 2025 13:12:17 -0600
Subject: [PATCH 85/97] Delete an unnecessary "don't run on forks."

---
 .github/workflows/docs-update-doc-builder.yml | 2 --
 1 file changed, 2 deletions(-)

diff --git a/.github/workflows/docs-update-doc-builder.yml b/.github/workflows/docs-update-doc-builder.yml
index ffe95f9e29..21964810e5 100644
--- a/.github/workflows/docs-update-doc-builder.yml
+++ b/.github/workflows/docs-update-doc-builder.yml
@@ -28,8 +28,6 @@ jobs:
       conda_env_name: ctsm_pylib
 
   test-rv-setup:
-    # Don't run on forks
-    if: ${{ github.repository == 'ESCOMP/CTSM' }}
 
     runs-on: ubuntu-latest
     steps:

From 722d5a517437bf1ea69b92f25d00588346ec3122 Mon Sep 17 00:00:00 2001
From: Sam Rabin
Date: Tue, 24 Jun 2025 13:13:04 -0600
Subject: [PATCH 86/97] Better explain use of one "don't run on forks."

---
 .github/workflows/docs-omnibus.yml | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/.github/workflows/docs-omnibus.yml b/.github/workflows/docs-omnibus.yml
index 587bc92731..8996b2aabd 100644
--- a/.github/workflows/docs-omnibus.yml
+++ b/.github/workflows/docs-omnibus.yml
@@ -18,7 +18,8 @@ on:
 
 jobs:
   build-docs-omnibus-test:
-    # Don't run on forks
+    # Don't run on forks, because part(s) of omnibus testing script will look for
+    # branch(es) that forks may not have.
    if: ${{ github.repository == 'ESCOMP/CTSM' }}

    runs-on: ubuntu-latest

From 3fa38aa607e3b25483cfc37bae896ba91b93726e Mon Sep 17 00:00:00 2001
From: Sam Rabin
Date: Tue, 24 Jun 2025 13:18:30 -0600
Subject: [PATCH 87/97] docs-update-dependency-common.yml: Set default conda variables.

---
 .github/workflows/docs-update-ctsm_pylib.yml        | 3 ---
 .github/workflows/docs-update-dependency-common.yml | 4 ++--
 .github/workflows/docs-update-doc-builder.yml       | 3 ---
 3 files changed, 2 insertions(+), 8 deletions(-)

diff --git a/.github/workflows/docs-update-ctsm_pylib.yml b/.github/workflows/docs-update-ctsm_pylib.yml
index fb3c3e054c..8cd92c56c1 100644
--- a/.github/workflows/docs-update-ctsm_pylib.yml
+++ b/.github/workflows/docs-update-ctsm_pylib.yml
@@ -22,6 +22,3 @@ jobs:
   test-update-dependency:
     name: Tests to run when either docs dependency is updated
     uses: ./.github/workflows/docs-update-dependency-common.yml
-    with:
-      conda_env_file: python/conda_env_ctsm_py.yml
-      conda_env_name: ctsm_pylib
diff --git a/.github/workflows/docs-update-dependency-common.yml b/.github/workflows/docs-update-dependency-common.yml
index 522dcc973c..f3c88a8b4b 100644
--- a/.github/workflows/docs-update-dependency-common.yml
+++ b/.github/workflows/docs-update-dependency-common.yml
@@ -6,11 +6,11 @@ on:
       conda_env_file:
         required: false
         type: string
-        default: ""
+        default: "python/conda_env_ctsm_py.yml"
       conda_env_name:
         required: false
         type: string
-        default: ""
+        default: "ctsm_pylib"
   secrets: {}
 
 jobs:
diff --git a/.github/workflows/docs-update-doc-builder.yml b/.github/workflows/docs-update-doc-builder.yml
index 21964810e5..b6970aba11 100644
--- a/.github/workflows/docs-update-doc-builder.yml
+++ b/.github/workflows/docs-update-doc-builder.yml
@@ -23,9 +23,6 @@ jobs:
     name: Tests to run when either docs dependency is updated
     uses: ./.github/workflows/docs-update-dependency-common.yml
-    with:
-      conda_env_file: python/conda_env_ctsm_py.yml
-      conda_env_name: ctsm_pylib
 
   test-rv-setup:

From 4a7d4114a3e24a1cc9cdd88cb7eee3d985fbe0e7 Mon Sep 17 00:00:00 2001
From: Sam Rabin
Date: Tue, 24 Jun 2025 13:24:11 -0600
Subject: [PATCH 88/97] docs-update-dependency-common.yml: Add comment re: inputs.
---
 .github/workflows/docs-update-dependency-common.yml | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/.github/workflows/docs-update-dependency-common.yml b/.github/workflows/docs-update-dependency-common.yml
index f3c88a8b4b..f857aed77f 100644
--- a/.github/workflows/docs-update-dependency-common.yml
+++ b/.github/workflows/docs-update-dependency-common.yml
@@ -3,6 +3,9 @@ name: Jobs shared by docs workflows that run when a dependency is updated
 on:
   workflow_call:
     inputs:
+      # Conda is always needed for both jobs in this workflow. Here,
+      # we set default values for the variables in case the calling
+      # workflow doesn't provide them.
       conda_env_file:
         required: false
         type: string

From 0799aeb824599712e1febedc0dbc9fec5d6498b6 Mon Sep 17 00:00:00 2001
From: Sam Rabin
Date: Tue, 24 Jun 2025 13:37:08 -0600
Subject: [PATCH 89/97] Combine ctsm_pylib docs workflow files.

---
 .github/workflows/docs-ctsm_pylib.yml        | 17 +++++++++++---
 .github/workflows/docs-update-ctsm_pylib.yml | 24 --------------------
 2 files changed, 14 insertions(+), 27 deletions(-)
 delete mode 100644 .github/workflows/docs-update-ctsm_pylib.yml

diff --git a/.github/workflows/docs-ctsm_pylib.yml b/.github/workflows/docs-ctsm_pylib.yml
index 280e236290..480ae645ef 100644
--- a/.github/workflows/docs-ctsm_pylib.yml
+++ b/.github/workflows/docs-ctsm_pylib.yml
@@ -1,4 +1,4 @@
-name: Test building docs with ctsm_pylib
+name: Docs tests to run when ctsm_pylib is updated
 
 on:
   push:
@@ -8,6 +8,7 @@ on:
       - 'python/conda_env_ctsm_py.txt'
       - 'doc/ctsm-docs_container/requirements.txt'
       - '.github/workflows/docs-common.yml'
+      - '.github/workflows/docs-update-dependency-common.yml'
 
   pull_request:
     # Run on pull requests that change the listed files
@@ -15,6 +16,7 @@ on:
       - 'python/conda_env_ctsm_py.txt'
       - 'doc/ctsm-docs_container/requirements.txt'
       - '.github/workflows/docs-common.yml'
+      - '.github/workflows/docs-update-dependency-common.yml'
 
   schedule:
     # 8 am every Monday UTC
@@ -27,14 +29,23 @@ permissions:
 jobs:
   test-build-docs-ctsm_pylib:
     if: ${{ always() }}
-    name: With ctsm_pylib
+    name: Build with ctsm_pylib
     uses: ./.github/workflows/docs-common.yml
     with:
       use_conda: true
      conda_env_file: python/conda_env_ctsm_py.yml
       conda_env_name: ctsm_pylib
 
-  # File an issue if the docs build failed during a scheduled run
+  test-update-dependency:
+    if: ${{ always() }}
+    name: Tests for when either docs dependency is updated
+    uses: ./.github/workflows/docs-update-dependency-common.yml
+
+  # File an issue if the docs build failed during a scheduled run.
+  # The main thing we're concerned about in that case is something having
+  # changed outside the repository that's causing the ctsm_pylib setup to
+  # fail. Thus, we don't need this job to wait for BOTH the above jobs---
+  # if one fails, they both will.
   file-issue-on-failure:
     if: |
       failure() &&
diff --git a/.github/workflows/docs-update-ctsm_pylib.yml b/.github/workflows/docs-update-ctsm_pylib.yml
deleted file mode 100644
index 8cd92c56c1..0000000000
--- a/.github/workflows/docs-update-ctsm_pylib.yml
+++ /dev/null
@@ -1,24 +0,0 @@
-name: Docs tests to run when ctsm_pylib is updated
-
-on:
-  push:
-    # Run when a change to these files is pushed to any branch. Without the "branches:" line, for some reason this will be run whenever a tag is pushed, even if the listed files aren't changed.
-    branches: ['*']
-    paths:
-      - 'python/conda_env_ctsm_py.txt'
-      - 'doc/ctsm-docs_container/requirements.txt'
-
-  pull_request:
-    # Run on pull requests that change the listed files
-    paths:
-      - 'python/conda_env_ctsm_py.txt'
-      - 'doc/ctsm-docs_container/requirements.txt'
-
-  workflow_dispatch:
-
-permissions:
-  contents: read
-jobs:
-  test-update-dependency:
-    name: Tests to run when either docs dependency is updated
-    uses: ./.github/workflows/docs-update-dependency-common.yml

From 345145fb4394a6c641740b73639a31c9c80be2d8 Mon Sep 17 00:00:00 2001
From: Sam Rabin
Date: Tue, 24 Jun 2025 13:37:47 -0600
Subject: [PATCH 90/97] Rename ctsm_pylib docs workflow file.

---
 .../workflows/{docs-ctsm_pylib.yml => docs-update-ctsm_pylib.yml} | 0
 1 file changed, 0 insertions(+), 0 deletions(-)
 rename .github/workflows/{docs-ctsm_pylib.yml => docs-update-ctsm_pylib.yml} (100%)

diff --git a/.github/workflows/docs-ctsm_pylib.yml b/.github/workflows/docs-update-ctsm_pylib.yml
similarity index 100%
rename from .github/workflows/docs-ctsm_pylib.yml
rename to .github/workflows/docs-update-ctsm_pylib.yml

From 67394de4a98744e72c5d506ce85cbefe38a8013c Mon Sep 17 00:00:00 2001
From: Sam Rabin
Date: Tue, 24 Jun 2025 13:48:08 -0600
Subject: [PATCH 91/97] Improve workflow job names.

---
 .github/workflows/docs-update-ctsm_pylib.yml        | 2 +-
 .github/workflows/docs-update-dependency-common.yml | 2 ++
 2 files changed, 3 insertions(+), 1 deletion(-)

diff --git a/.github/workflows/docs-update-ctsm_pylib.yml b/.github/workflows/docs-update-ctsm_pylib.yml
index 480ae645ef..865f092f92 100644
--- a/.github/workflows/docs-update-ctsm_pylib.yml
+++ b/.github/workflows/docs-update-ctsm_pylib.yml
@@ -38,7 +38,7 @@ jobs:
 
   test-update-dependency:
     if: ${{ always() }}
-    name: Tests for when either docs dependency is updated
+    name: Docs dependency update tests
     uses: ./.github/workflows/docs-update-dependency-common.yml
 
   # File an issue if the docs build failed during a scheduled run.
diff --git a/.github/workflows/docs-update-dependency-common.yml b/.github/workflows/docs-update-dependency-common.yml
index f857aed77f..9f1d1e3f4e 100644
--- a/.github/workflows/docs-update-dependency-common.yml
+++ b/.github/workflows/docs-update-dependency-common.yml
@@ -18,6 +18,8 @@ on:
 jobs:
   compare-docbuilder-vs-ctsmpylib:
+    name: Are both methods identical?
+
     # Don't run on forks, because test_container_eq_ctsm_pylib.sh uses
     # build_docs_to_publish, which will look for branch(es) that forks
     # may not have
     if: ${{ github.repository == 'ESCOMP/CTSM' }}

From e168ad3da56604f5aed3f22a494ee520ce62eed6 Mon Sep 17 00:00:00 2001
From: Sam Rabin
Date: Tue, 24 Jun 2025 14:06:05 -0600
Subject: [PATCH 92/97] Rename doc/testing/ to doc/test/.

---
 .github/workflows/docs-build-and-deploy.yml           | 4 ++--
 .github/workflows/docs-omnibus.yml                    | 6 +++---
 .github/workflows/docs-update-dependency-common.yml   | 4 ++--
 .github/workflows/docs-update-doc-builder.yml         | 2 +-
 .github/workflows/docs.yml                            | 8 ++++----
 .../building-docs-multiple-versions.rst               | 2 +-
 doc/{testing => test}/compose_test_cmd.sh             | 0
 doc/{testing => test}/test_build_docs_-b.sh           | 0
 doc/{testing => test}/test_build_docs_-r-v.sh         | 0
 doc/{testing => test}/test_container_eq_ctsm_pylib.sh | 0
 doc/{testing => test}/test_doc-builder_tests.sh       | 0
 doc/{testing => test}/test_makefile_method.sh         | 0
 doc/{testing => test}/testing.sh                      | 0
 13 files changed, 13 insertions(+), 13 deletions(-)
 rename doc/{testing => test}/compose_test_cmd.sh (100%)
 rename doc/{testing => test}/test_build_docs_-b.sh (100%)
 rename doc/{testing => test}/test_build_docs_-r-v.sh (100%)
 rename doc/{testing => test}/test_container_eq_ctsm_pylib.sh (100%)
 rename doc/{testing => test}/test_doc-builder_tests.sh (100%)
 rename doc/{testing => test}/test_makefile_method.sh (100%)
 rename doc/{testing => test}/testing.sh (100%)

diff --git a/.github/workflows/docs-build-and-deploy.yml b/.github/workflows/docs-build-and-deploy.yml
index 72be23d0f8..1b0c0cb412 100644
--- a/.github/workflows/docs-build-and-deploy.yml
+++ b/.github/workflows/docs-build-and-deploy.yml
@@ -6,14 +6,14 @@ on:
     branches: ['master', 'release-clm5.0']
     paths:
       - 'doc/**'
-      - '!doc/testing/*'
+      - '!doc/test/*'
       - '!doc/*ChangeLog*'
       - '!doc/*ChangeSum*'
       - '!doc/UpdateChangelog.pl'
       # Include all include::ed files outside doc/ directory!
       - 'src/README.unit_testing'
       - 'tools/README'
-      - 'doc/testing/test_container_eq_ctsm_pylib.sh'
+      - 'doc/test/test_container_eq_ctsm_pylib.sh'
 
   # Allows you to run this workflow manually from the Actions tab
   workflow_dispatch:
diff --git a/.github/workflows/docs-omnibus.yml b/.github/workflows/docs-omnibus.yml
index 8996b2aabd..1c73eb8224 100644
--- a/.github/workflows/docs-omnibus.yml
+++ b/.github/workflows/docs-omnibus.yml
@@ -5,13 +5,13 @@ on:
     # Run when a change to these files is pushed to any branch. Without the "branches:" line, for some reason this will be run whenever a tag is pushed, even if the listed files aren't changed.
     branches: ['*']
     paths:
-      - 'doc/testing/*'
+      - 'doc/test/*'
       - 'doc/Makefile'
 
   pull_request:
     # Run on pull requests that change the listed files
     paths:
-      - 'doc/testing/*'
+      - 'doc/test/*'
       - 'doc/Makefile'
 
   workflow_dispatch:
@@ -45,4 +45,4 @@ jobs:
 
       - name: Text Sphinx builds with omnibus script
         run: |
-          cd doc/testing && ./testing.sh
+          cd doc/test/ && ./testing.sh
diff --git a/.github/workflows/docs-update-dependency-common.yml b/.github/workflows/docs-update-dependency-common.yml
index 9f1d1e3f4e..a64e1a8ad5 100644
--- a/.github/workflows/docs-update-dependency-common.yml
+++ b/.github/workflows/docs-update-dependency-common.yml
@@ -47,7 +47,7 @@ jobs:
 
       - name: Compare docs built with container vs. ctsm_pylib
         run: |
-          cd doc/testing
+          cd doc/test/
           ./test_container_eq_ctsm_pylib.sh
 
   makefile-method:
@@ -73,5 +73,5 @@
 
       - name: Check that Makefile method works
         run: |
-          cd doc/testing
+          cd doc/test/
           conda run -n ${{ inputs.conda_env_name }} --no-capture-output ./test_makefile_method.sh
diff --git a/.github/workflows/docs-update-doc-builder.yml b/.github/workflows/docs-update-doc-builder.yml
index b6970aba11..0756ed94c5 100644
--- a/.github/workflows/docs-update-doc-builder.yml
+++ b/.github/workflows/docs-update-doc-builder.yml
@@ -40,4 +40,4 @@
 
       - name: build_docs rv method
         run: |
-          cd doc/testing && ./test_build_docs_-r-v.sh docker
+          cd doc/test/ && ./test_build_docs_-r-v.sh docker
diff --git a/.github/workflows/docs.yml b/.github/workflows/docs.yml
index 47c1da5345..362818eb90 100644
--- a/.github/workflows/docs.yml
+++ b/.github/workflows/docs.yml
@@ -7,7 +7,7 @@ on:
     branches: ['*']
     paths:
       - 'doc/**'
-      - '!doc/testing/*'
+      - '!doc/test/*'
       - '!doc/*ChangeLog*'
       - '!doc/*ChangeSum*'
       - '!doc/UpdateChangelog.pl'
@@ -15,13 +15,13 @@ on:
      # Include all include::ed files outside doc/ directory!
       - 'src/README.unit_testing'
       - 'tools/README'
-      - 'doc/testing/test_container_eq_ctsm_pylib.sh'
+      - 'doc/test/test_container_eq_ctsm_pylib.sh'
 
   pull_request:
     # Run on pull requests that change the listed files
     paths:
       - 'doc/**'
-      - '!doc/testing/*'
+      - '!doc/test/*'
       - '!doc/*ChangeLog*'
       - '!doc/*ChangeSum*'
       - '!doc/UpdateChangelog.pl'
@@ -29,7 +29,7 @@ on:
       # Include all include::ed files outside doc/ directory!
       - 'src/README.unit_testing'
       - 'tools/README'
-      - 'doc/testing/test_container_eq_ctsm_pylib.sh'
+      - 'doc/test/test_container_eq_ctsm_pylib.sh'
 
   workflow_dispatch:
diff --git a/doc/source/users_guide/working-with-documentation/building-docs-multiple-versions.rst b/doc/source/users_guide/working-with-documentation/building-docs-multiple-versions.rst
index 9859a82063..895dbf2a65 100644
--- a/doc/source/users_guide/working-with-documentation/building-docs-multiple-versions.rst
+++ b/doc/source/users_guide/working-with-documentation/building-docs-multiple-versions.rst
@@ -7,7 +7,7 @@ There is a menu in the lower left of the webpage that lets readers switch betwee
 
 Note that this is not necessary in order for you to contribute an update to the documentation. GitHub will test this automatically when you open a PR. But if you'd like to try, this will generate a local site for you in ``_publish/`` and then open it:
 
-.. literalinclude:: ../../../testing/test_container_eq_ctsm_pylib.sh
+.. literalinclude:: ../../../test/test_container_eq_ctsm_pylib.sh
    :start-at: ./build_docs_to_publish
    :end-before: VERSION LINKS WILL NOT RESOLVE
    :append: CMD _publish/index.html # where CMD is open for Mac or wslview for Windows (Ubuntu VM)
diff --git a/doc/testing/compose_test_cmd.sh b/doc/test/compose_test_cmd.sh
similarity index 100%
rename from doc/testing/compose_test_cmd.sh
rename to doc/test/compose_test_cmd.sh
diff --git a/doc/testing/test_build_docs_-b.sh b/doc/test/test_build_docs_-b.sh
similarity index 100%
rename from doc/testing/test_build_docs_-b.sh
rename to doc/test/test_build_docs_-b.sh
diff --git a/doc/testing/test_build_docs_-r-v.sh b/doc/test/test_build_docs_-r-v.sh
similarity index 100%
rename from doc/testing/test_build_docs_-r-v.sh
rename to doc/test/test_build_docs_-r-v.sh
diff --git a/doc/testing/test_container_eq_ctsm_pylib.sh b/doc/test/test_container_eq_ctsm_pylib.sh
similarity index 100%
rename from doc/testing/test_container_eq_ctsm_pylib.sh
rename to doc/test/test_container_eq_ctsm_pylib.sh
diff --git a/doc/testing/test_doc-builder_tests.sh b/doc/test/test_doc-builder_tests.sh
similarity index 100%
rename from doc/testing/test_doc-builder_tests.sh
rename to doc/test/test_doc-builder_tests.sh
diff --git a/doc/testing/test_makefile_method.sh b/doc/test/test_makefile_method.sh
similarity index 100%
rename from doc/testing/test_makefile_method.sh
rename to doc/test/test_makefile_method.sh
diff --git a/doc/testing/testing.sh b/doc/test/testing.sh
similarity index 100%
rename from doc/testing/testing.sh
rename to doc/test/testing.sh

From e0292f427574b0d88a49eac2b4cd334f2b659f42 Mon Sep 17 00:00:00 2001
From: Sam Rabin
Date: Tue, 24 Jun 2025 14:11:27 -0600
Subject: [PATCH 93/97] Finish renaming doc/testing/ to doc/test/.

---
 doc/test/test_build_docs_-b.sh   | 2 +-
 doc/test/test_build_docs_-r-v.sh | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/doc/test/test_build_docs_-b.sh b/doc/test/test_build_docs_-b.sh
index 00781685b0..9d25c67489 100755
--- a/doc/test/test_build_docs_-b.sh
+++ b/doc/test/test_build_docs_-b.sh
@@ -9,7 +9,7 @@ cd "${SCRIPT_DIR}/.."
 
 msg="~~~~~ Check that -b works"
 cmd="./build_docs -b _build -c"
-. testing/compose_test_cmd.sh
+. test/compose_test_cmd.sh
 
 set -x
 $cmd
diff --git a/doc/test/test_build_docs_-r-v.sh b/doc/test/test_build_docs_-r-v.sh
index 267e52e53c..b0fe767a2b 100755
--- a/doc/test/test_build_docs_-r-v.sh
+++ b/doc/test/test_build_docs_-r-v.sh
@@ -9,7 +9,7 @@ cd "${SCRIPT_DIR}/.."
 
 msg="~~~~~ Check that -r -v works"
 cmd="./build_docs -r _build -v latest -c --conf-py-path doc-builder/test/conf.py --static-path ../_static --templates-path ../_templates"
-. testing/compose_test_cmd.sh
+. test/compose_test_cmd.sh
 
 set -x
 $cmd

From 1cdcafa5bb4bd9d13332ff8a389de72b53f35bc4 Mon Sep 17 00:00:00 2001
From: Sam Rabin
Date: Tue, 24 Jun 2025 14:13:22 -0600
Subject: [PATCH 94/97] Reword a success message.
---
 doc/test/test_container_eq_ctsm_pylib.sh | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/doc/test/test_container_eq_ctsm_pylib.sh b/doc/test/test_container_eq_ctsm_pylib.sh
index 44185aa9fb..cd2c2917c9 100755
--- a/doc/test/test_container_eq_ctsm_pylib.sh
+++ b/doc/test/test_container_eq_ctsm_pylib.sh
@@ -30,6 +30,6 @@ cp -a _publish "${d2}"
 # Make sure container version is identical to no-container version
 echo "~~~~~ Make sure container version is identical to no-container version"
 diff -qr "${d1}" "${d2}"
-echo "Yep!"
+echo "Successful: Docs built with container are identical to those built without"
 
 exit 0

From 51c6852df02b1eb746cdfebfed4c921a204076f4 Mon Sep 17 00:00:00 2001
From: Sam Rabin
Date: Tue, 24 Jun 2025 14:19:49 -0600
Subject: [PATCH 95/97] Shorten a workflow title.

---
 .github/workflows/docker-image-build.yml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/.github/workflows/docker-image-build.yml b/.github/workflows/docker-image-build.yml
index 1512daeed6..6d38e12c8b 100644
--- a/.github/workflows/docker-image-build.yml
+++ b/.github/workflows/docker-image-build.yml
@@ -1,5 +1,5 @@
 # Modified from https://docs.github.com/en/packages/managing-github-packages-using-github-actions-workflows/publishing-and-installing-a-package-with-github-actions#publishing-a-package-using-an-action (last accessed 2025-05-09)
-name: Test building ctsm-docs Docker image and using it to build the docs
+name: Build and test ctsm-docs container
 
 # Configures this workflow to run every time a change in the Docker container setup is pushed or included in a PR
 on:

From ab546d508ce209ae342cc39b7f97adccb0270dae Mon Sep 17 00:00:00 2001
From: Sam Rabin
Date: Tue, 24 Jun 2025 15:17:17 -0600
Subject: [PATCH 96/97] docs/test/*sh: Explain "set -e".

---
 doc/test/test_build_docs_-b.sh           | 2 ++
 doc/test/test_build_docs_-r-v.sh         | 2 ++
 doc/test/test_container_eq_ctsm_pylib.sh | 2 ++
 doc/test/test_doc-builder_tests.sh       | 2 ++
 doc/test/test_makefile_method.sh         | 2 ++
 doc/test/testing.sh                      | 2 ++
 6 files changed, 12 insertions(+)

diff --git a/doc/test/test_build_docs_-b.sh b/doc/test/test_build_docs_-b.sh
index 9d25c67489..8b49e2f7aa 100755
--- a/doc/test/test_build_docs_-b.sh
+++ b/doc/test/test_build_docs_-b.sh
@@ -1,4 +1,6 @@
 #!/usr/bin/env bash
+
+# Fail on any non-zero exit code
 set -e
 
 cli_tool="$1"
diff --git a/doc/test/test_build_docs_-r-v.sh b/doc/test/test_build_docs_-r-v.sh
index b0fe767a2b..6f9415b563 100755
--- a/doc/test/test_build_docs_-r-v.sh
+++ b/doc/test/test_build_docs_-r-v.sh
@@ -1,4 +1,6 @@
 #!/usr/bin/env bash
+
+# Fail on any non-zero exit code
 set -e
 
 cli_tool="$1"
diff --git a/doc/test/test_container_eq_ctsm_pylib.sh b/doc/test/test_container_eq_ctsm_pylib.sh
index cd2c2917c9..729f1b723e 100755
--- a/doc/test/test_container_eq_ctsm_pylib.sh
+++ b/doc/test/test_container_eq_ctsm_pylib.sh
@@ -1,4 +1,6 @@
 #!/usr/bin/env bash
+
+# Fail on any non-zero exit code
 set -e
 
 # Compare docs built with container vs. ctsm_pylib
diff --git a/doc/test/test_doc-builder_tests.sh b/doc/test/test_doc-builder_tests.sh
index 62c8759587..07cfa73ea1 100755
--- a/doc/test/test_doc-builder_tests.sh
+++ b/doc/test/test_doc-builder_tests.sh
@@ -1,4 +1,6 @@
 #!/usr/bin/env bash
+
+# Fail on any non-zero exit code
 set -e
 
 SCRIPT_DIR="$( cd -- "$( dirname -- "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )"
diff --git a/doc/test/test_makefile_method.sh b/doc/test/test_makefile_method.sh
index dd62e770a4..b0fd80984e 100755
--- a/doc/test/test_makefile_method.sh
+++ b/doc/test/test_makefile_method.sh
@@ -1,4 +1,6 @@
 #!/usr/bin/env bash
+
+# Fail on any non-zero exit code
 set -e
 
 cli_tool="$1"
diff --git a/doc/test/testing.sh b/doc/test/testing.sh
index bd1c1ca530..2e91025e6c 100755
--- a/doc/test/testing.sh
+++ b/doc/test/testing.sh
@@ -1,4 +1,6 @@
 #!/usr/bin/env bash
+
+# Fail on any non-zero exit code
 set -e
 
 SCRIPT_DIR="$( cd -- "$( dirname -- "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )"

From b2330f436b29e7e51c174f272def18d9f1351f7e Mon Sep 17 00:00:00 2001
From: Samuel Levis
Date: Thu, 26 Jun 2025 11:36:16 -0600
Subject: [PATCH 97/97] Update ChangeLog and ChangeSum

---
 doc/ChangeLog | 80 +++++++++++++++++++++++++++++++++++++++++++++++++++
 doc/ChangeSum |  1 +
 2 files changed, 81 insertions(+)

diff --git a/doc/ChangeLog b/doc/ChangeLog
index 0c1927c239..569347dfce 100644
--- a/doc/ChangeLog
+++ b/doc/ChangeLog
@@ -1,4 +1,84 @@
 ===============================================================
+Tag name: ctsm5.3.061
+Originator(s): slevis (Samuel Levis,UCAR/TSS,303-665-1310)
+Date: Thu 26 Jun 2025 11:28:43 AM MDT
+One-line Summary: Merge b4b-dev to master
+
+Purpose and description of changes
+----------------------------------
+PR #3231 Clean up docs workflows
+Resolves #3160
+Resolves #3213
+
+PR #3272 Throw error if reseed_dead_plants = .true.
in a branch simulation +Resolves #3257 + +PR #3264 Fix plumber2_surf_wrapper +Resolves #3262 + +PR #3259 subset_data point: Fix --create-datm and Longitude TypeErrors +Resolves #3258 +Resolves #3260 +Resolves #3197 +Resolves #2960 + +PR #3227 Docs docs: Update Windows instructions +Resolves #3185 + +Significant changes to scientifically-supported configurations +-------------------------------------------------------------- + +Does this tag change answers significantly for any of the following physics configurations? +(Details of any changes will be given in the "Answer changes" section below.) + + [Put an [X] in the box for any configuration with significant answer changes.] + +[ ] clm6_0 + +[ ] clm5_0 + +[ ] ctsm5_0-nwp + +[ ] clm4_5 + + +Bugs fixed +---------- +List of CTSM issues fixed (include CTSM Issue # and description) [one per line]: + Listed along with corresponding PRs in "Purpose and description of changes" above + +Notes of particular relevance for users +--------------------------------------- +Changes to CTSM's user interface (e.g., new/renamed XML or namelist variables): + #3272 Throw error if reseed_dead_plants = .true. in a branch simulation + +Changes to documentation: + #3227 Docs docs: Update Windows instructions + +Testing summary: +---------------- + [PASS means all tests PASS; OK means tests PASS other than expected fails.] 
+ + build-namelist tests (if CLMBuildNamelist.pm has changed): + + derecho - PASS + + regular tests (aux_clm: https://github.com/ESCOMP/CTSM/wiki/System-Testing-Guide#pre-merge-system-testing): + + derecho ----- OK + izumi ------- OK + +Answer changes +-------------- +Changes answers relative to baseline: No + +Other details +------------- +Pull Requests that document the changes (include PR ids): + https://github.com/ESCOMP/ctsm/pull/3283 + +=============================================================== +=============================================================== Tag name: ctsm5.3.060 Originator(s): slevis (Samuel Levis,UCAR/TSS,303-665-1310) Date: Tue 24 Jun 2025 02:13:05 PM MDT diff --git a/doc/ChangeSum b/doc/ChangeSum index 522c814f4a..e10850838e 100644 --- a/doc/ChangeSum +++ b/doc/ChangeSum @@ -1,5 +1,6 @@ Tag Who Date Summary ============================================================================================================================ + ctsm5.3.061 slevis 06/26/2025 Merge b4b-dev to master ctsm5.3.060 slevis 06/24/2025 Preliminary update of ctsm54 defaults (answer changing) ctsm5.3.059 erik 06/23/2025 Bring in various cleanup efforts found in previous testing after the chill changes came in ctsm5.3.058 samrabin 06/16/2025 Fix clm6 compset aliases