Spinup-Evaluation

This repository contains code for benchmarking the machine-learning spin-up of ocean models. It is designed to pair with Spinup-Forecast, which provides the machine learning models that accelerate the spin-up process for NEMO/DINO. The goal of this evaluation is to assess the performance of the spin-up process in terms of stability and convergence.

The evaluation is performed using the main.py script, which calls a set of metrics defined in src/metrics.py. The results are saved to a .txt file.

The repository is laid out as follows:

  • main.py: The main script to run the evaluation.
  • src/metrics.py: Contains the definitions of the metrics used for evaluation.
  • src/utils.py: Contains utility functions for data processing and visualization.

main.py is the entry point for the evaluation process. It takes the following command-line arguments:

  • --restart: Path to model restart file.
  • --mesh-mask: The name of the mesh mask file.
  • --output: The name of the output file where the metrics are stored. The default is metric_results.txt.
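The flags above could be wired up with argparse along these lines. This is a hypothetical sketch of the argument parsing, not the actual contents of main.py; the function name parse_args is illustrative:

```python
import argparse


def parse_args(argv=None):
    """Parse the command-line flags documented above (hypothetical sketch)."""
    parser = argparse.ArgumentParser(description="Spinup-Evaluation entry point")
    parser.add_argument("--restart", help="Path to model restart file")
    parser.add_argument("--mesh-mask", help="Name of the mesh mask file")
    parser.add_argument(
        "--output",
        default="metric_results.txt",
        help="Output file where the metrics are stored",
    )
    return parser.parse_args(argv)
```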

Usage [TODO]

This repo is a work in progress and usage is subject to change. Figure 1 below shows how the evaluation procedure works in Spinup-Evaluation.

Fig 1. Evaluation flow diagram

Spinup-Evaluation was developed to assess the DINO configuration of NEMO, but new metrics can be added to src/metrics.py to make it compatible with any ocean model. See Adding New Metrics for details.

Running on Saved Restart File

To evaluate a state obtained from a checkpoint, run Spinup-Evaluation as follows.

python main.py \
  --restart <path-to-restart.nc> \
  --mesh-mask <path-to-mesh_mask.nc> \
  --output <path-to-output>

Running on Predictions [TODO]

To evaluate a new spin-up state obtained using Spinup-Forecast, pass the following flag:

  • --predictions: The path to the directory containing the new pred_[variable].npy spin-up states from Spinup-Forecast:
    • pred_so.npy
    • pred_thetao.npy
    • pred_zos.npy
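Loading those prediction files might be sketched as follows. This is a hypothetical helper, not the repo's actual loading code; the variable names are the CMIP/CMOR-style identifiers listed above:

```python
from pathlib import Path

import numpy as np


def load_predictions(pred_dir):
    """Load the pred_[variable].npy spin-up states (hypothetical sketch).

    so = salinity, thetao = potential temperature, zos = sea surface height.
    """
    variables = ["so", "thetao", "zos"]
    return {v: np.load(Path(pred_dir) / f"pred_{v}.npy") for v in variables}
```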

Installation

To install Spinup-Evaluation, clone the repository and create a virtual environment:

git clone https://github.com/m2lines/Spinup-Evaluation.git
cd Spinup-Evaluation
python -m venv venv
source venv/bin/activate  # On Windows use `venv\Scripts\activate`

Then, install the required packages:

pip install -e .

For a development install, some further steps are recommended:

cd Spinup-Evaluation

# Install optional dev dependencies
pip install -e .[dev]

# Configure pre-commit hooks
pre-commit install

Adding New Metrics [TODO]

To add new metrics to the evaluation, add them to src/metrics.py. Further guidance will be provided in the future.
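A new metric might take the following shape: a function that accepts model fields (and any grid information it needs) and returns a scalar to be written to the results file. The function below is a hypothetical example, not one of the repo's existing metrics:

```python
import numpy as np


def global_mean(field, cell_area):
    """Hypothetical metric: area-weighted global mean of a 2-D field.

    field     : 2-D array of a model variable (e.g. surface temperature)
    cell_area : 2-D array of grid-cell areas on the same grid
    """
    return float(np.sum(field * cell_area) / np.sum(cell_area))
```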

Testing [TODO]

Tests are provided in the tests directory. To run the tests, use the following command:

pytest tests/
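A minimal test in that directory might look like the sketch below. This is an illustrative, self-contained example, not the repo's actual test suite:

```python
import numpy as np


def test_area_weighted_mean_of_uniform_field():
    """The area-weighted mean of a uniform field equals the field value."""
    field = np.full((4, 4), 2.5)
    area = np.ones((4, 4))
    result = float((field * area).sum() / area.sum())
    assert np.isclose(result, 2.5)
```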

Restarting NEMO/DINO [TODO]

When running the metrics on updated predictions, you can also pass the --restart flag to main.py, referencing the old restart file. This produces an updated restart file, its name prepended with "NEW", in a format that can be used to restart the model.
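Deriving the updated restart filename could look like this. The helper name is hypothetical; only the "NEW" prefix convention comes from the description above:

```python
from pathlib import Path


def new_restart_path(old_restart):
    """Hypothetical helper: name for the updated restart file.

    Prepends "NEW" to the old restart's filename, keeping its directory.
    """
    p = Path(old_restart)
    return p.with_name("NEW" + p.name)
```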

See this GitHub Gist for more information on the steps involved: https://gist.github.com/ma595/bf2b977593171d7e2cd840dd4b452ead

See Spinup-Forecast for how to generate the spin-up predictions used as input here.

Acknowledgements

This work builds on significant contributions by Etienne Meunier, whose efforts on the Metrics-Ocean repository laid the foundation for several components used here.
