This guide covers the prerequisites, build process, configuration, and execution of the CECE component.
To build CECE, you need the following dependencies:
- C++20 Compiler (GCC 10+, Clang 12+)
- CMake (3.20+)
- Kokkos (4.0+)
- ESMF (8.0+)
- MPI (OpenMPI, MPICH, etc.)
- yaml-cpp (0.7+)
- TIDE (Temporal Interpolation & Data Extraction)
- NetCDF (C and Fortran interfaces)
- Python 3.8+ (for scripts and testing)
The easiest way to get started is using the JCSDA development container, which comes with all dependencies pre-installed.
- Run the Setup Script:

  ```bash
  ./setup.sh
  ```

- Activate the Environment. Inside the container, ensure the Spack environment is active:

  ```bash
  source /opt/spack-environment/activate.sh
  ```
If you encounter overlayfs errors or other Docker-related environment issues when running the setup script, you can use the provided fix utility:
```bash
./scripts/fix_docker_and_setup.sh
```

## Building

```bash
# Inside the JCSDA Docker container
source /opt/spack-environment/activate.sh
mkdir build && cd build
cmake .. -DCMAKE_BUILD_TYPE=Release
make -j$(nproc)
```

| Option | Description | Default |
|---|---|---|
| CMAKE_BUILD_TYPE | Build type (Release, Debug) | Release |
| Kokkos_ENABLE_SERIAL | Enable Serial execution space | ON |
| Kokkos_ENABLE_OPENMP | Enable OpenMP multi-core support | ON |
| Kokkos_ENABLE_CUDA | Enable NVIDIA GPU support | OFF |
| Kokkos_ENABLE_HIP | Enable AMD GPU support | OFF |
Example for targeting NVIDIA GPUs:

```bash
cmake .. -DKokkos_ENABLE_CUDA=ON -DKokkos_ARCH_AMPERE80=ON
```

Example for CPU-only with OpenMP:

```bash
cmake .. -DKokkos_ENABLE_SERIAL=ON -DKokkos_ENABLE_OPENMP=ON
```

## Configuration

CECE is configured through a YAML file that defines emission species, data sources, scaling factors, and processing parameters. The configuration system is built around the Stacking Engine, which combines multiple emission layers according to hierarchy and scaling rules.
For complete configuration reference with all available options, see the Configuration Documentation.
For technical details about how the Stacking Engine processes these configurations, see the Stacking Engine Documentation.
```yaml
# Driver timing configuration
driver:
  start_time: "2020-01-01T00:00:00"
  end_time: "2020-01-01T06:00:00"
  timestep_seconds: 3600

# Computational grid specification
grid:
  nx: 144
  ny: 91
  lon_min: -180.0
  lon_max: 177.5
  lat_min: -90.0
  lat_max: 90.0

# Species with hierarchical emission layers
species:
  co:
    - field: "global_co_inventory"
      operation: "add"
      scale: 1.0
      category: "anthropogenic"
      hierarchy: 1
  nox:
    - field: "surface_nox"
      operation: "add"
      category: "anthropogenic"
      hierarchy: 1
      vdist_method: "PBL"      # Distribute in boundary layer
    - field: "aircraft_nox"
      operation: "add"
      category: "transportation"
      hierarchy: 1
      vdist_method: "HEIGHT"   # Distribute by altitude
      vdist_h_start: 9000.0    # 9-12 km cruise altitude
      vdist_h_end: 12000.0

# Physics schemes for process-based emissions
physics_schemes:
  - name: "sea_salt"
    language: "cpp"
    options:
      r_sala_min: 0.01
      r_sala_max: 0.5

# TIDE data streams for external inventories
cece_data:
  streams:
    - name: "GLOBAL_INVENTORY"
      file: "/data/inventories/global_emissions.nc"
      yearFirst: 2020
      yearLast: 2020
      yearAlign: 2020
      taxmode: "cycle"
      variables:
        - file: "CO_total"
          model: "global_co_inventory"

# Diagnostic and output configuration
diagnostics:
  output_interval_seconds: 3600
  variables: ["co", "nox"]

output:
  enabled: true
  directory: "./output"
  filename_pattern: "cece_{YYYY}{MM}{DD}_{HH}.nc"
  frequency_steps: 1
  fields: ["co", "nox"]
```
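As a sanity check, the `driver` and `grid` blocks above imply a fixed number of timesteps and a regular grid spacing. A stdlib-only sketch (values copied from the example configuration, not read from a file):

```python
from datetime import datetime

# Values mirror the driver block in the example configuration above
start = datetime.fromisoformat("2020-01-01T00:00:00")
end = datetime.fromisoformat("2020-01-01T06:00:00")
timestep_seconds = 3600

# Number of timesteps the driver will take over the simulation window
n_steps = int((end - start).total_seconds() // timestep_seconds)

# Grid spacing implied by nx=144 points on [-180.0, 177.5] and
# ny=91 points on [-90.0, 90.0]
dlon = (177.5 - (-180.0)) / (144 - 1)  # 2.5 degrees (wraps the globe)
dlat = (90.0 - (-90.0)) / (91 - 1)     # 2.0 degrees

print(n_steps, dlon, dlat)
```

Note that 177.5 + 2.5 = 180.0 = -180.0, so the longitude axis closes seamlessly around the globe.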
- Hierarchical Processing: layers within categories are processed by hierarchy (higher numbers take precedence)
- Operations: `add` (accumulate), `replace` (override), `multiply` (scale)
- Vertical Distribution: multiple algorithms for mapping 2D emissions onto 3D grids
- Temporal Scaling: diurnal, weekly, and seasonal variation profiles
- Environmental Dependencies: dynamic scaling based on meteorological fields
- TIDE Integration: external data ingestion with smart caching and regridding

A single layer can also pull data from a file and combine vertical distribution, scale factors, and masks:

```yaml
operation: add
file: /data/emissions/CEDS_CO_2020.nc
variable: CO_emis
vertical_distribution:
  method: SINGLE
  layer: 0
scale_factors:
  - name: temporal_scale
    file: /data/scales/diurnal_co.nc
    variable: DIURNAL_SCALE
masks:
  - name: land_mask
    file: /data/masks/land_mask.nc
    variable: LAND_MASK
```
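The layer ordering and operation rules described above can be pictured with a toy scalar sketch. This is not the CECE API (real layers operate on gridded fields); it only illustrates how `hierarchy` ordering and the three operations interact:

```python
# Toy sketch of the stacking rules: layers are applied in hierarchy order,
# and each layer's operation decides how it combines with the running total.
layers = [
    {"hierarchy": 2, "operation": "multiply", "value": 0.5},  # applied last
    {"hierarchy": 1, "operation": "add", "value": 2.0},
    {"hierarchy": 1, "operation": "add", "value": 3.0},
]

total = 0.0
for layer in sorted(layers, key=lambda l: l["hierarchy"]):
    op, v = layer["operation"], layer["value"]
    if op == "add":          # accumulate
        total += v
    elif op == "replace":    # override
        total = v
    elif op == "multiply":   # scale
        total *= v

print(total)  # (2.0 + 3.0) * 0.5 = 2.5
```

Because `sorted` is stable, layers with equal hierarchy keep their configuration order, and the higher-hierarchy multiply layer is applied on top of the accumulated total.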
```yaml
- name: biogenic_isop
  species: ISOP
  hierarchy: 1
  operation: add
  file: /data/emissions/MEGAN_ISOP_2020.nc
  variable: ISOP_emis
  vertical_distribution:
    method: PBL
  scale_factors:
    - name: temperature_scale
      type: computed
      formula: "exp(0.1 * (T - 298.15))"
```
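The computed scale factor above is an exponential temperature response: it equals 1.0 at the 298.15 K reference and grows by a factor of e for every 10 K of warming. A plain-Python evaluation of the formula (not CECE code):

```python
import math

def temperature_scale(T):
    # Same expression as the computed scale factor: exp(0.1 * (T - 298.15))
    return math.exp(0.1 * (T - 298.15))

print(temperature_scale(298.15))  # 1.0 at the reference temperature
print(temperature_scale(308.15))  # ~2.718 (factor of e) for +10 K
```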
```yaml
physics_schemes:
  - name: DMS
    enabled: true
    options:
      emission_factor: 1.0e-6
      temperature_threshold: 273.15
  - name: Dust
    enabled: true
    options:
      dust_source_strength: 1.0

cece_data:
  streams_yaml: /path/to/streams.yaml
  data_root: /data/emissions

output:
  directory: ./cece_output
  filename_pattern: "cece_{YYYY}{MM}{DD}_{HH}{mm}{ss}.nc"
  frequency_steps: 1
  fields:
    - CO
    - NOx
    - ISOP
  diagnostics: false

diagnostics:
  enabled: true
  output_interval: 3600  # seconds
  fields:
    - intermediate_emissions
    - scale_factors_applied
```
## TIDE Streams Configuration
TIDE streams are configured in YAML format. Example `streams.yaml`:
```yaml
streams:
  - name: anthro_emissions
    file_paths:
      - /data/emissions/CEDS_CO_anthro_2020.nc
    variables:
      - name_in_file: CO_emis
        name_in_model: CEDS_CO
    taxmode: cycle
    tintalgo: linear
    yearFirst: 2020
    yearLast: 2020
    yearAlign: 2020
  - name: biogenic_emissions
    file_paths:
      - /data/emissions/MEGAN_ISOP_2020.nc
    variables:
      - name_in_file: ISOP_emis
        name_in_model: MEGAN_ISOP
    taxmode: cycle
    tintalgo: linear
    yearFirst: 2020
    yearLast: 2020
    yearAlign: 2020
```
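One way to picture the interplay of `taxmode: cycle` with `yearFirst`/`yearLast`/`yearAlign`: the model year is folded back into the data record's year window, with `yearAlign` anchoring which model year maps to `yearFirst`. The helper below is an illustrative guess at CDEPS-style stream semantics, not the actual TIDE implementation:

```python
def cycle_year(model_year, year_first, year_last, year_align):
    """Map a model year onto the cycled data record (illustrative only)."""
    span = year_last - year_first + 1
    return year_first + (model_year - year_align) % span

# A single-year record (2020) is reused for every model year
print([cycle_year(y, 2020, 2020, 2020) for y in (2019, 2020, 2023)])
```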
The standalone NUOPC driver (cece_nuopc_driver) demonstrates the standard NUOPC lifecycle and how to manage CECE as a child model.
- Configure: Edit `cece_config.yaml` to specify your species, layers, and simulation parameters. The driver can be controlled via a `driver` block in `cece_config.yaml`:

  ```yaml
  driver:
    nx: 72
    ny: 46
    nz: 1
    start_year: 2024
    start_month: 1
    start_day: 1
    start_hour: 0
    stop_year: 2024
    stop_month: 1
    stop_day: 2
    stop_hour: 0
    timestep_seconds: 3600
  ```

- Build: The driver is built as part of the main project.

- Run:

  ```bash
  cd build
  ./bin/cece_nuopc_driver
  ```

CECE also provides a simpler `example_driver` for basic C++ integration tests.

- Configure: Edit `cece_config.yaml` to specify your species and layers.
- Run:

  ```bash
  cd build
  ./example_driver
  ```
## Testing

To run the full test suite:

```bash
cd build
ctest --output-on-failure
```

```bash
# Unit tests only
ctest -L "unit" --output-on-failure

# Integration tests
ctest -L "integration" --output-on-failure

# HEMCO parity tests
ctest -L "hemco" --output-on-failure
```

```bash
# Run a specific test
ctest -R test_driver_configuration --output-on-failure

# Run tests matching a pattern
ctest -R "driver" --output-on-failure
```

## Troubleshooting

Problem: CMake cannot find ESMF
```
CMake Error: Could not find ESMF
```

Solution: Set the `ESMF_ROOT` environment variable:

```bash
export ESMF_ROOT=/path/to/esmf
cmake ..
```

Problem: Kokkos compilation fails

```
error: Kokkos requires C++17 or later
```

Solution: Ensure your compiler supports C++20:

```bash
cmake .. -DCMAKE_CXX_COMPILER=g++-10
```

Problem: TIDE fails to read streams file

```
Error: Cannot open streams file /path/to/streams.yaml
```

Solution: Verify the streams file path is correct and the file exists:

```bash
ls -la /path/to/streams.yaml
```

Problem: ESMF field not found
Error: Field 'CO' not found in ImportState
Solution: Verify the field name matches the YAML configuration and is provided by the coupling component.
Problem: Slow execution on GPU
GPU kernel execution is slower than CPU
Solution:
- Verify GPU is being used: Check Kokkos initialization output
- Profile the code: Use Kokkos profiling tools
- Check memory bandwidth: Ensure data is not being copied unnecessarily
For multi-core CPU execution, set the number of OpenMP threads:

```bash
export OMP_NUM_THREADS=16
./build/bin/cece_nuopc_driver
```

For GPU execution, set the device ID:

```bash
export CECE_DEVICE_ID=0
./build/bin/cece_nuopc_driver
```

When running in standalone mode with output enabled, CECE writes NetCDF files to the configured output directory. Files follow the naming pattern:
```
cece_YYYYMMDD_HHmmss.nc
```
Each file contains:
- Time coordinate variable (seconds since start time)
- All configured emission species fields
- Optional diagnostic fields (if enabled)
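The `{YYYY}{MM}{DD}_{HH}{mm}{ss}` tokens in the filename pattern expand to zero-padded components of the output timestamp. A stdlib-only sketch, assuming the tokens map directly onto datetime fields:

```python
from datetime import datetime

def expand_pattern(t):
    # Expand cece_{YYYY}{MM}{DD}_{HH}{mm}{ss}.nc for a given timestamp
    return f"cece_{t:%Y%m%d_%H%M%S}.nc"

print(expand_pattern(datetime(2024, 1, 1, 0, 0, 0)))  # cece_20240101_000000.nc
```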
Files are CF-1.8 compliant and can be inspected with standard tools:

```bash
ncdump -h cece_20240101_000000.nc
```

## Physics Schemes

CECE includes a suite of process-based physics schemes for computing emissions from natural and anthropogenic sources. Each scheme is available as both a native C++ (Kokkos) implementation and a Fortran bridge variant. Schemes are enabled and configured through the `physics_schemes` block in your YAML configuration.
| Scheme | Description | Documentation |
|---|---|---|
| DMS | Dimethyl sulfide sea-air exchange fluxes | DMS |
| Sea Salt | Size-resolved sea salt aerosol emissions (Gong 2003) | Sea Salt |
| Dust (Ginoux Legacy) | Single-bin mineral dust emissions (Ginoux 2001) | Dust |
| Ginoux (GOCART2G) | Multi-bin dust emissions with Marticorena threshold | Ginoux |
| FENGSHA | Physically-based saltation dust model with Fécan correction | FENGSHA |
| K14 | Kok et al. (2014) dust scheme with full soil physics | K14 |
| MEGAN | Biogenic isoprene emissions (Model of Emissions of Gases and Aerosols from Nature) | MEGAN |
| Lightning NOx | Lightning-produced NOx from convective cloud top height | Lightning NOx |
| Soil NOx | Soil NOx from microbial nitrification/denitrification | Soil NOx |
| Volcano | Volcanic SO₂ point-source emissions with vertical distribution | Volcano |
For guidance on developing your own physics schemes, see the Physics Scheme Development Guide.
CECE provides Python bindings through pybind11 that expose the full C++ core to Python with type-safe access, automatic memory management, and zero-copy NumPy interop.
```python
import cece
import numpy as np

config = cece.load_config("cece_config.yaml")
cece.initialize(config)

state = cece.CeceState(nx=144, ny=96, nz=72)
state.add_import_field("TEMPERATURE", np.asfortranarray(np.zeros((144, 96, 72))))

cece.compute(state, hour=12, day_of_week=3, month=7)
co_emissions = state.get_export_field("CO_EMIS")

cece.finalize()
```

To build with Python support, enable the `BUILD_PYTHON_BINDINGS` CMake option:

```bash
cmake .. -DBUILD_PYTHON_BINDINGS=ON
make -j$(nproc)
```

For the full API reference, configuration management, state handling, NumPy zero-copy details, and migration notes, see the Python Bindings Documentation.
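The `np.asfortranarray` call in the example above matters for zero-copy interop: it suggests the C++ core expects column-major (Fortran-order) data, which a default C-ordered NumPy array would not satisfy without a copy. A NumPy-only illustration (no `cece` required):

```python
import numpy as np

# Fortran-ordered array: column-major layout, so a column-major C++ view
# can share its buffer without copying
f_arr = np.zeros((144, 96, 72), order="F")
assert f_arr.flags["F_CONTIGUOUS"]

# A default NumPy array is C-ordered (row-major)...
c_arr = np.zeros((144, 96, 72))
assert c_arr.flags["C_CONTIGUOUS"] and not c_arr.flags["F_CONTIGUOUS"]

# ...so converting it allocates a fresh Fortran-ordered copy
f_copy = np.asfortranarray(c_arr)
print(f_copy.flags["F_CONTIGUOUS"], np.shares_memory(c_arr, f_copy))
```

Allocating fields with `order="F"` up front avoids the hidden copy on every `add_import_field` call.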
- Read the Developer Guide for architecture details
- Check Physics Scheme Development for adding new schemes
- Review HEMCO Migration Guide for migrating from HEMCO
- See Examples for common use cases
- Explore the Python Bindings for scripting and integration
- Browse individual physics scheme docs for algorithm details and configuration