CECE (Community Emissions Computing Engine) is a C++20 emissions compute component designed for high performance using Kokkos and ESMF.
CECE is a modular, performance-portable emissions framework. Key components include:
- StackingEngine: Aggregates emission layers using hierarchy-based processing, kernel fusion, and temporal/spatial scaling. See the Stacking Engine Documentation for technical details.
- Vertical Distribution: Multiple algorithms for mapping 2D emissions to 3D atmospheric grids with strict mass conservation. See the Vertical Distribution Documentation for complete algorithm descriptions and usage examples.
- PhysicsFactory: A self-registration registry for physics schemes. New schemes should inherit from `BasePhysicsScheme` and use the `PhysicsRegistration<T>` pattern.
- Internal State: Persisted via `CeceInternalData` to maintain field handles and metadata across ESMF phases, avoiding redundant lookups.
- TIDE Integration: High-performance data ingestion for external emission inventories via the TIDE (Temporal Interpolation & Data Extraction) library, with smart caching and regridding capabilities.
- Language: C++20 and Fortran 2008+.
- Style: Google C++ Style Guide for C++.
- Namespace: `cece::` (defined in `include/cece/cece.hpp`).
- Documentation: Doxygen format (`/** ... */`) required for all public APIs.
- Memory: Use `Kokkos::View` for data. Avoid raw pointers.
- ESMF: Use the ESMC_C API for the C++ bridge. Wrap data in `Kokkos::View` with `Kokkos::MemoryTraits<Kokkos::Unmanaged>`.
- Performance Portability:
  - All compute kernels MUST use Kokkos parallel primitives (`parallel_for`, `parallel_reduce`).
  - Avoid hardware-specific code (e.g., CUDA intrinsics).
  - Use `Kokkos::DefaultExecutionSpace` for dispatch.
  - Strictly avoid `std::cout` or blocking I/O inside kernels.
When implementing or modifying physics schemes:
- Configurability: NEVER hardcode physical constants or tuning factors. All parameters must be read from the YAML `options` block in `Initialize`.
- BasePhysicsScheme Helpers: Use the provided scientist-friendly helpers:
  - `ResolveImport(name, state)`: Retrieve input fields.
  - `ResolveExport(name, state)`: Retrieve output fields.
  - `MarkModified(name, state)`: Signal that an export field has been updated.
  - `ResolveDiagnostic(name, nx, ny, nz)`: Register/retrieve diagnostic fields.
- Optimization: Use Horner's Method for evaluating polynomials (e.g., Schmidt numbers, SST scaling) to minimize floating-point operations.
CECE supports multiple vertical distribution methods for mapping 2D emissions to 3D grids:
- SINGLE: Place all emissions in a single specific layer.
- RANGE: Distribute evenly over a range of layer indices.
- PRESSURE: Distribute based on a pressure range (Pa).
- HEIGHT: Distribute based on a height range (m).
- PBL: Distribute within the Planetary Boundary Layer.
- Conservation: All methods ensure strict column mass conservation.
For complete algorithm descriptions, performance characteristics, and usage examples, see the Vertical Distribution Documentation.
CECE uses a comprehensive YAML configuration system that supports:
- Hierarchical emission layer processing with categories and priorities
- Temporal scaling profiles (diurnal, weekly, seasonal)
- Environmental dependencies and dynamic scaling factors
- Multiple vertical distribution algorithms
- TIDE data stream integration with smart caching
- Physics scheme configuration and parameter tuning
For complete configuration reference with all available options and examples, see the Configuration Documentation.
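To give a feel for the shape of such a configuration, here is a hypothetical emission-layer entry. Every key and value below is illustrative only; consult the Configuration Documentation for the actual schema.

```yaml
# Hypothetical emission layer entry -- key names are illustrative,
# not CECE's real configuration schema.
emission_layers:
  - name: anthropogenic_co
    category: anthro
    priority: 10
    temporal_scaling:
      diurnal_profile: traffic_weekday
    vertical_distribution:
      method: PRESSURE          # SINGLE | RANGE | PRESSURE | HEIGHT | PBL
      pressure_range: [85000.0, 100000.0]   # Pa
```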
The required development environment is the `jcsda/docker-gnu-openmpi-dev:1.9` Docker container, which provides the necessary compilers, MPI, and ESMF dependencies.
Mocking ESMF or NUOPC is strictly forbidden. All development and verification must be performed using real ESMF dependencies within the JCSDA Docker environment to ensure real-world parity and stability.
- Run the Setup Script: Execute the provided `setup.sh` script to pull the Docker image and drop into a shell:

  ```bash
  ./setup.sh
  ```

  If you encounter Docker overlayfs issues or need to fix the environment, run:

  ```bash
  ./scripts/fix_docker_and_setup.sh
  ```
- Build: Inside the container:

  ```bash
  mkdir build && cd build
  cmake ..
  make -j4
  ```
- Run Example Driver: To see CECE in action with external data (ESMF fields):

  ```bash
  ./example_driver
  ```

- Test:

  ```bash
  ctest --output-on-failure
  ```
The physics scheme generator and other scripts require jinja2, pyyaml, and pytest:

```bash
python3 -m pip install jinja2 pyyaml pytest
```

- ESMF User Guide: https://earthsystemmodeling.org/docs/release/latest/ESMF_usrdoc
- ESMF Reference Manual: https://earthsystemmodeling.org/docs/release/latest/ESMF_refdoc/
- NUOPC Reference Manual: https://earthsystemmodeling.org/docs/release/latest/NUOPC_refdoc
- TIDE Documentation: Located at `src/io/tide` in the CECE repository
- Fortran Standards: Fortran 2008 Standard (ISO/IEC 1539-1:2010)