
Releases: arm/neural-graphics-model-gym

v0.2.0

28 Nov 14:56
8c980ee


New Features

Bring Your Own Model (BYOM) Support

You can now register custom neural graphics models and datasets that plug into ng-model-gym's training, QAT, evaluation, and export flows.

Added

  • Abstract base classes: BaseNGModel and BaseNGModelWrapper for creating neural graphics models
  • Model and dataset registries with decorators for auto-discovery and loading during ng-model-gym startup
  • Ability to export custom metadata alongside the VGF file
  • Configurable schedulers, optimizers, metrics, and losses specified in the config file
  • Safetensors cropper script
  • The best-performing checkpoint during training is now highlighted and saved
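
The release notes describe decorator-based registries for auto-discovery but don't show the API. As a purely illustrative sketch of how such a decorator registry typically works (the names `MODEL_REGISTRY`, `register_model`, and the `BaseNGModel` stand-in here are assumptions, not the actual ng-model-gym API):

```python
# Illustrative sketch of a decorator-based model registry, as described in
# the release notes. All names are hypothetical, not the real ng-model-gym API.

MODEL_REGISTRY: dict[str, type] = {}


def register_model(name: str):
    """Decorator that records a model class under `name` for later lookup."""
    def wrap(cls: type) -> type:
        MODEL_REGISTRY[name] = cls
        return cls
    return wrap


class BaseNGModel:
    """Stand-in for the abstract base class that custom models would subclass."""
    def forward(self, x):
        raise NotImplementedError


@register_model("my_upscaler")
class MyUpscaler(BaseNGModel):
    def forward(self, x):
        return x  # identity placeholder


# At startup, the framework can resolve models by their registered name.
model_cls = MODEL_REGISTRY["my_upscaler"]
```

Registering at import time lets the framework discover user models simply by importing the user's module during startup, which matches the "auto-discovery and loading during ng-model-gym startup" behaviour the notes describe.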

Changed

  • Refactored repository into /core (shared logic) and /usecase folders
  • Dependency updates:
    • ExecuTorch 1.0.0
    • PyTorch 2.9.0
    • TorchAO 0.14.0
  • BaseModelEvaluator renamed to NGModelEvaluator
  • Static and dynamic model export shapes are now user-defined instead of derived from the dataset shape
  • Dockerfile base now uses: nvidia/cuda:12.8.0-devel-ubuntu22.04
  • History buffers moved from the NSSModel to FeedbackModel
  • FeedbackModel attribute renamed from nss_model to ng_model
    • Pre-v0.2.0 NSS checkpoints can still be loaded.

Breaking Changes

  • Configuration file schema changed. To generate an updated config file run:
    ng-model-gym init
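
The notes don't show the new schema. As a purely hypothetical sketch of what a config with swappable schedulers, optimizers, metrics, and losses (listed under Added above) might look like, with every key and value invented for illustration:

```yaml
# Hypothetical config fragment -- keys and values are illustrative only,
# not the actual ng-model-gym schema. Run `ng-model-gym init` to generate
# the real, up-to-date config file.
training:
  optimizer:
    name: adamw
    lr: 1.0e-4
  scheduler:
    name: cosine
  loss: l1
  metrics: [psnr, ssim]
```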
    

Removed

  • The bundled model-converter binary is no longer included.
  • Wheels now depend on the ai-ml-sdk-model-converter package from PyPI instead.

Documentation

  • Added a CONTRIBUTING guide
  • General updates and improvements to the README
  • Added data capture documentation

v0.1.0

08 Aug 15:12


Features

  • Neural Super Sampling (NSS)

    • A trainable upscaling model for real-time graphics on mobile devices with Neural Accelerators.

  • Training & Evaluation

    • FP32 and Quantization-Aware Training (QAT INT8) modes

    • Train from scratch or finetune from pre-trained weights

    • Model quality evaluation across a range of metrics

  • Export to VGF

    • Uses ExecuTorch with the Arm backend to export models to a VGF file

  • CLI and Python API

    • Choose between the ng-model-gym command-line interface or import the Python package in your own code

  • Profiling & Visualization

    • PyTorch profiler instrumentation

    • TensorBoard integration for monitoring training 


Dependencies & Sources

Note: We rely on a nightly build of ExecuTorch and on non-PyPI wheels for PyTorch and TorchVision.


| Package | Version / Channel | Source URL |
| --- | --- | --- |
| Python | 3.10, 3.11, 3.12 | |
| torch | 2.7.1+cu118 | https://download.pytorch.org/whl/cu118/torch-2.7.1%2Bcu118-cp311-cp311-manylinux_2_28_x86_64.whl |
| torchvision | 0.22.1+cu118 | https://download.pytorch.org/whl/cu118/torchvision-0.22.1%2Bcu118-cp311-cp311-manylinux_2_28_x86_64.whl |
| executorch | 0.8.0.dev20250702+cpu (nightly, unstable) | https://download.pytorch.org/whl/nightly/cpu/executorch-0.8.0.dev20250702%2Bcpu-cp311-cp311-manylinux_2_28_x86_64.whl |
| tosa_serialization_lib | commit 6454bc0f (v1.0-branch) | git+https://git.gitlab.arm.com/tosa/tosa-serialization.git@6454bc0f (source tree: https://git.gitlab.arm.com/tosa/tosa-serialization/-/tree/6454bc0fef8404a58cbfc2eaa6bcad4b17910795) |