
mAD

mAD: a multi-language module/class for forward automatic differentiation (AD)


*** Copyright Notice ***

A multi-language auto differentiation package (mAD) Copyright (c) 2025, The Regents of the University of California, through Lawrence Berkeley National Laboratory (subject to receipt of any required approvals from the U.S. Dept. of Energy). All rights reserved.

If you have questions about your rights to use or distribute this software, please contact Berkeley Lab's Intellectual Property Office at [email protected].

NOTICE. This Software was developed under funding from the U.S. Department of Energy and the U.S. Government consequently retains certain rights. As such, the U.S. Government has been granted for itself and others acting on its behalf a paid-up, nonexclusive, irrevocable, worldwide license in the Software to reproduce, distribute copies to the public, prepare derivative works, and perform publicly and display publicly, and to permit others to do so.


This package contains an Automatic Differentiation (AD) module/class implemented in five programming languages: Fortran, C++, Java, Python, and Julia. The package aims to make integrating automatic differentiation into various applications transparent and simple. To use the module, the ordinary variables to be differentiated must be declared as AD variables.

Please note that for each application, the dimmax variable within the module must be set to the total number of independent variables to be differentiated. Three examples, implemented in all five languages, are included to demonstrate how to use the module.
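The pattern the README describes, declaring variables as AD variables and fixing a module-level dimension (dimmax) to the number of independent variables, can be sketched in Python roughly as follows. This is a minimal illustration of forward-mode AD via operator overloading; the names ADVar, DIMMAX, and independent are hypothetical and do not reflect the package's actual API.

```python
# Minimal forward-mode AD sketch: each AD variable carries a value
# plus a derivative vector of fixed length DIMMAX (analogous to dimmax).
import math

DIMMAX = 2  # total number of independent variables (plays the role of dimmax)

class ADVar:
    """A value paired with its vector of partial derivatives."""
    def __init__(self, value, deriv=None):
        self.value = value
        self.deriv = list(deriv) if deriv is not None else [0.0] * DIMMAX

    @classmethod
    def independent(cls, value, index):
        # Seed the derivative vector: d(x_i)/d(x_i) = 1, all others 0.
        d = [0.0] * DIMMAX
        d[index] = 1.0
        return cls(value, d)

    def __add__(self, other):
        other = other if isinstance(other, ADVar) else ADVar(other)
        return ADVar(self.value + other.value,
                     [a + b for a, b in zip(self.deriv, other.deriv)])

    def __mul__(self, other):
        # Product rule, applied component-wise to the derivative vectors.
        other = other if isinstance(other, ADVar) else ADVar(other)
        return ADVar(self.value * other.value,
                     [self.value * b + other.value * a
                      for a, b in zip(self.deriv, other.deriv)])

def sin(u):
    # Chain rule for an elementary function.
    return ADVar(math.sin(u.value), [math.cos(u.value) * d for d in u.deriv])

# f(x, y) = x*y + sin(x); gradient is (y + cos(x), x)
x = ADVar.independent(2.0, 0)
y = ADVar.independent(3.0, 1)
f = x * y + sin(x)
print(f.value)   # 6 + sin(2)
print(f.deriv)   # [3 + cos(2), 2]
```

Fixing the derivative-vector length at module level, as dimmax does here, keeps every arithmetic operation a simple fixed-size loop, which is what allows the same design to be carried across all five languages.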


Contact: Ji Qiang ([email protected])


Citation: J. Qiang, Y. Hao, A. Qiang, J. Wan, “A module for fast auto differentiable simulations” in Proc. of IPAC25, WEBN2, p.1671, 2025.