optimgrad is a lightweight, educational automatic differentiation engine built in pure Python with NumPy. It provides a PyTorch-like API for building and training neural networks, with a focus on mathematical correctness. The engine implements reverse-mode automatic differentiation (backpropagation) and includes a set of utilities for deep learning.
- Automatic Differentiation: Reverse-mode autodiff with dynamic computational graphs
- Tensor Operations: Broadcasting, element-wise operations, and matrix operations
- Activation Functions: ReLU, Sigmoid, Tanh with proper gradient computation
- Loss Functions: MSE and Binary Cross-Entropy with numerical stability
- Optimizers: SGD and Adam with configurable hyperparameters (a minimal training-loop sketch follows this list)
- Utilities:
- Gradient clipping
- Gradient flow monitoring
- Bound estimation
- Computational graph visualization
- Chain rule walkthrough
- Gradient path explanation
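To tie these pieces together, here is a minimal training-loop sketch in the spirit of the quickstart below. The `Tensor` and `mse_loss` imports are the ones used later in this README; the `optim`/`SGD` import path, the optimizer's constructor, its `step()`/`zero_grad()` methods, and the `mse_loss` argument order are assumptions and may differ from the actual code.

```python
import numpy as np
from engine import Tensor
from loss import mse_loss
from optim import SGD  # assumed import path; the README only names the SGD/Adam optimizers

# Toy data: fit y = 2x + 1
x = Tensor(np.array([0.0, 1.0, 2.0, 3.0]))
y = Tensor(np.array([1.0, 3.0, 5.0, 7.0]))

# Trainable parameters (broadcast against the data)
w = Tensor([0.0], requires_grad=True)
b = Tensor([0.0], requires_grad=True)

opt = SGD([w, b], lr=0.05)  # assumed constructor signature

for epoch in range(300):
    pred = x * w + b           # forward pass builds the computational graph
    loss = mse_loss(pred, y)   # assumed (prediction, target) argument order
    opt.zero_grad()            # assumed PyTorch-style zero_grad()/step() interface
    loss.backward()            # reverse-mode autodiff through the graph
    opt.step()                 # gradient descent update
```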
git clone https://github.com/yourusername/optimgrad.git
cd optimgrad
pip install -r requirements.txt

There are only a few dependencies.
from engine import Tensor
from loss import mse_loss
# Create tensors with gradients
x = Tensor([2.0], requires_grad=True)
y = Tensor([3.0], requires_grad=True)
# Forward pass
z = x * y + x**2 # Computational graph is built automatically
# Backward pass
z.backward()
# Access gradients
print(x.grad) # dz/dx = y + 2x = 3 + 4 = 7
print(y.grad) # dz/dy = x = 2

The Tensor class is the fundamental building block, with support for:
- Automatic gradient computation
- Broadcasting operations
- In-place operation detection
- Computational graph tracking
class Tensor:
    def __init__(self, data, requires_grad=False):
        self.data = np.array(data)              # underlying NumPy array
        self.grad = np.zeros_like(self.data)    # accumulated gradient, same shape as data
        self.requires_grad = requires_grad      # whether backward() should fill in .grad
        self._backward_fn = lambda: None        # local backward rule set by each operation
        self.children = ()                      # input tensors this node was computed from
        self._in_graph = False                  # flag used when tracking the computational graph

Built-in activation functions (ReLU, Sigmoid, Tanh) are provided together with their derivatives.
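As a concrete illustration of how an activation and its derivative plug into the engine, here is a sketch of a sigmoid written against the Tensor attributes shown above (data, grad, children, _backward_fn). The standalone-function form and the exact wiring of the backward rule are assumptions; the real implementations in optimgrad may be methods or differ in detail.

```python
import numpy as np
from engine import Tensor

def sigmoid(x):
    # Forward pass: s = 1 / (1 + exp(-x))
    s = 1.0 / (1.0 + np.exp(-x.data))
    out = Tensor(s, requires_grad=x.requires_grad)
    out.children = (x,)  # record the graph edge back to the input

    def _backward():
        # Chain rule: dL/dx = dL/dout * s * (1 - s)
        if x.requires_grad:
            x.grad = x.grad + out.grad * s * (1.0 - s)

    out._backward_fn = _backward
    return out
```

ReLU and Tanh follow the same pattern, with local derivatives 1[x > 0] and 1 - tanh(x)^2 respectively.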
# Clip gradients to have a maximum norm
clip_gradients(params, max_norm=1.0)
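The call above caps the overall gradient magnitude before a parameter update. A common way such a utility is implemented is global-norm clipping; the sketch below assumes params is an iterable of Tensors with populated .grad arrays, and it illustrates the technique rather than optimgrad's exact implementation.

```python
import numpy as np

def clip_gradients(params, max_norm=1.0):
    params = list(params)
    # Global L2 norm across all parameter gradients
    total_norm = float(np.sqrt(sum(np.sum(p.grad ** 2) for p in params)))
    # Scale every gradient down uniformly if the norm exceeds max_norm
    if total_norm > max_norm:
        scale = max_norm / (total_norm + 1e-12)
        for p in params:
            p.grad = p.grad * scale
```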
# Estimate output bounds of a function
lower, upper = estimate_bounds(f, x, eps=1e-4)
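This README does not spell out how estimate_bounds works. One plausible reading of the (f, x, eps) signature is an empirical estimate of the output range of f when x is perturbed by at most eps; the sketch below implements that reading (the n_samples parameter is invented for the illustration) and should not be taken as the project's actual algorithm.

```python
import numpy as np

def estimate_bounds(f, x, eps=1e-4, n_samples=64):
    # Evaluate f on random perturbations of x within an L-infinity ball of radius eps
    x = np.asarray(x, dtype=float)
    samples = np.asarray(
        [f(x + np.random.uniform(-eps, eps, size=x.shape)) for _ in range(n_samples)]
    )
    # Elementwise lower/upper estimates of the output over the sampled neighbourhood
    return samples.min(axis=0), samples.max(axis=0)
```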
The project includes comprehensive tests for all components:

python test.py
python test_scientific.py

Contributions are welcome! Please feel free to submit a Pull Request. For major changes, please open an issue first to discuss what you would like to change.
- Inspired by Karpathy's micrograd and PyTorch's autograd system
- Built with educational purposes in mind
If you use optimgrad in your research, please cite:
@software{optimgrad2025,
  author    = {Your Name},
  title     = {optimgrad: A lightweight and mathematical autograd engine},
  year      = {202x},
  publisher = {GitHub},
  url       = {https://github.com/yourusername/optimgrad}
}