A minimal implementation of reverse-mode automatic differentiation (a.k.a. autograd / backpropagation) in pure Python.
Inspired by Andrej Karpathy's micrograd and minigrad.
Create a Value object.

```python
a = Value(1.5)
```

Do some calculations.

```python
b = Value(-4.0)
c = a**3 / 5
d = c + (b**2).tanh()
```

Compute the gradients.
```python
d.backward()
```
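The gradients can then be read off the individual nodes. A minimal sketch of inspecting them, assuming each Value accumulates its gradient in a `.grad` attribute as micrograd does (the attribute name is an assumption here):

```python
# After d.backward(), each Value in the graph should hold ∂d/∂itself.
print(a.grad)  # ∂d/∂a
print(b.grad)  # ∂d/∂b
```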
Plot the computational graph.

```python
draw_graph(d)
```

Repo structure:

- `learnings/minigrad.ipynb`: notebook of me exploring minigrad.
- `minigrad/engine.py`: the Value object and the operations available on it.
- `minigrad/nn.py`: the logic for layers and the MLP (see the sketch below).
- `minigrad/visualize.py`: draws nice-looking computational graphs.
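To give a feel for the nn module, here is a hedged usage sketch. It assumes the API mirrors micrograd's: an `MLP(nin, nouts)` constructor, a network that is callable on a list of inputs, and a `parameters()` method. The import path and every name below are assumptions, not the repo's confirmed API.

```python
from minigrad.nn import MLP  # assumed import path, based on the repo layout above

# Assumed micrograd-style constructor: 3 inputs -> two hidden layers of 4 -> 1 output.
model = MLP(3, [4, 4, 1])

x = [2.0, 3.0, -1.0]
y_pred = model(x)             # forward pass builds the computational graph
loss = (y_pred - 1.0) ** 2    # simple squared-error loss

loss.backward()               # backpropagate through the whole network
for p in model.parameters():  # one step of plain gradient descent
    p.data -= 0.01 * p.grad
```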
MiniGrad is a simple autograd implementation, using no external modules.
The entirety of the auto-differentiation logic lives in the Value class in engine.py.
A Value wraps a float/int and overrides its arithmetic magic methods in order to:
- Stitch together a define-by-run computational graph when doing arithmetic operations on a Value
- Hard-code the derivative functions of the arithmetic operations
- Keep track of ∂self/∂parent between adjacent nodes
- Compute ∂output/∂self with the chain rule on demand (when `.backward()` is called)
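To make those bullets concrete, here is a minimal sketch of how such a Value class can be built. It is simplified and not the actual engine.py code: only addition, multiplication, and tanh are shown, and the attribute names are illustrative.

```python
import math

class Value:
    """Wraps a number and records how it was produced (a simplified sketch)."""

    def __init__(self, data, parents=(), local_grads=()):
        self.data = data                  # the wrapped float/int
        self.grad = 0.0                   # ∂output/∂self, filled in by backward()
        self._parents = parents           # the Values this one was computed from
        self._local_grads = local_grads   # ∂self/∂parent for each parent

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        # ∂(a+b)/∂a = 1, ∂(a+b)/∂b = 1
        return Value(self.data + other.data, (self, other), (1.0, 1.0))

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        # ∂(a*b)/∂a = b, ∂(a*b)/∂b = a
        return Value(self.data * other.data, (self, other), (other.data, self.data))

    def tanh(self):
        t = math.tanh(self.data)
        # ∂tanh(a)/∂a = 1 - tanh(a)^2
        return Value(t, (self,), (1.0 - t * t,))

    def backward(self):
        # Topologically sort the graph, then apply the chain rule from the output
        # backwards: ∂output/∂parent += ∂output/∂self * ∂self/∂parent.
        order, visited = [], set()

        def visit(node):
            if node not in visited:
                visited.add(node)
                for parent in node._parents:
                    visit(parent)
                order.append(node)

        visit(self)
        self.grad = 1.0  # ∂output/∂output = 1
        for node in reversed(order):
            for parent, local in zip(node._parents, node._local_grads):
                parent.grad += node.grad * local
```

The design choice in this sketch is to record the local derivative ∂self/∂parent on each edge at the moment a node is created; `backward()` then just accumulates `node.grad * local` into each parent, which is the chain rule applied along the graph.

```python
a = Value(2.0)
b = Value(-3.0)
d = (a * b + a).tanh()
d.backward()
print(a.grad, b.grad)  # ∂d/∂a and ∂d/∂b
```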
This project is just for fun and personal learning, so for anything beyond this basic implementation, refer to the original sources (micrograd and minigrad).
