minigrad

A minimal implementation of reverse-mode automatic differentiation (a.k.a. autograd / backpropagation) in pure Python.

Inspired by Andrej Karpathy's micrograd and minigrad.

Overview

Create a Value object.

a = Value(1.5)

Do some calculations.

b = Value(-4.0)
c = a**3 / 5
d = c + (b**2).tanh()

Compute the gradients.

d.backward()
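
After calling backward(), every Value in the graph should hold its gradient. Assuming a micrograd-style .grad attribute (the exact attribute name is an assumption, check engine.py), you can inspect the results:

# Assumes backward() populates a micrograd-style .grad attribute on each node.
print(a.grad)  # ∂d/∂a
print(b.grad)  # ∂d/∂b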

Plot the computational graph.

draw_graph(d)

Repo Structure

  1. learnings/minigrad.ipynb: a notebook exploring minigrad.
  2. minigrad/engine.py: the Value object and the operations available on it.
  3. minigrad/nn.py: the logic for layers and the MLP (see the usage sketch after this list).
  4. minigrad/visualize.py: drawing utilities for the computational graphs.
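
As a quick orientation for nn.py, here is a hypothetical usage sketch assuming a micrograd-style API (an MLP(n_inputs, layer_sizes) constructor and a callable model); the actual signatures may differ:

# Hypothetical usage of minigrad/nn.py, assuming a micrograd-style API.
# The constructor signature and callable interface are assumptions.
from minigrad.nn import MLP

model = MLP(3, [4, 4, 1])    # 3 inputs, two hidden layers of 4 neurons, 1 output
x = [1.0, -2.0, 0.5]
y = model(x)                 # forward pass builds the computational graph
y.backward()                 # assuming a single output Value, backprop through it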

Implementation

MiniGrad is a simple autograd implementation that uses no external modules.

The entirety of the auto-differentiation logic lives in the Value class in engine.py.

A Value wraps a float/int and overrides its arithmetic magic methods in order to (a minimal sketch follows this list):

  1. Stitch together a define-by-run computational graph when doing arithmetic operations on a Value
  2. Hard code the derivative functions of arithmetic operations
  3. Keep track of ∂self/∂parent between adjacent nodes
  4. Compute ∂output/∂self with the chain rule on demand (when .backward() is called)
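
A minimal sketch of that pattern, covering only addition and multiplication (illustrative only, not the exact code in engine.py):

# Minimal sketch of the Value pattern described above -- not the actual engine.py.
class Value:
    def __init__(self, data, _parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = _parents
        self._backward = lambda: None   # propagates ∂output/∂parent contributions

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # ∂out/∂self = 1 and ∂out/∂other = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # ∂out/∂self = other.data and ∂out/∂other = self.data
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse.
        topo, seen = [], set()
        def build(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    build(p)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()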

Not in Scope

This project is just for fun and personal learning, so for anything beyond the basic implementation, refer to the original projects mentioned above.
