A minimal autograd engine and neural network built from scratch, inspired by Andrej Karpathy's "micrograd" walkthrough: https://www.youtube.com/watch?v=VMj-3S1tku0
This project implements the core components of automatic differentiation and simple neural network training, without relying on deep learning libraries such as PyTorch or TensorFlow.
- Custom `Value` class for scalar automatic differentiation (sketched below)
- Support for backpropagation via computation graphs
- Implementation of a basic multilayer perceptron (MLP; see the second sketch below)
- Training loop for simple supervised tasks (see the final sketch below)
- Lightweight and educational: ideal for learning the internals of autograd and deep learning
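
As a rough sketch of what such a `Value` class and its backward pass can look like (the operator set is trimmed to `+`, `*`, and `tanh`, and the method names follow micrograd's conventions as assumptions, not necessarily this repo's exact implementation):

```python
import math

class Value:
    """A scalar that remembers how it was computed, enabling reverse-mode autodiff."""

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None  # closure that pushes this node's grad to its parents
        self._prev = set(_children)    # nodes this value was computed from

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))

        def _backward():
            self.grad += out.grad   # d(a + b)/da = 1
            other.grad += out.grad  # d(a + b)/db = 1

        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))

        def _backward():
            self.grad += other.data * out.grad  # d(a * b)/da = b
            other.grad += self.data * out.grad  # d(a * b)/db = a

        out._backward = _backward
        return out

    def __neg__(self):
        return self * -1.0

    def __sub__(self, other):
        return self + (-other)

    def tanh(self):
        t = math.tanh(self.data)
        out = Value(t, (self,))

        def _backward():
            self.grad += (1 - t ** 2) * out.grad  # d tanh(x)/dx = 1 - tanh(x)^2

        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the computation graph, then apply the chain rule in reverse.
        topo, visited = [], set()

        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)

        build(self)
        self.grad = 1.0  # d(self)/d(self)
        for node in reversed(topo):
            node._backward()
```

With this in place, `c = (a * b + a).tanh(); c.backward()` leaves `dc/da` in `a.grad` and `dc/db` in `b.grad`.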
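
A sketch, under the same assumptions, of how a basic MLP can be composed from those scalar values (the `Neuron`/`Layer`/`MLP` split follows the walkthrough; the initialization scheme here is illustrative):

```python
import random

class Neuron:
    """One unit: tanh(w . x + b), with every parameter a trainable Value."""

    def __init__(self, nin):
        self.w = [Value(random.uniform(-1, 1)) for _ in range(nin)]
        self.b = Value(0.0)

    def __call__(self, x):
        act = sum((wi * xi for wi, xi in zip(self.w, x)), self.b)
        return act.tanh()

    def parameters(self):
        return self.w + [self.b]

class Layer:
    """A list of neurons that all see the same input vector."""

    def __init__(self, nin, nout):
        self.neurons = [Neuron(nin) for _ in range(nout)]

    def __call__(self, x):
        outs = [n(x) for n in self.neurons]
        return outs[0] if len(outs) == 1 else outs

    def parameters(self):
        return [p for n in self.neurons for p in n.parameters()]

class MLP:
    """Fully connected layers chained together, e.g. MLP(3, [4, 4, 1])."""

    def __init__(self, nin, nouts):
        sizes = [nin] + nouts
        self.layers = [Layer(sizes[i], sizes[i + 1]) for i in range(len(nouts))]

    def __call__(self, x):
        for layer in self.layers:
            x = layer(x)
        return x

    def parameters(self):
        return [p for layer in self.layers for p in layer.parameters()]
```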
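
Finally, a minimal sketch of the training loop, assuming the `Value` and `MLP` sketches above; the dataset, learning rate, and epoch count are placeholders in the spirit of the walkthrough, not values taken from this repo:

```python
# Tiny supervised task: map 3-dimensional inputs to a target in [-1, 1].
xs = [
    [2.0, 3.0, -1.0],
    [3.0, -1.0, 0.5],
    [0.5, 1.0, 1.0],
    [1.0, 1.0, -1.0],
]
ys = [1.0, -1.0, -1.0, 1.0]

model = MLP(3, [4, 4, 1])
lr = 0.05  # assumed learning rate

for epoch in range(100):
    # Forward pass: sum of squared errors over the whole dataset.
    preds = [model(x) for x in xs]
    loss = sum(((p - y) * (p - y) for p, y in zip(preds, ys)), Value(0.0))

    # Backward pass: zero stale grads, then backpropagate through the graph.
    for p in model.parameters():
        p.grad = 0.0
    loss.backward()

    # Plain gradient descent step on every parameter.
    for p in model.parameters():
        p.data -= lr * p.grad

    if epoch % 10 == 0:
        print(f"epoch {epoch:3d}  loss {loss.data:.4f}")
```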