
# Micrograd

A simple automatic differentiation engine that performs backpropagation over a dynamically built DAG.
An example of usage (with a multi-layer perceptron) is shown in `src/simpl_nn_example.py`.
Built with the help of Andrej Karpathy's spelled-out intro to neural networks and backpropagation.
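To illustrate the idea, here is a minimal sketch (not this repo's actual API) of how such an engine works: each operation records its inputs, building a DAG, and `backward()` walks that graph in reverse topological order applying the chain rule.

```python
class Value:
    """A scalar value that tracks the operations producing it."""

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None  # local chain-rule step
        self._prev = set(_children)    # nodes feeding into this one

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(out)/d(self) = d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # d(out)/d(self) = other.data, and vice versa
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the DAG, then apply the chain rule in reverse.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

# c = a*b + a, so dc/da = b + 1 and dc/db = a
a, b = Value(2.0), Value(3.0)
c = a * b + a
c.backward()
print(a.grad, b.grad)  # 4.0 2.0
```

The real engine (and Karpathy's micrograd) extends this pattern with more operations (pow, tanh, etc.) and builds neuron/layer/MLP classes on top of `Value`.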

## License

MIT