
# jaxbolt

A minimal JAX-based autograd engine, inspired by Andrej Karpathy's micrograd. This is a learning project for understanding the internals of automatic differentiation and neural network frameworks.

## Features

- JAX-based tensor operations
- Automatic differentiation
- Basic neural network operations (relu, sigmoid, tanh)
- Memory-efficient operations with chunking and checkpointing
- Multi-device support
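
jaxbolt's own API isn't shown in this README, so as a rough sketch of the standard JAX primitives these features build on (`jax.grad` for autodiff, `jax.checkpoint` for rematerialization, `jax.lax.map` for chunked evaluation, `jax.pmap` for multi-device execution), the following uses plain JAX only; none of these names are jaxbolt identifiers:

```python
import jax
import jax.numpy as jnp

# A tiny "network": tanh activation over a linear map.
def loss(w, x):
    return jnp.sum(jnp.tanh(x @ w) ** 2)

w = jnp.ones((3, 2))
x = jnp.ones((4, 3))

# Automatic differentiation: jax.grad builds the gradient function
# with respect to the first argument.
grads = jax.grad(loss)(w, x)          # same shape as w

# Built-in activations corresponding to the ops listed above.
r = jax.nn.relu(x)
s = jax.nn.sigmoid(x)
t = jnp.tanh(x)

# Checkpointing: recompute intermediates in the backward pass
# instead of storing them, trading compute for memory.
grads = jax.grad(jax.checkpoint(loss))(w, x)

# Chunking: jax.lax.map processes one slice of the leading axis at a
# time rather than materializing the whole batch at once (unlike vmap).
per_row = jax.lax.map(lambda xi: jnp.tanh(xi @ w), x)

# Multi-device: jax.pmap replicates a function across available devices;
# the leading axis of the input must equal the device count.
n_dev = jax.device_count()
replicated = jnp.stack([x] * n_dev)
out = jax.pmap(lambda xb: jnp.tanh(xb @ w))(replicated)
```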

## Why?

I built this to learn about:

- How autograd engines work
- JAX's approach to automatic differentiation
- Efficient tensor operations
- GPU acceleration and parallel processing

## Feedback Welcome!

This is a learning project, and I'm always looking to improve. Feel free to suggest improvements or share your thoughts.

## Acknowledgments

Thanks to Andrej Karpathy's micrograd for the inspiration.