About:
This project provides a hands-on implementation of fundamental neural network architectures using PyTorch. It covers:
- Linear and Logistic Regression: Demonstrates how these classic algorithms can be represented as single-layer neural networks (a short sketch follows this overview).
- Multilayer Perceptron (MLP): Builds a 3-layer MLP for image classification on the MNIST dataset.
- Convolutional Neural Networks (CNNs):
  - Implements the 2D convolution operation from scratch.
  - Shows how to learn a convolutional kernel for edge detection.
  - Implements the ResNet18 architecture for image classification.
This project is ideal for anyone learning about neural networks who wants a deeper understanding of their inner workings.
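As a taste of the single-layer view described above, the following sketch trains linear and logistic regression as one-layer PyTorch models on tiny synthetic data. It is only an illustration; the data, learning rates, and iteration counts are assumptions rather than the project's exact settings.

```python
import torch
from torch import nn

# Synthetic data (assumed for illustration): y = 3x + 2 + noise
torch.manual_seed(0)
X = torch.randn(100, 1)
y = 3 * X + 2 + 0.1 * torch.randn(100, 1)

# Linear regression is a single nn.Linear layer trained with MSE loss.
linreg = nn.Linear(1, 1)
optimizer = torch.optim.SGD(linreg.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

for _ in range(200):
    optimizer.zero_grad()
    loss = loss_fn(linreg(X), y)
    loss.backward()
    optimizer.step()

print(linreg.weight.item(), linreg.bias.item())  # should approach 3 and 2

# Logistic regression is the same single layer with a sigmoid on its output;
# BCEWithLogitsLoss folds the sigmoid into the loss for numerical stability.
labels = (X > 0).float()
logreg = nn.Linear(1, 1)
clf_loss = nn.BCEWithLogitsLoss()
clf_opt = torch.optim.SGD(logreg.parameters(), lr=0.5)

for _ in range(200):
    clf_opt.zero_grad()
    loss = clf_loss(logreg(X), labels)
    loss.backward()
    clf_opt.step()
```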
Usage:
- Synthetic Data Regression: Run the code in section 1.1 to generate synthetic data and train a linear regression model.
- MNIST Classification:
  - Run the code in section 1.2 to train a single-layer neural network for MNIST classification.
  - Run the code in section 2 to train a 3-layer MLP for MNIST classification.
- Convolutional Neural Networks:
  - Run the code in section 3.1 to experiment with the corr2d function and understand the convolution operation (see the sketches after this list).
  - Run the code in section 3.2 to learn a kernel for edge detection.
  - Run the code in section 3.3 to train the ResNet18 model on the MNIST dataset.
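For orientation, here is a compact sketch of what sections 3.1 and 3.2 cover: a from-scratch 2D cross-correlation (corr2d) and a kernel learned by gradient descent to detect a vertical edge. The data, learning rate, and iteration count are illustrative assumptions, and the notebook's own code may differ in detail.

```python
import torch

def corr2d(X, K):
    """2D cross-correlation: slide kernel K over input X and sum elementwise products."""
    h, w = K.shape
    Y = torch.zeros(X.shape[0] - h + 1, X.shape[1] - w + 1)
    for i in range(Y.shape[0]):
        for j in range(Y.shape[1]):
            Y[i, j] = (X[i:i + h, j:j + w] * K).sum()
    return Y

# A black-and-white image with vertical edges, and the [1, -1] kernel that detects them.
X = torch.ones(6, 8)
X[:, 2:6] = 0
K = torch.tensor([[1.0, -1.0]])
Y = corr2d(X, K)  # nonzero only at the edge columns

# Section 3.2 idea: start from a random kernel and learn it from the (X, Y) pair.
kernel = torch.randn(1, 2, requires_grad=True)
for _ in range(50):
    loss = ((corr2d(X, kernel) - Y) ** 2).sum()
    loss.backward()
    with torch.no_grad():
        kernel -= 0.03 * kernel.grad
        kernel.grad.zero_()

print(kernel)  # should approach [[1, -1]]
```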
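For section 3.3, the project builds ResNet18 itself; as a quick, assumption-laden stand-in you could also adapt torchvision's reference ResNet18 to MNIST's 1-channel, 10-class setting, as sketched here:

```python
import torch
from torch import nn
from torchvision import models

# Untrained ResNet18 from torchvision (the notebook builds its own; this is a stand-in).
model = models.resnet18(weights=None)

# MNIST images are 1-channel 28x28, so swap the first conv for a 1-channel version
# and replace the final fully connected layer to output 10 classes.
model.conv1 = nn.Conv2d(1, 64, kernel_size=7, stride=2, padding=3, bias=False)
model.fc = nn.Linear(model.fc.in_features, 10)

# Sanity check on a dummy MNIST-shaped batch.
x = torch.randn(4, 1, 28, 28)
print(model(x).shape)  # torch.Size([4, 10])
```

Replacing conv1 and fc is a common way to reuse an ImageNet-shaped ResNet for grayscale digit inputs.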
Key Features:
- From-Scratch Implementation: Provides a deeper understanding by implementing core components manually.
- Multiple Architectures: Covers a range of neural network architectures, from basic to advanced (a basic example is sketched after this list).
- Clear Explanations: Includes comments and explanations within the code to guide understanding.
- Visualization: Uses matplotlib to visualize training progress and results.
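As an example of the basic end of that range (the 3-layer MLP of section 2), a minimal MNIST MLP in PyTorch might look like the sketch below; the hidden sizes and training settings are assumptions, not the notebook's exact configuration.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# MNIST digits flattened to 784-dim vectors; hidden sizes below are assumptions.
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(28 * 28, 256), nn.ReLU(),
    nn.Linear(256, 128), nn.ReLU(),
    nn.Linear(128, 10),
)

train_data = datasets.MNIST("data", train=True, download=True,
                            transform=transforms.ToTensor())
loader = DataLoader(train_data, batch_size=64, shuffle=True)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(1):  # one epoch is enough to see the loss drop
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: last batch loss {loss.item():.4f}")
```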
Contributor: AISHWARYA NAYAK (Contributions are welcome! Feel free to open issues or submit pull requests.)