Conducted on: 19/12/2025
RNNs, LSTMs, GRUs, Sequence Modeling
- Introduced sequence modeling and looked at how traditional approaches fail to capture the positional relationships between elements of a sequence.
- Discussed how 1-D convolutions, including dilated convolutions, can also be used for sequence modeling.
- Looked at how RNNs improve on feedforward neural networks by using the unfolding mechanism to capture these relationships.
- Examined how LSTMs improve on RNNs through their gating mechanisms, allowing them to retain information over long ranges without suffering as severely from the vanishing gradient problem.
- Covered the Constant Error Carousel and Bidirectional LSTMs.
- Tokenization
- Word Embeddings & Word2Vec
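The dilated 1-D convolution idea above can be sketched in a few lines. This is a minimal illustration, not what was presented in the session; the function name `dilated_conv1d` and the toy input are my own. Spacing the kernel taps `dilation` steps apart widens the receptive field without adding weights, and left-padding keeps the convolution causal.

```python
import numpy as np

def dilated_conv1d(x, kernel, dilation=1):
    """Causal 1-D convolution with dilation: each output position sees inputs
    spaced `dilation` steps apart, so the receptive field grows with dilation
    while the number of kernel weights stays fixed."""
    k = len(kernel)
    pad = (k - 1) * dilation          # left-pad so output[i] never sees the future
    xp = np.concatenate([np.zeros(pad), x])
    return np.array([
        sum(kernel[j] * xp[i + j * dilation] for j in range(k))
        for i in range(len(x))
    ])

x = np.arange(8.0)                    # toy signal 0..7
out = dilated_conv1d(x, np.array([1.0, 1.0]), dilation=2)
print(out)                            # out[i] = x[i-2] + x[i] (zeros before t=0)
```

With dilation 2 and a two-tap kernel, `out[3]` sums `x[1]` and `x[3]`; stacking layers with doubling dilation (1, 2, 4, …) is the usual way such convolutions cover long sequences.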
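The RNN unfolding mechanism can be sketched as a loop that reuses the same weights at every time step. A minimal NumPy sketch, with made-up dimensions and weight names (`W_xh`, `W_hh`):

```python
import numpy as np

def rnn_forward(xs, W_xh, W_hh, b_h):
    """Unfold a vanilla RNN over a sequence: h_t = tanh(W_xh x_t + W_hh h_{t-1} + b).
    The same weight matrices are shared across all time steps."""
    h = np.zeros(W_hh.shape[0])
    states = []
    for x in xs:                          # one step per sequence element
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)
        states.append(h)
    return np.stack(states)               # one hidden state per time step

rng = np.random.default_rng(0)
xs = rng.normal(size=(5, 3))              # sequence of 5 inputs, each of dim 3
W_xh = rng.normal(size=(4, 3)) * 0.1      # input-to-hidden weights
W_hh = rng.normal(size=(4, 4)) * 0.1      # hidden-to-hidden (recurrent) weights
b_h = np.zeros(4)
states = rnn_forward(xs, W_xh, W_hh, b_h)
print(states.shape)                       # (5, 4): 5 steps, hidden size 4
```

Because `W_hh` is multiplied in at every step, gradients flowing back through this loop shrink or explode multiplicatively, which is the vanishing-gradient issue the session contrasted LSTMs against.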
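The LSTM gating discussed above can be sketched as a single step function. This is a generic sketch of the standard formulation, not code from the session; the stacked parameter layout (`W`, `U`, `b` holding all four gate blocks) is an implementation choice of mine:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step: forget (f), input (i), and output (o) gates plus the
    candidate update (g). W, U, b stack the four gate blocks along axis 0."""
    z = W @ x + U @ h + b
    f, i, o, g = np.split(z, 4)
    f, i, o = sigmoid(f), sigmoid(i), sigmoid(o)
    g = np.tanh(g)
    c_new = f * c + i * g          # additive cell update: the Constant Error Carousel
    h_new = o * np.tanh(c_new)     # gated exposure of the cell state
    return h_new, c_new

rng = np.random.default_rng(1)
d_in, d_h = 3, 4
W = rng.normal(size=(4 * d_h, d_in)) * 0.1
U = rng.normal(size=(4 * d_h, d_h)) * 0.1
b = np.zeros(4 * d_h)
h, c = np.zeros(d_h), np.zeros(d_h)
for x in rng.normal(size=(6, d_in)):   # run over a length-6 toy sequence
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape, c.shape)
```

The additive `c_new = f * c + i * g` update is why gradients can flow through the cell state largely unattenuated when the forget gate stays near 1; a bidirectional LSTM simply runs a second copy of this loop over the reversed sequence and concatenates the two hidden states.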
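The tokenization and Word2Vec points can be tied together with a small sketch of the data-preparation side: whitespace tokenization, a vocabulary of token ids, and the (center, context) pairs that skip-gram Word2Vec trains on. The function name and toy sentence are illustrative, and real tokenizers are considerably more involved than `str.split`:

```python
def skipgram_pairs(text, window=2):
    """Whitespace-tokenize, assign each distinct token an id, and emit
    (center, context) id pairs within a +/- `window` neighborhood, as used
    by Word2Vec's skip-gram objective."""
    tokens = text.lower().split()
    vocab = {w: i for i, w in enumerate(dict.fromkeys(tokens))}  # first-seen order
    ids = [vocab[w] for w in tokens]
    pairs = []
    for i, center in enumerate(ids):
        for j in range(max(0, i - window), min(len(ids), i + window + 1)):
            if j != i:
                pairs.append((center, ids[j]))
    return vocab, pairs

vocab, pairs = skipgram_pairs("the cat sat on the mat")
print(len(vocab), len(pairs))   # 5 distinct tokens; 18 training pairs
```

Training then fits an embedding per vocabulary id so that center vectors predict their context ids, which is where the dense word vectors come from.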
Sreenandan Shashidharan
Third Year Attendees: Green Kedia Sir, Harshvardhan Saini Sir
Second Year Attendees: Anab, Arnav, Ritesh, Rajat, Arjav, Abhishek, Ayushman, Sreenandan, Anukul
Second Year: None