
Session of Machine Learning Division of CyberLabs

Conducted on: 28/07/2025

Agenda

ResNet, DenseNet and Kaggle Competition submissions

Summary

  1. The degradation problem in optimising very deep neural networks.
  2. Benefits of skip connections and identity mappings from previous layers.
  3. Training process of ResNet (stochastic depth regularisation).
  4. Brief overview of Fisher Vectors.
  5. Dense connectivity (and the benefit of concatenation over summation here).
  6. Intuition behind growth rate, bottleneck layers, and compression in transition layers.
  7. Advantage of deterministic connections in DenseNet in preventing overfitting and ensuring good gradient flow.
  8. Brief discussion of how residual networks behave like ensembles of relatively shallow networks.
  9. Analysis of feature reuse by observing the average absolute filter weights of convolutional layers in a trained DenseNet.
  10. Training data sources and performance of DenseNet on various competitive datasets.
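The core contrast discussed in points 2 and 5 — ResNet *adds* the identity-mapped input to a layer's output, while DenseNet *concatenates* all earlier feature maps, growing the channel count by the growth rate k per layer — can be sketched minimally in numpy. This is an illustrative toy (dense vectors standing in for feature maps, hypothetical helper names), not the architecture from the session.

```python
import numpy as np

def residual_block(x, layer):
    # ResNet: the layer's output is summed with the identity-mapped input,
    # so the feature dimension is unchanged.
    return layer(x) + x

def dense_block(x, layers):
    # DenseNet: each layer receives the concatenation of all preceding
    # feature maps; each layer contributes `growth_rate` new channels.
    features = [x]
    for layer in layers:
        out = layer(np.concatenate(features, axis=-1))
        features.append(out)
    return np.concatenate(features, axis=-1)

# Toy "layers": random linear maps followed by ReLU (hypothetical stand-ins).
rng = np.random.default_rng(0)
def make_layer(in_dim, out_dim):
    W = rng.standard_normal((in_dim, out_dim)) * 0.1
    return lambda h: np.maximum(h @ W, 0.0)

x = rng.standard_normal((1, 8))
res_out = residual_block(x, make_layer(8, 8))   # dimension preserved: (1, 8)

growth_rate = 4                                  # growth rate k
dense_layers = [make_layer(8 + i * growth_rate, growth_rate) for i in range(3)]
dense_out = dense_block(x, dense_layers)         # channels: 8 + 3*4 = 20
```

The addition in `residual_block` is why identity mappings give an unobstructed gradient path, while the concatenation in `dense_block` is what enables the feature reuse analysed in point 9.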

Agenda for the next session

  • Squeeze and Excitation Networks
  • MobileNet V1
  • MobileNet V2

Report Compiled by

Ayushman Dutta

Attendees

3rd year: Mukil Sir

2nd year: Anab, Arnav, Arjav, Anukul, Abhishek, Ritesh, Rajat, Sreenandan, Ayushman

Absentees

2nd year: None