Code_Knowledge_Distillation

This repository contains the experimental code for our paper on knowledge distillation. The primary script demonstrates how to train a student model using the logits of a pre-trained teacher model; a sketch of this setup is shown below. The code is designed to be flexible and can readily be adapted to any student model architecture. It is also thoroughly commented, so users can tune hyperparameters and modify the training process as needed.

We additionally provide the scripts used to calibrate the student model's logits. These include implementations of Isotonic Regression and Temperature Scaling, with options to optimize the calibration for either the stratified Brier score or log-loss; a calibration sketch follows the training example.
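As a rough illustration of the training setup described above, the following sketch blends a temperature-softened KL term on the teacher's logits with a standard cross-entropy term on the hard labels. The model, data-loader, and hyperparameter names (`alpha`, `T`, `lr`) are placeholders, not values taken from the paper or this repository.

```python
# Minimal sketch of logit-based knowledge distillation.
# All names below (student, teacher, loader, alpha, T) are hypothetical.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Blend soft-target KL divergence (scaled by T^2) with hard-label cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

def train_student(student, teacher, loader, epochs=10, lr=1e-3, device="cpu"):
    teacher.eval()  # the teacher is frozen; only its logits are used
    optimizer = torch.optim.Adam(student.parameters(), lr=lr)
    for _ in range(epochs):
        for inputs, labels in loader:
            inputs, labels = inputs.to(device), labels.to(device)
            with torch.no_grad():
                teacher_logits = teacher(inputs)
            student_logits = student(inputs)
            loss = distillation_loss(student_logits, teacher_logits, labels)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return student
```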
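A corresponding sketch of the two calibration options: a single temperature fitted on held-out logits by minimising either log-loss or a plain multi-class Brier score (standing in here for the stratified Brier score used in the paper), and per-class isotonic regression on the predicted probabilities. Function and variable names are illustrative, not the repository's API.

```python
# Minimal calibration sketch; val_logits, val_probs, and val_labels are assumed
# to be held-out validation arrays (NumPy), not objects defined in this repository.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.special import softmax
from sklearn.isotonic import IsotonicRegression
from sklearn.metrics import log_loss

def fit_temperature(val_logits, val_labels, objective="log_loss"):
    """Find a scalar T > 0 that minimises log-loss or Brier score on held-out data."""
    n_classes = val_logits.shape[1]
    onehot = np.eye(n_classes)[val_labels]

    def score(T):
        probs = softmax(val_logits / T, axis=1)
        if objective == "log_loss":
            return log_loss(val_labels, probs, labels=np.arange(n_classes))
        # Plain multi-class Brier score; the paper's stratified variant would
        # aggregate this per class before averaging.
        return np.mean(np.sum((probs - onehot) ** 2, axis=1))

    result = minimize_scalar(score, bounds=(0.05, 20.0), method="bounded")
    return result.x

def fit_isotonic(val_probs, val_labels):
    """Fit one isotonic regressor per class (one-vs-rest) on predicted probabilities."""
    calibrators = []
    for c in range(val_probs.shape[1]):
        iso = IsotonicRegression(out_of_bounds="clip")
        iso.fit(val_probs[:, c], (val_labels == c).astype(float))
        calibrators.append(iso)
    return calibrators

def apply_isotonic(calibrators, probs):
    """Apply the per-class calibrators and renormalise each row to sum to 1."""
    cal = np.column_stack([iso.predict(probs[:, c]) for c, iso in enumerate(calibrators)])
    return cal / np.clip(cal.sum(axis=1, keepdims=True), 1e-12, None)
```

A typical workflow under these assumptions would fit the temperature or the isotonic calibrators on a validation split and then apply them to test-set logits before reporting calibration metrics.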
