# Knowledge_distillation_codes

This repository contains the experimental code for our paper on knowledge distillation. The primary script demonstrates how to train a student model on the logits of a pre-trained teacher model and can be readily adapted to any student architecture (see the sketch below for the general shape of the training step). The code is thoroughly commented so that users can tune hyperparameters and modify the training loop as needed.

We also provide the scripts used to calibrate the student model's logits. These include implementations of Isotonic Regression and Temperature Scaling, with the option to optimize the calibration for either the stratified Brier score or the log-loss (a rough calibration sketch follows the training example).
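As a rough illustration of the distillation setup described above, the sketch below trains a student against a frozen teacher's logits using the standard softened-KL distillation loss blended with cross-entropy. The module names (`teacher`, `student`), the temperature `T`, and the mixing weight `alpha` are illustrative placeholders, not the exact hyperparameters or training loop used in the paper's scripts.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Softened-KL term (scaled by T^2) blended with the usual cross-entropy.
    T and alpha are illustrative defaults, not the values from the paper."""
    soft_targets = F.softmax(teacher_logits / T, dim=1)
    log_probs = F.log_softmax(student_logits / T, dim=1)
    kd_term = F.kl_div(log_probs, soft_targets, reduction="batchmean") * (T * T)
    ce_term = F.cross_entropy(student_logits, labels)
    return alpha * kd_term + (1.0 - alpha) * ce_term

def train_step(student, teacher, optimizer, inputs, labels):
    """One optimization step: the teacher is frozen and only supplies logits."""
    teacher.eval()
    with torch.no_grad():
        teacher_logits = teacher(inputs)
    student_logits = student(inputs)
    loss = distillation_loss(student_logits, teacher_logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```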
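For the calibration step, a minimal sketch is shown below: Temperature Scaling fit on held-out logits by minimizing log-loss, and per-class Isotonic Regression with scikit-learn. The function names and the LBFGS-based fitting procedure are assumptions for illustration; the repository's scripts additionally support optimizing for the stratified Brier score, which is not reproduced here.

```python
import numpy as np
import torch
import torch.nn.functional as F
from sklearn.isotonic import IsotonicRegression

def fit_temperature(val_logits, val_labels, max_iter=200):
    """Fit a single temperature on held-out logits by minimizing log-loss (NLL)."""
    logits = torch.as_tensor(val_logits, dtype=torch.float32)
    labels = torch.as_tensor(val_labels, dtype=torch.long)
    log_t = torch.zeros(1, requires_grad=True)  # optimize log T so T stays positive
    opt = torch.optim.LBFGS([log_t], lr=0.1, max_iter=max_iter)

    def closure():
        opt.zero_grad()
        loss = F.cross_entropy(logits / log_t.exp(), labels)
        loss.backward()
        return loss

    opt.step(closure)
    return log_t.exp().item()

def fit_isotonic_per_class(val_probs, val_labels):
    """Fit one isotonic regressor per class, mapping raw to calibrated probabilities."""
    n_classes = val_probs.shape[1]
    models = []
    for c in range(n_classes):
        ir = IsotonicRegression(out_of_bounds="clip")
        ir.fit(val_probs[:, c], (val_labels == c).astype(float))
        models.append(ir)
    return models
```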