Adds fast gradient clipping support for the Embedding layer (#694)
Add per-sample gradient norm computation as a functionality (#724)
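The per-sample gradient norm feature (#724) exposes, for each example in a batch, the L2 norm of that example's own gradient. A minimal pure-Python sketch of the underlying idea for a scalar linear model — this illustrates the concept only and is not the Opacus API:

```python
def per_sample_grad_norms(w, xs, ys):
    """For loss_i = 0.5 * (w * x_i - y_i)^2, the per-sample gradient
    w.r.t. w is (w * x_i - y_i) * x_i. Its absolute value is the L2
    norm of that 1-D gradient; return one norm per sample."""
    return [abs((w * x - y) * x) for x, y in zip(xs, ys)]

# Three samples, weight w = 1.0, all targets 0.0:
norms = per_sample_grad_norms(1.0, [1.0, 2.0, 3.0], [0.0, 0.0, 0.0])
# norms == [1.0, 4.0, 9.0]
```

In a real DP-SGD pipeline these norms are what the clipping step compares against the clipping threshold before gradients are aggregated.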
Extend test timeout (#727)
Fix failing GitHub tests (#726)
Adaptive Clipping (with Ghost Clipping) (#711)
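Adaptive clipping (#711) chooses the clipping threshold from the observed distribution of per-sample gradient norms rather than fixing it up front. A hedged sketch of the core idea — a quantile-based threshold followed by per-sample clipping; the function name and signature are illustrative, not Opacus's API, and a production version would estimate the quantile privately:

```python
def adaptive_clip(per_sample_grads, quantile=0.5):
    """Clip each per-sample gradient (a 1-D list of floats here, for
    simplicity) to a threshold taken from the given quantile of the
    batch's per-sample norms. Returns (clipped_grads, threshold)."""
    # Per-sample L2 norms.
    norms = [sum(g * g for g in grad) ** 0.5 for grad in per_sample_grads]
    # Threshold = chosen quantile of the observed norms.
    c = sorted(norms)[int(quantile * (len(norms) - 1))]
    # Scale each gradient down so its norm is at most c.
    clipped = [
        [g * min(1.0, c / n) for g in grad] if n > 0 else grad
        for grad, n in zip(per_sample_grads, norms)
    ]
    return clipped, c

# Two samples with norms 5.0 and 1.0; the median threshold is 1.0,
# so the large gradient is scaled down by 1/5.
clipped, c = adaptive_clip([[3.0, 4.0], [0.6, 0.8]], quantile=0.5)
```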
Improve documentation of GitHub and Website (#723)
Modifying DPLossFastGradientClipping to add support for generative ta…
Adding PyPI downloads information (#721)
Fix broken coveralls (#713)
Edits to isort and contribution instructions (#714)
Add **kwargs to all optimizer classes (#710)
Generate the status badge on GitHub using GitHub Actions (#712)
Website and GitHub update (#677)
Fixing Ghost Clipping with Batch Memory Manager
Add **kwargs to get_epsilon (#704)
Padding on the research folder (D67400804) (#705)
Remove **kwargs from optim_class initialization (#702)
Add research folder (#700)
Separate function for preparing criterion in PrivacyEngine (#703)
Delete CircleCI configs since GitHub Actions CI is now live (#701)
Add LoRA to the BERT fine-tuning tutorial (#698)
Fix torch.load() in model_utils.py (#696)