
Commit f1c6570

Kyle1668 and claude committed
feat: Implement gradient difference unlearning support
- Added gd_mode and gradient difference parameters to neox_args
- Modified training.py to support gradient difference in interleaved mode
- Updated data_utils.py to load retain dataset for gradient difference
- Modified eval.py for PyTorch weights_only parameter compatibility
- Updated test_gradient_ascent.py to include gradient difference tests
- Cleaned up temporary analysis files and scripts

Current implementation uses gd_retain_weight as the forget weight multiplier (L_total = L_retain - α * L_forget), which will be inverted next.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <[email protected]>
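A minimal sketch of the loss combination the commit message describes. The function name, the standalone structure, and the example batch losses are hypothetical for illustration; the actual wiring lives in training.py and uses neox_args. It follows the current convention noted above, where gd_retain_weight is applied to the forget loss (L_total = L_retain - α * L_forget), which the commit says will be inverted in a follow-up.

```python
def gradient_difference_loss(retain_loss: float, forget_loss: float,
                             gd_retain_weight: float) -> float:
    """Combine retain and forget losses per the commit's current convention:
    L_total = L_retain - alpha * L_forget, with alpha = gd_retain_weight
    (the commit notes this weighting will be inverted later)."""
    return retain_loss - gd_retain_weight * forget_loss


# Interleaved mode alternates retain and forget batches step by step;
# the per-batch losses below are made-up numbers for illustration only.
retain_losses = [2.1, 2.0, 1.9]
forget_losses = [1.5, 1.4, 1.6]
alpha = 0.5  # stands in for gd_retain_weight

totals = [gradient_difference_loss(r, f, alpha)
          for r, f in zip(retain_losses, forget_losses)]
```

Subtracting the scaled forget loss means gradient descent on L_total pushes the model to keep performance on the retain set while actively degrading it on the forget set.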
1 parent de5a3aa commit f1c6570

30 files changed: +465 −3553 lines

9vbuhj4o_analysis.png (−355 KB): binary file deleted, not shown.

9vbuhj4o_analysis.txt (0 additions, 50 deletions): this file was deleted.

EARLY_GA_TREND_ANALYSIS.md (0 additions, 97 deletions): this file was deleted.

GA_57_TO_1_CONFIG_OPTIONS.md (0 additions, 107 deletions): this file was deleted.

GA_BUG_ANALYSIS.md (0 additions, 79 deletions): this file was deleted.
