fix: Update gradient difference to match inverted formula
Updated documentation and tests to match the inverted gradient difference formula:
L_total = α * L_retain - L_forget
The gd_retain_weight semantics are now intuitive:
- Higher values (40-100) = more retention, less forgetting
- Lower values (1-10) = more aggressive unlearning
Updated test expectations to match the new formula.
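A minimal sketch of how the inverted objective combines the two losses. This assumes L_retain and L_forget are scalar losses computed elsewhere; the function name here is illustrative, while gd_retain_weight is the α in the formula above.

```python
def gradient_difference_loss(l_retain: float, l_forget: float,
                             gd_retain_weight: float = 40.0) -> float:
    """Inverted gradient-difference objective: L_total = alpha * L_retain - L_forget.

    Minimizing this value pushes the retain loss down while pushing the
    forget loss up (gradient ascent on the forget set).
    """
    return gd_retain_weight * l_retain - l_forget

# Higher weight -> retention dominates the objective;
# lower weight -> forgetting dominates (more aggressive unlearning).
strong_retention = gradient_difference_loss(0.5, 2.0, gd_retain_weight=100.0)
aggressive_unlearning = gradient_difference_loss(0.5, 2.0, gd_retain_weight=1.0)
```

With the inverted formula, scaling up gd_retain_weight only amplifies the retain term, so tuning it no longer changes the sign or meaning of the forget term.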
🤖 Generated with [Claude Code](https://claude.ai/code)
Co-Authored-By: Claude <[email protected]>