@debanganghosh08

This adds an example of Scale-then-Privatize adaptive optimization, as detailed in the research paper "On Design Principles for Private Adaptive Optimizers".

The Problem Addressed: Standard private adaptive optimizers add spherical noise to gradients before preconditioning, which creates a "noise floor" that destroys the benefits of adaptivity.
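
To make the "noise floor" concrete (my notation, not the paper's): if spherical noise $\sigma z$ is added before an Adam-style preconditioner divides by $\sqrt{v}$, the per-coordinate update is roughly

$$\Delta\theta_i \;\propto\; \frac{g_i + \sigma z_i}{\sqrt{v_i}}, \qquad z_i \sim \mathcal{N}(0, 1),$$

so the effective noise on coordinate $i$ scales as $\sigma/\sqrt{v_i}$: it is amplified most on exactly the low-variance coordinates the preconditioner is meant to boost, and once it dominates $g_i/\sqrt{v_i}$ the adaptive signal is lost.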

✔️ The Solution: I implemented Algorithm 8 from the paper. Using the pre_clipping_transform hook in jax_privacy, gradients are rescaled into a "spherical" space before the sensitivity clip; the noised aggregate is then unscaled, so the effective noise is shaped by the gradient geometry rather than being uniform across coordinates.
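
The full example lives in the PR diff; for reviewers, here is a minimal, self-contained sketch of the step described above, assuming dense `[batch, dim]` gradients rather than the PyTree handling in the actual code. The function name and arguments are mine, not the jax_privacy API:

```python
import jax
import jax.numpy as jnp

def scale_then_privatize(per_example_grads, scale, clip_norm, noise_mult, key):
    """DP mean of gradients with noise shaped by `scale` (illustrative sketch).

    per_example_grads: [batch, dim] raw per-example gradients.
    scale: [dim] positive per-coordinate scale, e.g. sqrt(running_variance).
    """
    eps = 1e-8
    # 1) Rescale into an (approximately) spherical space.
    scaled = per_example_grads / (scale + eps)                     # [batch, dim]
    # 2) Clip each example's L2 norm to clip_norm in the scaled space,
    #    so sensitivity is bounded there.
    norms = jnp.linalg.norm(scaled, axis=1, keepdims=True)         # [batch, 1]
    clipped = scaled * jnp.minimum(1.0, clip_norm / (norms + eps))
    # 3) Aggregate and add spherical Gaussian noise calibrated to clip_norm.
    noise = noise_mult * clip_norm * jax.random.normal(key, (scaled.shape[1],))
    noisy_sum = jnp.sum(clipped, axis=0) + noise
    # 4) Un-scale: the effective noise becomes scale * z, i.e. aligned with
    #    the gradient geometry instead of uniform across coordinates.
    return noisy_sum * (scale + eps) / per_example_grads.shape[0]

# Example usage with fake data:
# key = jax.random.PRNGKey(0)
# g = jax.random.normal(key, (32, 16))   # per-example gradients
# v = jnp.ones(16)                       # running second-moment estimate
# dp_grad = scale_then_privatize(g, jnp.sqrt(v), 1.0, 1.0, key)
```

A downstream adaptive optimizer that divides by `sqrt(v)` then sees roughly uniform noise, rather than noise amplified on its low-variance coordinates.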

✔️ Correctness Verification: I verified the implementation by monitoring the running_variance PyTree. The logs confirm that the preconditioner successfully departs from its initial state (1.0) to adapt to the data geometry, even in the presence of noise.
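
The monitoring code itself isn't shown in this thread; a check of the kind described could look like the following (the `running_variance` name is from the description above, the helper is hypothetical):

```python
import jax
import jax.numpy as jnp

def preconditioner_departure(running_variance, init_value=1.0):
    """Max absolute departure of any running_variance entry from its init.

    A value staying near 0.0 across steps would mean the preconditioner
    never adapted; growth indicates it is tracking the data geometry
    despite the injected noise.
    """
    leaves = jax.tree_util.tree_leaves(running_variance)
    return max(float(jnp.max(jnp.abs(leaf - init_value))) for leaf in leaves)
```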
