<spanclass="sd"> Add logs to track gradient sample mode for each part of the module, including 1) Ghost Clipping, 2) Fast Gradient Clipping (hook mode), and 3) Fast Gradient Clipping (functorch mode).</span>
296
+
297
+
<spanclass="sd"> Args:</span>
298
+
<spanclass="sd"> module: nn.Module to be checked</span>
299
+
<spanclass="sd"> force_functorch: If set to ``True``, will use functorch to compute</span>
300
+
<spanclass="sd"> all per sample gradients. Otherwise, functorch will be used only</span>
301
+
<spanclass="sd"> for layers without registered grad sampler methods.</span>
302
+
<spanclass="sd"> use_ghost_clipping: If set to ``True``, Ghost Clipping</span>
303
+
<spanclass="sd"> will be used for clipping gradients of supported layers. If ``False``, Fast</span>
304
+
<spanclass="sd"> Gradient Clipping will be used for all layers.</span>
<spanclass="sa">f</span><spanclass="s2">"Module name: </span><spanclass="si">{</span><spanclass="n">m_name</span><spanclass="si">}</span><spanclass="s2">, module type: </span><spanclass="si">{</span><spanclass="nb">type</span><spanclass="p">(</span><spanclass="n">m</span><spanclass="p">)</span><spanclass="si">}</span><spanclass="s2">. No hook or functorch is added."</span>
<spanclass="c1"># When functorch is not enforced, use FGC (hook mode) if the layer has a registered grad_sampler (supported). Otherwise, use FGC (functorch mode).</span>
<spanclass="w"></span><spanclass="sd">"""Returns per sample gradient norms. Note that these are not privatized and should only be used for debugging purposes or in non-private settings"""</span>
<spanclass="sd"> Add logs to track gradient sample mode for each part of the module, including 1) Ghost Clipping, 2) Fast Gradient Clipping (hook mode), and 3) Fast Gradient Clipping (functorch mode).</span>
296
+
297
+
<spanclass="sd"> Args:</span>
298
+
<spanclass="sd"> module: nn.Module to be checked</span>
299
+
<spanclass="sd"> force_functorch: If set to ``True``, will use functorch to compute</span>
300
+
<spanclass="sd"> all per sample gradients. Otherwise, functorch will be used only</span>
301
+
<spanclass="sd"> for layers without registered grad sampler methods.</span>
302
+
<spanclass="sd"> use_ghost_clipping: If set to ``True``, Ghost Clipping</span>
303
+
<spanclass="sd"> will be used for clipping gradients of supported layers. If ``False``, Fast</span>
304
+
<spanclass="sd"> Gradient Clipping will be used for all layers.</span>
<spanclass="sa">f</span><spanclass="s2">"Module name: </span><spanclass="si">{</span><spanclass="n">m_name</span><spanclass="si">}</span><spanclass="s2">, module type: </span><spanclass="si">{</span><spanclass="nb">type</span><spanclass="p">(</span><spanclass="n">m</span><spanclass="p">)</span><spanclass="si">}</span><spanclass="s2">. No hook or functorch is added."</span>
<spanclass="c1"># When functorch is not enforced, use FGC (hook mode) if the layer has a registered grad_sampler (supported). Otherwise, use FGC (functorch mode).</span>
<spanclass="w"></span><spanclass="sd">"""Returns per sample gradient norms. Note that these are not privatized and should only be used for debugging purposes or in non-private settings"""</span>