docs/API.md
*NOTE* Use `CreateNumDiffFunc1` with first-order optimizers, which use the gradient only, and `CreateNumDiffFunc2` with second or pseudo-second-order methods, which use both the gradient and the Hessian.
### Losses, Norms and Robust Norms
You can play with different losses, robust norms and M-estimators; have a look at the `loss` folder.
All losses and other norms follow the signature: `Name(x, export_or_jacobian)` or `Name(x, threshold, export_or_jacobian)` for robust norms.
* If `export_or_jacobian` is omitted, the loss/norm value is the only output of the function.
* If it is `true`, the Jacobian of the loss is returned along with the loss value.
* Finally, if it is a matrix/vector representing a forward Jacobian, the second return value is that Jacobian transformed through the loss via the chain rule.
Here is an example of how to get a Huber norm.
```cpp
double robust_norm = Huber(y.squaredNorm(), 0.8);
```
Here is another example of how to call a Huber norm and recover its scale/Jacobian, or the transformed Jacobian.
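To illustrate the value-plus-Jacobian pattern described above, here is a minimal self-contained sketch of a Huber kernel applied to a squared residual `s` with threshold `t`. This is an assumption for illustration only: `HuberSketch` is a hypothetical name, and the library's actual `Huber` signature, return type, and convention (see the `loss` folder) may differ.

```cpp
#include <cmath>
#include <utility>

// Hedged sketch of a Huber robust kernel on a squared residual s,
// returning {value, d(value)/ds}. The derivative plays the role of the
// scale/Jacobian that the `export_or_jacobian` flag exposes; a forward
// Jacobian would be multiplied by it via the chain rule.
// NOT the library's implementation; convention may differ.
std::pair<double, double> HuberSketch(double s, double t) {
    const double t2 = t * t;
    if (s <= t2) {
        // Inlier region: quadratic (identity on the squared residual).
        return {s, 1.0};
    }
    // Outlier region: grows linearly in the residual magnitude |r| = sqrt(s).
    const double r = std::sqrt(s);
    return {2.0 * t * r - t2, t / r};
}
```

At `s = t * t` the two branches agree in both value and derivative, which is what makes the kernel usable inside a smooth optimizer.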