Lipschitz networks are mentioned in this lecture in the context of estimating approximation capabilities.
- Networks representing continuous-time dynamic systems (Lecture 1, Slide 12)
- Addressing the lack of robustness in neural networks (unlike, say, a Java program, a neural network is not robust).
- Lipschitz continuity is the property of a function that it does not change too rapidly.
A function f : Rᴹ → Rᴺ is Lipschitz continuous if there is a constant L such that
∥f(x) - f(y)∥ ≤ L ∥x - y∥ for every x, y.
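The definition can be probed numerically: sampling random pairs of points and taking the largest ratio ∥f(x) - f(y)∥ / ∥x - y∥ gives a lower bound on L. A minimal NumPy sketch (the helper name and sample count are illustrative):

```python
import numpy as np

def empirical_lipschitz(f, dim, n_pairs=10000, seed=0):
    """Lower-bound the Lipschitz constant of f by sampling random pairs
    and taking the largest ratio ||f(x) - f(y)|| / ||x - y||."""
    rng = np.random.default_rng(seed)
    best = 0.0
    for _ in range(n_pairs):
        x, y = rng.normal(size=dim), rng.normal(size=dim)
        den = np.linalg.norm(x - y)
        if den > 0:
            best = max(best, np.linalg.norm(f(x) - f(y)) / den)
    return best

# A linear map f(x) = A x is Lipschitz with L equal to the largest
# singular value of A; here that value is 3.
A = np.array([[3.0, 0.0], [0.0, 1.0]])
L_est = empirical_lipschitz(lambda x: A @ x, dim=2)
# L_est approaches 3 from below but never exceeds it.
```

The sampled ratio only ever underestimates L, which is why certified bounds (e.g. via spectral norms of the weights) are used instead of sampling in practice.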
- Spectral normalization is a method for achieving Lipschitz continuity in neural networks.
- A partial solution to the problem of adversarial examples.
- Activation functions:
- ReLU, Leaky ReLU, Softplus, Sigmoid, Tanh, ArcTan: Lipschitz constant ≤ 1, meaning their derivatives are bounded by 1.
- e^x does not satisfy the Lipschitz condition (its derivative is unbounded).
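For one-dimensional functions, the Lipschitz constant is the supremum of |f′|, so the claim about these activations can be checked on a dense grid. A small NumPy sketch (the grid and helper are illustrative):

```python
import numpy as np

xs = np.linspace(-10.0, 10.0, 200001)

def max_abs_slope(f):
    """Largest finite-difference slope of f on the grid; for 1-D
    functions this approximates sup |f'(x)|, the Lipschitz constant."""
    ys = f(xs)
    return np.max(np.abs(np.diff(ys) / np.diff(xs)))

activations = {
    "ReLU":       lambda x: np.maximum(x, 0.0),
    "Leaky ReLU": lambda x: np.where(x > 0, x, 0.01 * x),
    # numerically stable softplus: log(1 + e^x)
    "Softplus":   lambda x: np.maximum(x, 0.0) + np.log1p(np.exp(-np.abs(x))),
    "Sigmoid":    lambda x: 1.0 / (1.0 + np.exp(-x)),
    "Tanh":       np.tanh,
    "ArcTan":     np.arctan,
}

slopes = {name: max_abs_slope(f) for name, f in activations.items()}
assert all(s <= 1.0 + 1e-9 for s in slopes.values())  # all 1-Lipschitz

# exp, by contrast, has unbounded slope and is not Lipschitz on R:
assert max_abs_slope(np.exp) > 1000.0
```

Note that Sigmoid is in fact tighter than the others: its derivative never exceeds 1/4, so it is 1/4-Lipschitz.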
- Essentially, limiting the growth rate means limiting the derivative (gradient) of the function.
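One common way to limit a layer's derivative is spectral normalization: divide each weight matrix by an estimate of its largest singular value, obtained by power iteration. A minimal NumPy sketch of the idea (not the exact algorithm of any particular library, which typically reuses a single power-iteration step per training update):

```python
import numpy as np

def spectral_norm(W, n_iters=200, seed=0):
    """Estimate the largest singular value of W by power iteration."""
    rng = np.random.default_rng(seed)
    u = rng.normal(size=W.shape[0])
    for _ in range(n_iters):
        v = W.T @ u
        v /= np.linalg.norm(v)
        u = W @ v
        u /= np.linalg.norm(u)
    return u @ W @ v

rng = np.random.default_rng(1)
W = rng.normal(size=(64, 32))
sigma = spectral_norm(W)
W_sn = W / sigma  # spectrally normalized weight: largest singular value ~ 1
```

Since Lipschitz constants multiply under composition, a network built from such layers and 1-Lipschitz activations is itself 1-Lipschitz.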
- Towards Data Science article
- OpenReview article
- Repository
- arXiv: Unified Algebraic Perspective
- NeurIPS: Regularity of Deep Neural Networks
- Another paper
- NVIDIA: Learning Smooth Neural Functions via Lipschitz Regularization
- Posts on StackExchange Artificial Intelligence 1 2 3
- Wikipedia Lipschitz continuity