Lipschitz Networks

  • Lipschitz networks
  • Mentioned in the lecture in the context of estimating approximation capabilities of neural networks
  • Networks representing continuous-time dynamic systems (Lecture 1, Slide 12)

General

  • Addresses the lack of robustness in neural networks: unlike conventional software, their outputs can change drastically under tiny input perturbations.
  • Lipschitz continuity is the property of a function that it does not change too rapidly:
A function f : Rᴹ → Rᴺ is Lipschitz continuous if there is a constant L such that
∥f(x) − f(y)∥ ≤ L ∥x − y∥ for all x, y.
  • Spectral normalization is a method for enforcing Lipschitz continuity in neural networks by bounding the spectral norm of each weight matrix.
  • Bounding the Lipschitz constant is a partial solution to the problem of adversarial examples.
  • Activation functions:
    • ReLU, Leaky ReLU, Softplus, Sigmoid, Tanh, ArcTan have Lipschitz constant ≤ 1, meaning their derivatives are bounded by 1 (for Sigmoid the tight constant is 1/4)
    • e^x is not Lipschitz continuous on ℝ, since its derivative e^x is unbounded
  • Essentially, bounding the Lipschitz constant limits the growth rate of the function, i.e. bounds its derivative (gradient).
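The spectral-normalization idea above can be sketched numerically: estimate the largest singular value (the spectral norm) of a weight matrix by power iteration, then divide the matrix by it, so the corresponding linear layer becomes 1-Lipschitz. A minimal NumPy sketch; the function name `spectral_norm` and the example matrix are illustrative, not from the lecture.

```python
import numpy as np

def spectral_norm(W, n_iter=50):
    """Estimate the largest singular value of W by power iteration."""
    rng = np.random.default_rng(0)
    u = rng.standard_normal(W.shape[0])
    for _ in range(n_iter):
        v = W.T @ u
        v /= np.linalg.norm(v)
        u = W @ v
        u /= np.linalg.norm(u)
    # u and v approximate the top singular vectors, so u^T W v ≈ sigma_max
    return float(u @ W @ v)

# Toy weight matrix with singular values 3 and 1 (illustrative example).
W = np.array([[3.0, 0.0],
              [0.0, 1.0]])
sigma = spectral_norm(W)        # ≈ 3, the Lipschitz constant of x ↦ Wx
W_sn = W / sigma                # spectrally normalized: now 1-Lipschitz
```

Applying this to every weight matrix of a network with 1-Lipschitz activations bounds the Lipschitz constant of the whole network by 1, since Lipschitz constants multiply under composition.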
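The activation-function claims can be checked empirically by taking the largest difference quotient over sample points: activations with bounded derivatives such as ReLU and Tanh stay at or below 1, while e^x has no finite bound and its empirical constant grows with the interval. A small illustrative sketch; the helper name `empirical_lipschitz` is my own.

```python
import numpy as np

def empirical_lipschitz(f, xs):
    """Largest ratio |f(x) - f(y)| / |x - y| over all pairs of samples.

    This lower-bounds the true Lipschitz constant on the sampled interval.
    """
    best = 0.0
    for i in range(len(xs)):
        for j in range(i + 1, len(xs)):
            best = max(best, abs(f(xs[i]) - f(xs[j])) / abs(xs[i] - xs[j]))
    return best

xs = np.linspace(-5.0, 5.0, 201)
relu = lambda x: max(x, 0.0)

L_relu = empirical_lipschitz(relu, xs)     # slope at most 1
L_tanh = empirical_lipschitz(np.tanh, xs)  # slope at most 1
L_exp  = empirical_lipschitz(np.exp, xs)   # unbounded: grows as the interval widens
```

On [−5, 5] the exponential already yields a difference quotient above 100, and extending the interval makes it arbitrarily large, which is exactly why e^x fails the Lipschitz condition on ℝ.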

Resources