Efficient Second Order Online Learning

Ariel Faigon edited this page Jun 3, 2017 · 6 revisions

OjaNewton is a sketched variant of the second-order online learning algorithm Online Newton Step (ONS). It avoids the quadratic per-step running time of ONS by applying Oja's updates to maintain a small sketch of the covariance matrix of the gradients in a sparse manner.
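To make the sketching idea concrete, here is a minimal, self-contained illustration of Oja's rule tracking the top directions of a gradient covariance matrix. This is a hand-rolled NumPy sketch under assumed toy dimensions and step size, not VW's actual OjaNewton implementation; the variable names (`U`, `basis`, `eta`) are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

d, m, eta = 20, 3, 0.05  # feature dimension, sketch size, step size (toy values)

# Synthetic gradient stream whose covariance is dominated by an m-dimensional subspace
basis = rng.standard_normal((d, m))

# Current sketch: m candidate directions, kept orthonormal
U, _ = np.linalg.qr(rng.standard_normal((d, m)))

for _ in range(2000):
    g = basis @ rng.standard_normal(m) + 0.01 * rng.standard_normal(d)
    # Oja's update: nudge each sketch direction toward g, weighted by g's projection onto it
    U += eta * np.outer(g, g @ U)
    # Re-orthonormalize so the sketch stays a valid set of directions
    U, _ = np.linalg.qr(U)

# Measure how well the sketch spans the dominant subspace:
# the Frobenius norm of proj.T @ U approaches sqrt(m) as the subspaces align
proj = np.linalg.qr(basis)[0]
overlap = float(np.linalg.norm(proj.T @ U))
print(round(overlap, 2))
```

With only m columns to update per gradient, the cost per step is O(dm) rather than the O(d^2) a full covariance update would require, which is the efficiency ONS sketching is after.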

Example of using OjaNewton in VW:

vw --OjaNewton --sketch_size=10 --alpha_inverse=1.0 train.vw

Here sketch_size is the number of directions kept in the sketch of the covariance matrix (default: 10), and alpha_inverse can be viewed as a learning rate (default: 1.0).
