@@ -109,20 +109,24 @@ point predictions with `predict(model, Point(), Xnew)`.
 
 # Warm restart options
 
-    update(model, newdata, :epochs=>n, other_replacements ...)
+    update(model, data, replacements ...)
 
-If `Δepochs = n - perceptron.epochs` is non-negative, then return an updated model, with
-the weights and bias of the previously learned perceptron used as the starting state in
-new gradient descent updates for `Δepochs` epochs, and using the provided `newdata`
-instead of the previous training data. Any other hyperparaameter `replacements` are also
-adopted. If `Δepochs` is negative or not specified, instead return `fit(learner,
-newdata)`, where `learner=LearnAPI.clone(learner; epochs=n, replacements....)`.
+If `replacements` includes `:epochs=>n` and `Δepochs = n - perceptron.epochs` is
+non-negative, then return an updated model, with the weights and bias of the previously
+learned perceptron used as the starting state in new gradient descent updates for
+`Δepochs` epochs, and using the provided `data`. It is possible that `data` differs
+from the previously used training data, but `update_observations` may be more appropriate
+in that case (see below). Any other hyperparameter `replacements` are also adopted. If no
+replacement for `:epochs` is specified, or `Δepochs` is negative, instead return
+`fit(learner, data)`, where `learner = LearnAPI.clone(learner; replacements...)`.
 
     update_observations(model, newdata, replacements...)
 
 Return an updated model, with the weights and bias of the previously learned perceptron
 used as the starting state in new gradient descent updates. Adopt any specified
-hyperparameter `replacements` (properties of `LearnAPI.learner(model)`).
+hyperparameter `replacements` (properties of `LearnAPI.learner(model)`). Exactly `n` new
+epochs are applied, where `n = model.epochs` unless this is explicitly changed in
+`replacements`.
 
 """
 PerceptronClassifier(; epochs=50, optimiser=Optimisers.Adam(), rng=Random.default_rng()) =
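The warm-restart semantics documented in this diff can be illustrated with a short usage sketch. This is hypothetical: the data names `X`, `y`, `Xnew`, `ynew` and the `Optimisers.Descent()` replacement are invented for illustration; only `fit`, `update`, `update_observations`, `LearnAPI.clone`, and `PerceptronClassifier` come from the docstring above.

```julia
# Illustrative only; assumes LearnAPI.jl verbs behave as the docstring describes.
learner = PerceptronClassifier(epochs=50)
model = fit(learner, (X, y))

# Raising `:epochs` to 60 warm-restarts: 10 further gradient descent epochs,
# starting from the already-learned weights and bias.
model = update(model, (X, y), :epochs => 60)

# With no `:epochs` replacement (or a decrease), `update` instead cold-restarts,
# equivalent to `fit(LearnAPI.clone(learner; replacements...), data)`.
model = update(model, (X, y), :optimiser => Optimisers.Descent())

# To train on additional observations, `update_observations` applies
# `model.epochs` new epochs from the current state.
model = update_observations(model, (Xnew, ynew))
```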