
Commit 5c212c3

improve docstring for (non-included) PerceptronClassifier learner

1 parent 5e722f0

File tree

1 file changed: +12 −8 lines changed


src/learners/gradient_descent.jl

Lines changed: 12 additions & 8 deletions
@@ -109,20 +109,24 @@ point predictions with `predict(model, Point(), Xnew)`.
 
 # Warm restart options
 
-    update(model, newdata, :epochs=>n, other_replacements...)
+    update(model, data, replacements...)
 
-If `Δepochs = n - perceptron.epochs` is non-negative, then return an updated model, with
-the weights and bias of the previously learned perceptron used as the starting state in
-new gradient descent updates for `Δepochs` epochs, and using the provided `newdata`
-instead of the previous training data. Any other hyperparaameter `replacements` are also
-adopted. If `Δepochs` is negative or not specified, instead return `fit(learner,
-newdata)`, where `learner=LearnAPI.clone(learner; epochs=n, replacements....)`.
+If `replacements` includes `:epochs=>n` and `Δepochs = n - perceptron.epochs` is
+non-negative, then return an updated model, with the weights and bias of the previously
+learned perceptron used as the starting state in new gradient descent updates for
+`Δepochs` epochs, and using the provided `data`. It is possible that `data` is different
+from previously used training data, but `update_observations` may be more appropriate in
+that case (see below). Any other hyperparameter `replacements` are also adopted. If no
+replacement for :epochs is specified, or `Δepochs` is negative, instead return
+`fit(learner, data)`, where `learner=LearnAPI.clone(learner; replacements....)`.
 
     update_observations(model, newdata, replacements...)
 
 Return an updated model, with the weights and bias of the previously learned perceptron
 used as the starting state in new gradient descent updates. Adopt any specified
-hyperparameter `replacements` (properties of `LearnAPI.learner(model)`).
+hyperparameter `replacements` (properties of `LearnAPI.learner(model)`). Exactly `n` new
+epochs are applied, where `n = model.epochs` unless this explictly changed in
+`replacements`.
 
 """
 PerceptronClassifier(; epochs=50, optimiser=Optimisers.Adam(), rng=Random.default_rng()) =
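To illustrate the warm-restart semantics the new docstring describes, here is a hedged usage sketch in Julia. It assumes the LearnAPI.jl workflow and this (non-included) `PerceptronClassifier` learner are available; the names `X` and `y` are hypothetical placeholders for a feature table and label vector, not identifiers from the commit.

```julia
# Sketch only: assumes LearnAPI.jl plus this example learner are loaded,
# and that `X` (features) and `y` (labels) are defined by the caller.

learner = PerceptronClassifier(epochs=50)
model = fit(learner, (X, y))

# `:epochs => 100` gives Δepochs = 100 - 50 = 50 ≥ 0, so gradient descent
# resumes from the previously learned weights and bias for 50 more epochs:
model = update(model, (X, y), :epochs => 100)

# No `:epochs` replacement here, so per the docstring this is equivalent to
# retraining from scratch with the replacement adopted:
# fit(LearnAPI.clone(learner; optimiser=Optimisers.Adam(0.01)), (X, y))
model = update(model, (X, y), :optimiser => Optimisers.Adam(0.01))

# For genuinely new observations, warm-restart on the old state instead:
model = update_observations(model, (Xnew, ynew))
```

Note the asymmetry the commit introduces: `update` falls back to `fit` unless `:epochs` increases, while `update_observations` always continues from the current weights and bias.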

0 commit comments
