Good afternoon. Tell me whether I understand correctly that a model can be trained in parts. What I mean is: there is a dataset with 5 classes, and I train the model on it. Afterwards I see that 1 of the 5 classes is not learned well enough. Can I continue training the same model on just that class, removing the other training material and leaving only the data for this one class, without losing what was previously trained?
And is it possible to extend the model the same way in the future, increasing the number of classes: say I decide to add a 6th class to the list and add annotated material for that 6th class, again training the model without losing the old classes?

Replies: 1 comment

@krolaper During training, all model parameters are updated to fit the new data, which means previously learned knowledge will be forgotten (catastrophic forgetting).
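For context, here is a minimal PyTorch sketch of the workaround people usually reach for in this situation (a hypothetical toy model and sizes, not this project's API): copy the already-trained weights into a larger 6-class head, optionally freeze the shared layers, and fine-tune on a mix of old-class and new-class data so the earlier classes are not overwritten.

```python
# Hedged sketch: hypothetical toy classifier, sizes chosen for illustration only.
import torch
import torch.nn as nn

# Backbone + 5-class head that were (hypothetically) trained earlier.
backbone = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 128), nn.ReLU())
head5 = nn.Linear(128, 5)

# Naive continuation on class-6 data alone would update *all* parameters,
# which is exactly why the old classes get forgotten.

# Workaround: grow the head to 6 outputs and keep the learned weights.
head6 = nn.Linear(128, 6)
with torch.no_grad():
    head6.weight[:5].copy_(head5.weight)
    head6.bias[:5].copy_(head5.bias)

model = nn.Sequential(backbone, head6)

# Freezing the backbone limits how far the shared features can drift;
# replaying some old-class data is still the main protection against forgetting.
for p in backbone.parameters():
    p.requires_grad = False

optimizer = torch.optim.Adam(
    [p for p in model.parameters() if p.requires_grad], lr=1e-4
)
criterion = nn.CrossEntropyLoss()

# Dummy mixed batch: old classes (0-4) replayed alongside the new class (5).
x = torch.randn(8, 3, 32, 32)
y = torch.tensor([0, 1, 2, 3, 4, 5, 5, 5])

optimizer.zero_grad()
loss = criterion(model(x), y)
loss.backward()
optimizer.step()
print(f"one mixed-batch step, loss={loss.item():.3f}")
```

The same idea applies whether the extra data is for an under-performing existing class or a brand-new one: keep (or mix in) examples of the classes you want the model to remember.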