In some cases, for example when the sequence is short or `mask_prob` is small, it can happen that no position in the training sequence gets masked at all. The loss then becomes NaN. How can I avoid this? Is lowering `mask_prob` the only option, or is there a way to handle it in the masking itself?
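One common workaround (not necessarily what this repo does — a sketch with a hypothetical `apply_mask` helper) is to guarantee at least one masked position per sequence, so the loss is always computed over a non-empty target set instead of producing NaN:

```python
import torch

def apply_mask(seq_len: int, mask_prob: float) -> torch.Tensor:
    """Return a boolean mask with at least one True position.

    Bernoulli sampling alone may select nothing when seq_len is
    short or mask_prob is small; the guard below forces one
    random position in that case.
    """
    mask = torch.rand(seq_len) < mask_prob
    if not mask.any():
        # Force one random position so the loss has a target.
        mask[torch.randint(seq_len, (1,))] = True
    return mask
```

An alternative is to skip (or zero out) batches with an empty mask, but forcing a minimum of one masked token keeps every batch contributing to training.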