The first hunk adds a guard that rejects gradient accumulation in a multiple-optimizer setup unless the model implements a custom `optimize` method:

```diff
+        # if using grad accumulation in a multiple-optimizer setup without a custom optimize method, raise an error
+        if (
+            self.grad_accum_steps != 1
+            and isinstance(self.optimizer, list)
+            and not isimplemented(self.model, "optimize")
+        ):
+            raise ValueError(
+                " [!] Coqui Trainer does not support grad_accum_steps in a multiple-optimizer setup. Set grad_accum_steps to 1, or implement a custom `optimize` method in your model that handles dangling gradients across optimizers!"
+            )
```
" [!] Target loss not found in the keep_avg_target. You might be exiting the training loop before it is computed or set the target_loss in the model config incorrectly."
2114
-
)
2115
-
returntarget_loss
2129
+
2130
+
raiseValueError(
2131
+
" [!] Target loss not found in the keep_avg_target. You might be exiting the training loop before it is computed or set the target_loss in the model config incorrectly."
2132
+
)
2116
2133
2117
2134
# take the average of loss_{optimizer_idx} as the target loss when there are multiple optimizers
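For context, here is a sketch of the lookup this hunk changes. It is a re-creation from the diff alone, not the Trainer's actual code: the function name, the `avg_values` dict, and the `avg_loss_{idx}` key pattern are assumptions; only `target_loss`, the `avg_` prefix, and the error message are taken from the hunk above:

```python
from typing import Optional


def pick_target_avg_loss(avg_values: dict, target_loss: Optional[str], num_optimizers: int) -> float:
    """Hypothetical reconstruction of the target-loss lookup shown in the diff."""
    # If the config names a target loss, it must be present among the averages.
    if target_loss:
        key = f"avg_{target_loss}"
        if key in avg_values:
            return avg_values[key]
        raise ValueError(
            " [!] Target loss not found in the keep_avg_target. "
            "You might be exiting the training loop before it is computed, "
            "or the target_loss in the model config is set incorrectly."
        )
    # Otherwise, average loss_{optimizer_idx} across optimizers (per the comment above).
    return sum(avg_values[f"avg_loss_{idx}"] for idx in range(num_optimizers)) / num_optimizers


# Explicit target loss present in the averages:
assert pick_target_avg_loss({"avg_loss_gen": 0.5}, "loss_gen", 1) == 0.5
# No target configured; average the per-optimizer losses:
assert pick_target_avg_loss({"avg_loss_0": 0.4, "avg_loss_1": 0.6}, None, 2) == 0.5
```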