Hi,
(1) Should the critic loss curve, which is expected to approach 0, include the gradient penalty term or exclude it?
(2) What should the behavior of the gradient penalty be during training (decreasing toward 0, or something else)?
(3) Will the result be the same if we backpropagate the gradient penalty individually versus combining it with the discriminator loss, as below?
(i) gradient_penalty.backward(retain_graph=True) [ individually ]
(ii) loss_D = (- loss_real + loss_fake) + gradient_penalty [ combined with discriminator loss ]
loss_D.backward()
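Regarding (3), a minimal PyTorch sketch can check the equivalence directly. This is a toy setup, not the repository's code: the critic is a single linear layer, and `make_critic`/`terms` are illustrative names. Since `.grad` accumulates across backward calls, both orderings should produce the same total parameter gradients (up to floating-point summation order):

```python
import torch

torch.manual_seed(0)
real = torch.randn(8, 4)
fake = torch.randn(8, 4)
eps = torch.rand(8, 1)  # fixed interpolation coefficients for a fair comparison

def make_critic():
    # Toy stand-in for the critic, seeded so both runs start identically
    torch.manual_seed(1)
    return torch.nn.Linear(4, 1)

def terms(critic):
    loss_real = critic(real).mean()
    loss_fake = critic(fake).mean()
    # WGAN-GP penalty: gradient norm of the critic at interpolated samples
    interp = (eps * real + (1 - eps) * fake).requires_grad_(True)
    grads = torch.autograd.grad(critic(interp).sum(), interp,
                                create_graph=True)[0]
    gp = ((grads.norm(2, dim=1) - 1) ** 2).mean()
    return loss_real, loss_fake, gp

# (i) backpropagate the penalty separately, then the Wasserstein terms
c1 = make_critic()
lr, lf, gp = terms(c1)
gp.backward(retain_graph=True)
(-lr + lf).backward()

# (ii) one combined loss
c2 = make_critic()
lr, lf, gp = terms(c2)
loss_D = (-lr + lf) + gp
loss_D.backward()

# Gradients accumulate in .grad, so both orderings give the same totals
same = all(torch.allclose(p1.grad, p2.grad)
           for p1, p2 in zip(c1.parameters(), c2.parameters()))
print(same)
```

So the two variants differ only in bookkeeping, not in the resulting gradients, provided `.grad` is zeroed before each discriminator step in both cases.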