Critic loss curve #90

@CBD88

Description

Hi,
(1) Should the critic loss curve, which is supposed to go to 0, include the gradient penalty or exclude it?
(2) What should the behavior of the gradient penalty be (decreasing towards 0, or something else)?
(3) Is the result the same if we backpropagate the gradient penalty individually versus together with the discriminator loss, as below?
(i) gradient_penalty.backward(retain_graph=True) [ individually ]
(ii) loss_D = (- loss_real + loss_fake) + gradient_penalty [ with discriminator loss ]
loss_D.backward()
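For question (3), a minimal sketch of why the two variants agree: `backward()` accumulates gradients into `.grad`, so calling it on each loss term separately sums the same gradients as one backward on the combined loss. The toy critic and the stand-in penalty below are illustrative only (the penalty here is just a differentiable term on the parameters, not a real WGAN-GP interpolation penalty):

```python
import torch

# Two identically initialized toy critics, so gradients are comparable.
def make_critic():
    torch.manual_seed(0)
    return torch.nn.Linear(2, 1)

torch.manual_seed(1)
x_real = torch.randn(4, 2)
x_fake = torch.randn(4, 2)

def losses(critic):
    loss_real = critic(x_real).mean()
    loss_fake = critic(x_fake).mean()
    # Stand-in "penalty": squared norm of the parameters. A real gradient
    # penalty would use gradients at interpolated samples; this is only a
    # differentiable placeholder for the demo.
    penalty = sum((p ** 2).sum() for p in critic.parameters())
    return loss_real, loss_fake, penalty

# (i) separate backward calls: gradients accumulate into .grad
c1 = make_critic()
loss_real, loss_fake, penalty = losses(c1)
penalty.backward(retain_graph=True)
(-loss_real + loss_fake).backward()

# (ii) single backward on the combined discriminator loss
c2 = make_critic()
loss_real, loss_fake, penalty = losses(c2)
loss_D = (-loss_real + loss_fake) + penalty
loss_D.backward()

# Both variants produce the same accumulated gradients.
for p1, p2 in zip(c1.parameters(), c2.parameters()):
    assert torch.allclose(p1.grad, p2.grad)
print("gradients match")
```

The one practical difference is that variant (i) needs `retain_graph=True` on the first backward call whenever the penalty shares part of the computation graph with the other loss terms; the combined backward in (ii) avoids that and is the more common pattern.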
