Hi, when calculating
from torch import autograd

# differentiate (fake_img * noise).sum() w.r.t. latents, keeping the graph for a second backward
grad, = autograd.grad(
    outputs=(fake_img * noise).sum(), inputs=latents, create_graph=True
)
I got this error:
RuntimeError: derivative for aten::_scaled_dot_product_flash_attention_backward is not implemented
Have you run into this error before?
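If it helps, here is a minimal sketch of what I think triggers it (the shapes, dtype, and the single scaled_dot_product_attention call are placeholders, not my actual generator code):

import torch
import torch.nn.functional as F
from torch import autograd

# placeholder inputs; fp16 on CUDA so SDPA may dispatch to the flash-attention kernel
latents = torch.randn(2, 8, 128, 64, device="cuda", dtype=torch.float16, requires_grad=True)
fake_img = F.scaled_dot_product_attention(latents, latents, latents)
noise = torch.randn_like(fake_img)

# first-order grad, with create_graph=True so it can be differentiated again
grad, = autograd.grad(
    outputs=(fake_img * noise).sum(), inputs=latents, create_graph=True
)

# differentiating through grad requires the derivative of the flash-attention
# backward, which is what the RuntimeError says is not implemented
grad.pow(2).sum().backward()

For what it's worth, disabling the flash and memory-efficient kernels so SDPA falls back to the math implementation (e.g. with the torch.backends.cuda.sdp_kernel(enable_flash=False, enable_mem_efficient=False, enable_math=True) context manager) seems to avoid the missing double backward, but I assume it is slower, so I'd like to know if there is a better option.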