Exercise 6.2.2: When we apply autograd to the Conv2D class we created, what error message do we see? #96

Open
@SeeseeYep

Description

RuntimeError: grad can be implicitly created only for scalar outputs

"""
PyTorch's backward() assumes a scalar output by default. To backpropagate
from a non-scalar output, you must pass a tensor as the gradient argument,
which holds the gradient weight for each element of the output.
"""
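A minimal reproduction of this rule with a plain tensor (values here are illustrative, independent of the Conv2D example):

```python
import torch

x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
y = x * 2  # non-scalar output, shape (3,)

# y.backward() here would raise:
# RuntimeError: grad can be implicitly created only for scalar outputs

# Reducing to a scalar first makes the implicit gradient well defined.
y.sum().backward()
print(x.grad)  # tensor([2., 2., 2.])
```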

Note that torch.conv2d expects a 4-dimensional input by default. Code as follows:

```python
X = torch.randn(size=(8, 8))
conv2d_layer = Conv2D(kernel_size=(2, 2))

# get gradients: enable autograd for X
X.requires_grad = True
output = conv2d_layer(X)

# compute gradients
print(output)

# this call raises the RuntimeError above
output.backward()

# access gradients
print("the gradients of weights :", conv2d_layer.weight.grad)
```
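For completeness, the Conv2D used above is the cross-correlation layer built earlier in the book. A minimal sketch, reproduced from memory (details may differ slightly from the book's exact code):

```python
import torch
from torch import nn

def corr2d(X, K):
    """2D cross-correlation of input X with kernel K."""
    h, w = K.shape
    Y = torch.zeros((X.shape[0] - h + 1, X.shape[1] - w + 1))
    for i in range(Y.shape[0]):
        for j in range(Y.shape[1]):
            Y[i, j] = (X[i:i + h, j:j + w] * K).sum()
    return Y

class Conv2D(nn.Module):
    def __init__(self, kernel_size):
        super().__init__()
        self.weight = nn.Parameter(torch.rand(kernel_size))
        self.bias = nn.Parameter(torch.zeros(1))

    def forward(self, x):
        return corr2d(x, self.weight) + self.bias

# An 8x8 input with a 2x2 kernel yields a 7x7 output.
conv = Conv2D(kernel_size=(2, 2))
print(conv(torch.randn(8, 8)).shape)  # torch.Size([7, 7])
```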

Solution:

```python
# Suppose output is the result of the convolution; it is a matrix.
# Create a tensor with the same shape as output, all elements set to 1.
grad_output = torch.ones_like(output)

# Call backward and pass grad_output as the gradient argument.
output.backward(grad_output)

# Now the gradients of the weights and bias are evaluated correctly.
print("the gradients of weights :", conv2d_layer.weight.grad)
print("the gradients of bias :", conv2d_layer.bias.grad)
```
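Passing an all-ones gradient tensor is equivalent to calling `.sum().backward()`, since both weight every output element by 1. A quick check with a plain tensor (names here are illustrative, separate from the Conv2D code):

```python
import torch

x = torch.arange(4.0, requires_grad=True)
y = x * x  # non-scalar output

# Route 1: explicit gradient argument
y.backward(torch.ones_like(y))
g1 = x.grad.clone()

# Route 2: reduce to a scalar, then backward
x.grad = None  # reset accumulated gradients
y2 = x * x
y2.sum().backward()
g2 = x.grad

print(torch.equal(g1, g2))  # True
```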
