Section 3.2 of 动手学深度学习 (Dive into Deep Learning) has an error #833

Open
skywalk163 opened this issue Jan 20, 2022 · 1 comment

Comments

@skywalk163
Contributor

The variables `features` and `labels` are not defined anywhere; from the context, this should be `paddle_features` and `paddle_labels`.

Error:

NameError                                 Traceback (most recent call last)
/tmp/ipykernel_21564/2487356068.py in <module>
      4 loss = squared_loss
      5 for epoch in range(num_epochs):
----> 6     for paddle_X, paddle_y in paddle_data_iter(batch_size, features, labels):
      7         l = loss(net(paddle_X, paddle_w, paddle_b), paddle_y)  # minibatch loss for `paddle_X` and `paddle_y`
      8         # Because the shape of `l` is (`batch_size`, 1) rather than a scalar, all elements of `l` are summed

NameError: name 'features' is not defined

Code:

lr = 0.03
num_epochs = 3
net = paddle_linreg
loss = squared_loss
for epoch in range(num_epochs):
    for paddle_X, paddle_y in paddle_data_iter(batch_size, features, labels):
    l = loss(net(paddle_X, paddle_w, paddle_b), paddle_y)  # minibatch loss for `paddle_X` and `paddle_y`
        # Because the shape of `l` is (`batch_size`, 1) rather than a scalar, all elements of `l` are summed
        # to compute the gradient with respect to [`paddle_w`, `paddle_b`]
        l.sum().backward()
        paddle_sgd([paddle_w, paddle_b], lr, batch_size)  # update the parameters using their gradients
    with paddle.no_grad():
        paddle_train_l = loss(net(paddle_features, paddle_w, paddle_b), paddle_labels)
        print(f'epoch {epoch + 1}, loss {float(pddle_train_l.mean()):f}')

After making that change, it errors again:

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
/tmp/ipykernel_21564/4159795007.py in <module>
      8         # Because the shape of `l` is (`batch_size`, 1) rather than a scalar, all elements of `l` are summed
      9         # to compute the gradient with respect to [`paddle_w`, `paddle_b`]
---> 10         l.sum().backward()
     11         paddle_sgd([paddle_w, paddle_b], lr, batch_size)  # 使用参数的梯度更新参数
     12     with paddle.no_grad():

AttributeError: 'numpy.float64' object has no attribute 'backward'

I'm not familiar with this kind of from-scratch implementation myself and wanted to learn from it, but right now it won't run. If I already understood the material I could debug it, but since I don't understand it I can't get it running, and so I can't learn from it either; it's a vicious cycle.
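For anyone stuck at the same point: the loop in question is just minibatch SGD on linear regression, and the pattern can be reproduced in plain NumPy without any of the book's Paddle helpers. This is only an illustrative sketch, not the book's code; the data, `data_iter`, and all variable names here are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(42)

def data_iter(batch_size, features, labels):
    # Hypothetical stand-in for the book's paddle_data_iter:
    # yield shuffled minibatches of (features, labels).
    n = len(features)
    idx = rng.permutation(n)
    for i in range(0, n, batch_size):
        j = idx[i:i + batch_size]
        yield features[j], labels[j]

# Synthetic data: y = X @ true_w + true_b + noise
true_w, true_b = np.array([2.0, -3.4]), 4.2
X = rng.normal(size=(1000, 2))
y = X @ true_w + true_b + 0.01 * rng.normal(size=1000)

w, b = np.zeros(2), 0.0
lr, num_epochs, batch_size = 0.03, 3, 10
for epoch in range(num_epochs):
    for Xb, yb in data_iter(batch_size, X, y):
        err = Xb @ w + b - yb             # residuals of the linear model
        w -= lr * (Xb.T @ err) / len(yb)  # gradient of mean squared loss wrt w
        b -= lr * err.mean()              # ... and wrt b
    train_l = ((X @ w + b - y) ** 2).mean() / 2
    print(f'epoch {epoch + 1}, loss {train_l:f}')
```

The only difference from the book's version is that the gradients are written out by hand instead of coming from `l.sum().backward()`, so there is no autograd machinery to trip over.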

@skywalk163
Contributor Author

`def pddle_sgd(params, lr, batch_size):` — the name here is misspelled; it should be `paddle_sgd`.
