
bug found #8

Open
tszdanger opened this issue Feb 27, 2020 · 1 comment

Comments

@tszdanger

In "手写数字识别器_minst_convnet.ipynb", inside `def forward()`:

x = F.log_softmax(x, dim = 0) # the output layer is log_softmax, i.e. the log-probability log(p(x)); using log_softmax makes the subsequent cross-entropy computation faster

Here `dim` should be 1, not 0: dim=0 is the batch dimension, while dim=1 indexes the classes, so the softmax must be normalized along dim=1.
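To illustrate why the axis matters, here is a minimal NumPy sketch (not the notebook's PyTorch code; `log_softmax` is a hand-rolled helper that mirrors `F.log_softmax`). With `axis=1` each sample's class probabilities sum to 1; with `axis=0` the normalization runs across the batch, mixing unrelated samples:

```python
import numpy as np

def log_softmax(x, axis):
    # numerically stable log-softmax along the given axis
    shifted = x - x.max(axis=axis, keepdims=True)
    return shifted - np.log(np.exp(shifted).sum(axis=axis, keepdims=True))

# A toy batch of 2 samples with 3 class scores each (batch is axis 0).
logits = np.array([[1.0, 2.0, 3.0],
                   [1.0, 1.0, 1.0]])

# Correct: normalize over the class axis (axis=1) -> each row sums to 1.
per_class = np.exp(log_softmax(logits, axis=1))
print(per_class.sum(axis=1))  # [1. 1.]

# Wrong: axis=0 normalizes across the batch -> each *column* sums to 1,
# so probabilities are mixed between unrelated samples.
per_batch = np.exp(log_softmax(logits, axis=0))
print(per_batch.sum(axis=0))  # [1. 1. 1.]
```

The bug is easy to miss when the batch size equals the number of classes; here the shapes make the difference visible.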

@tszdanger (Author)

Also, in the same project, we use CrossEntropyLoss even though the last layer of the network is already log_softmax(). Since CrossEntropyLoss applies log_softmax internally, this applies it twice, which is not right: the network should either output raw logits and use CrossEntropyLoss, or keep log_softmax and switch to NLLLoss.
