Extra tanh? #3

Open
dhzyingz opened this issue May 5, 2022 · 0 comments

Comments


dhzyingz commented May 5, 2022

In the backpropagation part, the first line of code is:

dtanh = softmaxOutput.diff(forward[len(forward)-1][2], y)

So the output is passed through tanh before being fed into the softmax?
I think the last layer does not need a tanh before the softmax, so the code should be:

dtanh = softmaxOutput.diff(forward[len(forward)-1][1], y)
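
To make the point concrete, here is a minimal sketch of what I mean. The names forward, softmaxOutput, and the tuple layout (input, pre-activation, tanh output) are assumptions for illustration based on the snippet above, not the repo's actual structures: index [2] would feed the tanh-squashed values into the softmax, while index [1] would feed the raw pre-activation scores.

import numpy as np

def softmax(x):
    # Numerically stable softmax.
    e = np.exp(x - np.max(x))
    return e / e.sum()

class SoftmaxOutput:
    def diff(self, scores, y):
        # Gradient of cross-entropy loss w.r.t. the scores fed to softmax:
        # softmax(scores) - one_hot(y)
        probs = softmax(scores)
        probs[y] -= 1.0
        return probs

softmaxOutput = SoftmaxOutput()

# Hypothetical last forward step storing (input, pre_activation, tanh_output):
pre_activation = np.array([2.0, -1.0, 0.5])
forward = [(None, pre_activation, np.tanh(pre_activation))]
y = 0  # true class index

d_with_tanh = softmaxOutput.diff(forward[-1][2], y)     # tanh applied first ([2])
d_without_tanh = softmaxOutput.diff(forward[-1][1], y)  # raw scores ([1])

print(d_with_tanh)     # gradient computed on tanh-squashed scores
print(d_without_tanh)  # gradient computed on raw pre-activation scores

Since tanh squashes the scores into (-1, 1), the softmax probabilities computed from the tanh output are compressed, which is why I suspect the extra tanh on the last layer is unintended.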
