
Self-Attention Paragraph Typos #779

Open

PeppeSaccardi opened this issue May 5, 2021 · 2 comments

@PeppeSaccardi (Contributor)

In the paragraph Self-Attention (I) of Week 12 / Attention and the Transformer,
there is a small mistake after the hidden layer is defined as a matrix multiplication: the vector $\boldsymbol{a}$ should belong to $\mathbb{R}^t$ instead of $\mathbb{R}^n$.
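
For a quick dimension check, here is a minimal sketch (the sizes `n` and `t` and all tensors are made up for illustration): with the $t$ inputs stacked as columns of $\boldsymbol{X} \in \mathbb{R}^{n \times t}$, the product $\boldsymbol{h} = \boldsymbol{X}\boldsymbol{a}$ only type-checks when $\boldsymbol{a} \in \mathbb{R}^t$.

```python
import torch

n, t = 4, 3            # assumed sizes: n = dimension of each input, t = number of inputs
X = torch.randn(n, t)  # inputs x_1, ..., x_t stacked as columns, X in R^{n x t}
a = torch.randn(t)     # mixing coefficients alpha_i, so a in R^t (not R^n)
h = X @ a              # hidden representation h = X a, a linear combination of the columns
print(h.shape)         # torch.Size([4]), i.e. h in R^n
```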

@Atcold (Owner) commented May 5, 2021

Of course, thanks!
Would you like to send a PR to fix this across all languages?

@PeppeSaccardi (Contributor, Author)

Of course! Thank you, Alfredo!

PeppeSaccardi added a commit to PeppeSaccardi/pytorch-Deep-Learning that referenced this issue May 6, 2021

Self-Attention Paragraph Typos: Issues Atcold#779
Atcold added a commit that referenced this issue May 7, 2021

* Update 12-3.md (Self-Attention Paragraph Typos: Issues #779)
* Update 12-3.md (Correction in Spanish)
* Update 12-3.md
* Update 12-3.md
* Update 12-3.md
* Update 12-3.md
* Update 12-3.md
* Update 12-3.md
* Update docs/es/week12/12-3.md (Co-authored-by: Alfredo Canziani <alfredo.canziani@gmail.com>)
* Update French 12-3.md
* Update english comment 12-3.md
* Update korean 12-3.md
* Update Russian 12-3.md
* Update turkish 12-3.md

Co-authored-by: Alfredo Canziani <alfredo.canziani@gmail.com>