
Transformer

Badges: PyTorch · Poetry

Coming soon...

Unofficial code for the original paper "Attention Is All You Need" by Ashish Vaswani et al.

(figure: img.png)

TODO

  • Sublayer Residual Connection
  • Encoder Layer
  • Encoder
  • Decoder Layer
  • Decoder
  • Multi-Head Attention (see the illustrative sketch after this list)
  • Position-wise Fully Connected Feed-Forward Network
  • Positional Encoding
  • Embedding
  • PyTorch Transformer
  • Training Data
  • Utils
  • Training Scripts
  • Train
  • JAX implementation
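
Since the code itself is still marked "coming soon", here is a minimal, hypothetical PyTorch sketch of the Multi-Head Attention item above, written directly from the paper's formulation (softmax(QKᵀ / √d_k)·V computed over several heads in parallel). The class name, default sizes (d_model=512, num_heads=8) and layer names are illustrative assumptions, not this repository's actual API.

```python
import math
import torch
import torch.nn as nn


class MultiHeadAttention(nn.Module):
    """Illustrative multi-head attention, following the paper's formulation
    (not the module planned for this repository)."""

    def __init__(self, d_model: int = 512, num_heads: int = 8):
        super().__init__()
        assert d_model % num_heads == 0, "d_model must be divisible by num_heads"
        self.d_k = d_model // num_heads
        self.num_heads = num_heads
        # Separate projections for queries, keys and values, plus the output projection
        self.w_q = nn.Linear(d_model, d_model)
        self.w_k = nn.Linear(d_model, d_model)
        self.w_v = nn.Linear(d_model, d_model)
        self.w_o = nn.Linear(d_model, d_model)

    def forward(self, query, key, value, mask=None):
        batch_size = query.size(0)

        def split_heads(x: torch.Tensor) -> torch.Tensor:
            # (batch, seq_len, d_model) -> (batch, num_heads, seq_len, d_k)
            return x.view(batch_size, -1, self.num_heads, self.d_k).transpose(1, 2)

        q = split_heads(self.w_q(query))
        k = split_heads(self.w_k(key))
        v = split_heads(self.w_v(value))

        # Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V
        scores = q @ k.transpose(-2, -1) / math.sqrt(self.d_k)
        if mask is not None:
            scores = scores.masked_fill(mask == 0, float("-inf"))
        attn = scores.softmax(dim=-1)

        # Concatenate the heads and apply the final linear projection
        out = (attn @ v).transpose(1, 2).contiguous()
        out = out.view(batch_size, -1, self.num_heads * self.d_k)
        return self.w_o(out)


if __name__ == "__main__":
    mha = MultiHeadAttention(d_model=512, num_heads=8)
    x = torch.randn(2, 10, 512)   # (batch, seq_len, d_model)
    print(mha(x, x, x).shape)     # torch.Size([2, 10, 512])
```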

Acknowledgements

I was heavily inspired by

  • The original paper Attention Is All You Need by Ashish Vaswani et al., see here
  • OpenNMT: Open-Source Toolkit for Neural Machine Translation by Guillaume Klein et al., see here
  • Illustrated Attention by Raimi Karim, see here
  • pytorch-original-transformer by Aleksa Gordić, see here

License

License: MIT
