A minimalist 45-minute implementation of the transformer backbone (encoder, decoder)
Updated Aug 17, 2023 - Python
This is a transformer made from scratch using PyTorch.
A Transformer network developed from scratch in PyTorch for English-to-Persian machine translation - Deep Neural Network course project
Solving the task of reordering words in a sentence using the original Transformer model, in Tensorflow.
Implementations of computer vision models for practice, based on PyTorch and einops.
Replication of "Attention Is All You Need" (Vaswani et al. 2017)
Vision Transformers in PyTorch MNIST Handwritten Digit Recognition
Boxing with tensorflow .. Cheers!!
Picks, from lines of sentences, sentence pairs that would hopefully read as natural real-world chit-chat to a human.
🗄 A utility to transform file contents for easy upload / storage over cloud.
Improving the Attentive State-Space Model with a transformer
GPT-2 style architecture for training language generators for specific tasks. [Production Ready]
Simple & basic practice of machine learning problems
PyTorch implementation of MPI-based distributed dot-product attention
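Several of the repositories above implement the attention mechanism from "Attention Is All You Need" (Vaswani et al. 2017). As a rough sketch of the core operation they share, scaled dot-product attention can be written in a few lines of NumPy (function and variable names here are illustrative, not taken from any of the listed repos):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V, per Vaswani et al. (2017)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # (n_q, n_k) similarity scores
    # Numerically stable row-wise softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each output row is a weighted sum of value rows

# Toy example: 3 queries/keys/values of dimension 4
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((3, 4)) for _ in range(3))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4)
```

The encoder-decoder repos above wrap this operation in multi-head projections and stack it with feed-forward layers, but the scaling by sqrt(d_k) and the softmax-weighted sum are the common core.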