PyTorch implementation of GPT from scratch (Jupyter Notebook; updated May 31, 2024)
PyTorch Transformer implementation
Transformers without Tears: Improving the Normalization of Self-Attention
Implementation of the paper "LongRoPE: Extending LLM Context Window Beyond 2 Million Tokens"
[CVPR 2024] Official implementation of the paper "Inversion-Free Image Editing with Natural Language"
Sequence-parallel attention for long-context LLM training and inference
AdaViT (PyTorch Lightning, Python)
Centrale-NLP-Public-Ressources: resources for the 2023/2024 NLP class
Sequence-to-sequence framework with a focus on Neural Machine Translation based on PyTorch
This collection of notebooks is based on the book Dive into Deep Learning; all notes are written in PyTorch using the d2l library
PyTorch implementation of the models RT-1-X and RT-2-X from the paper: "Open X-Embodiment: Robotic Learning Datasets and RT-X Models"
Integrating Mamba/SSMs with Transformer for Enhanced Long Context and High-Quality Sequence Modeling
Implementation of Liquid Nets in PyTorch
Implementation of the model "AudioFlamingo" from the paper: "Audio Flamingo: A Novel Audio Language Model with Few-Shot Learning and Dialogue Abilities"
Implementation of the transformer from the paper: "Real-World Humanoid Locomotion with Reinforcement Learning"
PyTorch implementation of the sparse attention from the paper: "Generating Long Sequences with Sparse Transformers"
My implementation of Kosmos2.5 from the paper: "KOSMOS-2.5: A Multimodal Literate Model"
Implementation of AutoRT: "AutoRT: Embodied Foundation Models for Large Scale Orchestration of Robotic Agents"
Implementation of the ScreenAI model from the paper: "A Vision-Language Model for UI and Infographics Understanding"
Implementation of MambaFormer in PyTorch and Zeta from the paper: "Can Mamba Learn How to Learn? A Comparative Study on In-Context Learning Tasks"
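The repositories above all build on the scaled dot-product attention introduced in "Attention Is All You Need". As a reference point, here is a minimal, framework-agnostic sketch of that operation in plain Python (the helper names `softmax`, `matmul`, and `attention` are illustrative, not taken from any repository listed here):

```python
import math

def softmax(xs):
    # Numerically stable softmax: subtract the max before exponentiating.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def matmul(a, b):
    # Naive matrix product of two lists-of-lists.
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)]
            for row in a]

def attention(Q, K, V):
    # Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V
    d_k = len(K[0])
    k_t = [list(col) for col in zip(*K)]          # K transposed
    scores = matmul(Q, k_t)                        # Q K^T
    scaled = [[s / math.sqrt(d_k) for s in row] for row in scores]
    weights = [softmax(row) for row in scaled]     # rows sum to 1
    return matmul(weights, V)                      # weighted sum of values
```

With two identical keys, the attention weights are uniform and the output is simply the mean of the value rows; PyTorch implementations like those above batch this same computation over heads and sequences.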