MetaFormer Baselines for Vision (TPAMI 2024)
This is a JAX/Flax-based transformer language model trained on a Japanese dataset. It is based on the official Flax example code (lm1b).
An auto-regressive causal language model for molecule (SMILES) and reaction-template (SMARTS) generation, based on the Hugging Face implementation of OpenAI's GPT-2 transformer decoder.
PoolFormer: MetaFormer Is Actually What You Need for Vision (CVPR 2022 Oral)
Lumina-T2X is a unified framework for Text to Any Modality Generation
A TypeScript transformer that automatically generates validation code from your types.
A high-throughput and memory-efficient inference and serving engine for LLMs
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX (see the generation sketch after this list).
Transformer Balance Research
A PyTorch implementation of the vanilla Transformer.
Natural language processing networks implemented from scratch in PyTorch for personal educational purposes.
Transformer Architectures Comparison in Natural Language Generation Tasks
Basic implementation code for multimodal models and some applications or fine-tuning tasks based on them.
Large Language Model Text Generation Inference
Production First and Production Ready End-to-End Speech Recognition Toolkit
This repository contains machine learning and deep learning projects from beginner to advanced, using TensorFlow, scikit-learn, and other dependencies.
Tevatron - A flexible toolkit for neural retrieval research and development.
A framework for few-shot evaluation of language models.
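Several of the repositories above (for example 🤗 Transformers and the GPT-2-based SMILES/SMARTS generator) revolve around auto-regressive, decoder-only text generation. As a rough illustration of that shared workflow, the minimal Python sketch below uses the Hugging Face pipeline API to load a small GPT-2 checkpoint and sample a continuation; the checkpoint name and sampling settings are illustrative assumptions, not configurations taken from any repository listed here.

```python
# Minimal sketch of auto-regressive generation with the Hugging Face pipeline API.
# The checkpoint ("gpt2") and the sampling settings below are illustrative
# assumptions, not configurations taken from any repository listed above.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

# The decoder extends the prompt one token at a time until max_new_tokens is reached.
outputs = generator(
    "The transformer architecture",
    max_new_tokens=32,
    do_sample=True,
    top_p=0.9,
)
print(outputs[0]["generated_text"])
```

The same interface accepts any causal language-model checkpoint from the Hugging Face Hub via the model argument, which is what makes a GPT-2-style decoder straightforward to repurpose for domains such as SMILES strings.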