🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Transformer Balance Research
A PyTorch implementation of the vanilla Transformer.
Some natural language processing networks from scratch in PyTorch for personal educational purposes.
Transformer Architectures Comparison in Natural Language Generation Tasks
A high-throughput and memory-efficient inference and serving engine for LLMs
Basic implementation code for multimodal models and some applications or fine-tuning tasks based on them.
Large Language Model Text Generation Inference
Production First and Production Ready End-to-End Speech Recognition Toolkit
This repository contains machine learning and deep learning projects from beginner to advanced level, built with TensorFlow, scikit-learn, and other dependencies.
Tevatron - A flexible toolkit for neural retrieval research and development.
A framework for few-shot evaluation of language models.
JetStream is a throughput and memory optimized engine for LLM inference on XLA devices, starting with TPUs (and GPUs in future -- PRs welcome).
Implementation of various transformer architecture models, application, and fine-tuning code.
a bro who codes with you
OpenMMLab Semantic Segmentation Toolbox and Benchmark.
🔧 A Kotlin coroutine wrapper around Media3's Transformer API.