🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Updated May 11, 2024 - Python
A high-performance inference system for large language models, designed for production environments.
A high-throughput and memory-efficient inference and serving engine for LLMs
Popular Large Language Models from scratch - 2024
Transformer Network for Remaining Useful Life Prediction of Lithium-Ion Batteries
🔧 A Kotlin coroutine wrapper around Media3's Transformer API.
AICI: Prompts as (Wasm) Programs
Meme search engine and recommendation system using OpenAI CLIP and Apple MLX
JetStream is a throughput- and memory-optimized engine for LLM inference on XLA devices, starting with TPUs (and GPUs in the future -- PRs welcome).
Implementation of an NLP transformer with automatic mixed precision and a few out-of-the-box optimizations
An official implementation of "Periodicity Decoupling Framework for Long-term Series Forecasting" (ICLR 2024)
Lumina-T2X is a unified framework for Text to Any Modality Generation
Large Language Model Text Generation Inference
A framework for few-shot evaluation of language models.
A comprehensive paper list on Vision Transformers and attention, including papers, code, and related websites
Research and Materials on Hardware implementation of Transformer Model
Kolmogorov Arnold Network Gated State Spaces Language Model
Code from the paper "A Multimodal French Corpus of Aligned Speech, Text, and Pictogram Sequences for Speech-to-Pictogram Machine Translation" (LREC-Coling 2024)
Transformer architecture conditioned by metadata