A radically simple, reliable, and high-performance template that gets you building multi-agent applications quickly
A PyTorch-based Speech Toolkit
🤗 Optimum Intel: Accelerate inference with Intel optimization tools
Unify Efficient Fine-Tuning of 100+ LLMs
Fine-tuning Large Language Models (LLMs) for text classification tasks
Train transformer-based models.
Leveraging BERT and c-TF-IDF to create easily interpretable topics.
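As a quick illustration of that workflow, here is a minimal BERTopic sketch; the 20 newsgroups corpus is just a convenient stand-in for your own documents:

```python
# Minimal BERTopic sketch: embed documents, cluster them, and surface
# each cluster's distinguishing words via c-TF-IDF.
from bertopic import BERTopic
from sklearn.datasets import fetch_20newsgroups

# Any reasonably sized list of strings works here.
docs = fetch_20newsgroups(subset="train",
                          remove=("headers", "footers", "quotes")).data

topic_model = BERTopic()
topics, probs = topic_model.fit_transform(docs)

print(topic_model.get_topic_info().head())  # largest topics and their keywords
print(topic_model.get_topic(0))             # (word, c-TF-IDF weight) pairs for topic 0
```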
A practical course on Large Language Models.
Open language modeling toolkit based on PyTorch
State-of-the-art Machine Learning for the web. Run 🤗 Transformers directly in your browser, with no need for a server!
Implementation of the paper "Audio Mamba: Bidirectional State Space Model for Audio Representation Learning" in PyTorch
Official code repo for the paper "Learning to Play Atari in a World of Tokens", accepted at ICML 2024
Easy and lightning fast training of 🤗 Transformers on Habana Gaudi processor (HPU)
Implementation of AlphaFold 3 in PyTorch
Lumina-T2X is a unified framework for Text to Any Modality Generation
🔍 LLM orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your data. With advanced retrieval methods, it's best suited for building RAG, question answering, semantic search or conversational agent chatbots.
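To make the component-to-pipeline idea concrete, here is a minimal sketch written against the Haystack 2.x API as I understand it; the in-memory store, BM25 retriever, and toy documents are illustrative placeholders, not a recommended setup:

```python
# Minimal Haystack 2.x sketch: wire a retriever component into a pipeline
# and run a query against it.
from haystack import Document, Pipeline
from haystack.document_stores.in_memory import InMemoryDocumentStore
from haystack.components.retrievers.in_memory import InMemoryBM25Retriever

store = InMemoryDocumentStore()
store.write_documents([
    Document(content="Haystack pipelines connect components into a graph."),
    Document(content="BM25 is a classic sparse retrieval method."),
])

pipe = Pipeline()
pipe.add_component("retriever", InMemoryBM25Retriever(document_store=store))

result = pipe.run({"retriever": {"query": "What do pipelines connect?"}})
print(result["retriever"]["documents"][0].content)
```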
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
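A minimal LoRA sketch of what PEFT enables; the GPT-2 base model, the `c_attn` target module, and the hyperparameter values are illustrative choices, not recommendations:

```python
# Wrap a pretrained model with a LoRA adapter: only the small low-rank
# update matrices are trained, while the base weights stay frozen.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("gpt2")

config = LoraConfig(
    r=8,                        # rank of the low-rank update
    lora_alpha=16,              # scaling applied to the update
    target_modules=["c_attn"],  # GPT-2's fused attention projection
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(base, config)
model.print_trainable_parameters()  # typically well under 1% of all weights
```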
Accelerate and optimize performance with streamlined training and serving options in JAX. 🚀
Music generation with masked transformers!