DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
Surrogate Modeling Toolbox
Repository for our paper "See More Details: Efficient Image Super-Resolution by Experts Mining"
Early release of the official implementation for "GraphMETRO: Mitigating Complex Graph Distribution Shifts via Mixture of Aligned Experts"
Simplified Implementation of SOTA Deep Learning Papers in Pytorch
A curated reading list of research in Adaptive Computation, Dynamic Compute & Mixture of Experts (MoE).
Fast Inference of MoE Models with CPU-GPU Orchestration
Implementation of "the first large-scale multimodal mixture of experts models" from the paper "Multimodal Contrastive Learning with LIMoE: the Language-Image Mixture of Experts"
Implementation of Switch Transformers from the paper: "Switch Transformers: Scaling to Trillion Parameter Models with Simple and Efficient Sparsity"
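The Switch Transformer's core idea is top-1 routing: a learned gate sends each token to exactly one expert, so compute stays constant as the expert count grows. A minimal NumPy sketch of that routing step (all names and shapes here are illustrative, not the paper's or any listed repo's actual API):

```python
import numpy as np

def switch_route(tokens, gate_w, expert_ws):
    """Top-1 'switch' routing: each token is processed by exactly one expert.

    tokens:    (n, d) token representations          (illustrative shapes)
    gate_w:    (d, e) router weight matrix
    expert_ws: list of e (d, d) expert weight matrices
    """
    logits = tokens @ gate_w                           # (n, e) router scores
    probs = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)          # softmax over experts
    choice = probs.argmax(axis=1)                      # top-1 expert per token
    out = np.empty_like(tokens)
    for e, w in enumerate(expert_ws):
        mask = choice == e
        # Scale each expert's output by its gate probability, as in the paper,
        # so the router still receives gradient through the selected expert.
        out[mask] = (tokens[mask] @ w) * probs[mask, e:e + 1]
    return out, choice
```

Because only one expert runs per token, adding experts increases parameters without increasing per-token FLOPs; production implementations add capacity limits and a load-balancing loss on top of this.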
A library for easily merging multiple LLM experts and efficiently training the merged LLM.
[SIGIR'24] The official implementation code of MOELoRA.
Efficient global optimization toolbox in Rust: Bayesian optimization, mixtures of Gaussian processes, sampling methods
Mixture-of-Experts for Large Vision-Language Models
[ICML 2024] "MVMoE: Multi-Task Vehicle Routing Solver with Mixture-of-Experts"
Tutel MoE: An Optimized Mixture-of-Experts Implementation
The repo for the MixKABRN Neural Network (Mixture of Kolmogorov-Arnold Bit Retentive Networks): an attempt at first adapting it for training on text, and later adjusting it for other modalities.
Decentralized deep learning in PyTorch. Built to train models on thousands of volunteers across the world.
PyTorch library for cost-effective, fast, and easy serving of MoE models.
The implementation of "Leeroo Orchestrator: Elevating LLMs Performance Through Model Integration"
PyTorch implementation of grok