FedML - The Research and Production Integrated Federated Learning Library: https://fedml.ai
Updated Sep 3, 2022
Federated Optimization in Heterogeneous Networks (MLSys '20)
All materials you need for Federated Learning: blogs, videos, papers, software, etc.
Fair Resource Allocation in Federated Learning (ICLR '20)
FedTorch is a generic repository for benchmarking different federated and distributed learning algorithms using PyTorch Distributed API.
Distributed Linear Programming Solver on top of Apache Spark
DISROPT: A Python framework for distributed optimization
This library is an implementation of the algorithm described in Distributed Trajectory Estimation with Privacy and Communication Constraints: a Two-Stage Distributed Gauss-Seidel Approach.
Implementation of (overlap) local SGD in PyTorch
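The local SGD idea behind this repo — each worker takes several local gradient steps between synchronizations, then all workers average their parameters — can be sketched in a few lines. This is a minimal pure-Python illustration on a toy 1-D least-squares objective, not the repository's API; `local_sgd` and its parameters are our own naming.

```python
import random

def local_sgd(worker_data, rounds=50, local_steps=10, lr=0.05):
    """Minimal local SGD sketch (illustrative only): each worker runs
    `local_steps` SGD steps on its own data, then parameters are averaged."""
    w = 0.0  # shared model: a single weight, for simplicity
    for _ in range(rounds):
        local_models = []
        for data in worker_data:              # one entry per worker
            w_k = w                           # start from the averaged model
            for _ in range(local_steps):
                x, y = random.choice(data)
                grad = 2 * (w_k * x - y) * x  # d/dw of (w*x - y)^2
                w_k -= lr * grad
            local_models.append(w_k)
        w = sum(local_models) / len(local_models)  # periodic averaging
    return w

# two workers whose data both follow y = 3x; the average model recovers w ≈ 3
random.seed(0)
data = [[(1.0, 3.0), (2.0, 6.0)], [(0.5, 1.5), (1.5, 4.5)]]
w = local_sgd(data)
```

The "overlap" variant in the repo additionally overlaps the averaging communication with the next round of local computation; the synchronization pattern, however, is the same as above.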
FedDANE: A Federated Newton-Type Method (Asilomar Conference on Signals, Systems, and Computers '19)
Scalable, structured, dynamically-scheduled hyperparameter optimization.
Communication-efficient decentralized SGD (PyTorch)
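The core of decentralized SGD is the gossip mixing step: instead of a global all-reduce, each node averages only with its graph neighbors. The following is a generic sketch of that step (our own illustration, not this repository's API; the ring topology and `gossip_average` name are assumptions for the example).

```python
def gossip_average(params, neighbors):
    """One gossip round: each node replaces its parameter with the uniform
    average of its own value and those of its neighbors."""
    return [
        sum(params[j] for j in [i] + neighbors[i]) / (1 + len(neighbors[i]))
        for i in range(len(params))
    ]

# ring of 4 nodes: each node communicates only with its two neighbors,
# yet repeated gossip rounds drive all nodes to the global average (1.0)
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
x = [4.0, 0.0, 0.0, 0.0]
for _ in range(30):
    x = gossip_average(x, ring)
```

In decentralized SGD each node interleaves a local gradient step with one such gossip round, so per-round communication scales with node degree rather than with the total number of workers.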
A package for solving optimal power flow problems using distributed algorithms.
Implementation of Redundancy Infused SGD for faster distributed SGD.
Implementation of Local Updates Periodic Averaging (LUPA) SGD
We present a set of all-reduce compatible gradient compression algorithms which significantly reduce the communication overhead while maintaining the performance of vanilla SGD. We empirically evaluate the performance of the compression methods by training deep neural networks on the CIFAR10 dataset.
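One simple way a compressor can be made all-reduce compatible is shared-seed random-k sparsification: every worker keeps the same k coordinates (chosen from a common random seed), so the compressed gradients can be summed coordinate-wise by a standard all-reduce. This sketch is our own illustration of that property, not necessarily the scheme used in the repo; `shared_random_k` and its signature are assumptions.

```python
import random

def shared_random_k(grads, k, seed):
    """All-reduce-compatible random-k sparsification (illustrative sketch):
    all workers keep the SAME k coordinates, picked from a shared seed, so
    summing compressed gradients equals compressing the summed gradient."""
    dim = len(grads[0])
    rng = random.Random(seed)
    idx = rng.sample(range(dim), k)  # identical index set on every worker
    scale = dim / k                  # rescale so the estimate stays unbiased
    # the sum below is what an all-reduce would compute across workers
    summed = [sum(g[i] for g in grads) * scale for i in idx]
    out = [0.0] * dim
    for j, i in enumerate(idx):
        out[i] = summed[j] / len(grads)  # averaged, rescaled gradient
    return out

# two workers' gradients; only k=2 of 4 coordinates are communicated
grads = [[1.0, 2.0, 3.0, 4.0], [3.0, 2.0, 1.0, 0.0]]
avg_grad = shared_random_k(grads, k=2, seed=7)
```

Compressors whose outputs cannot be summed directly (e.g. per-worker top-k, which selects different indices on each worker) instead force an all-gather, which is exactly the overhead this line of work avoids.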
MATLAB implementation of the paper "Online Distributed Optimal Power Flow with Equality Constraints" [arXiv 2022].
Distributed Multidisciplinary Design Optimization
tvopt is a prototyping and benchmarking Python framework for time-varying (or online) optimization.
Communication-efficient federated linear and deep GCCA with error-feedback-based quantization and convergence guarantees