A flexible, modular, and easy to use library to facilitate federated learning research and development in healthcare settings
FedStream: Prototype-Based Federated Learning on Distributed Concept-drifting Data Streams
CycleSL: Server-Client Cyclical Update Driven Scalable Split Learning
An open framework for Federated Learning.
Extremely Randomized Trees with Privacy Preservation for Distributed Data (k-PPD-ERT)
Docker CLI package for the vantage6 infrastructure
subMFL: Compatible subModel Generation for Federated Learning in Device Heterogeneous Environment
[TMLR] CoDeC: Communication-Efficient Decentralized Continual Learning
Distributed machine learning using processes
A Federated Learning based Android Malware Classification System
Sparse Convex Optimization Toolkit (SCOT)
Federated Learning Utilities and Tools for Experimentation
Dist-DGL running on WSL2 and Minikube on a single machine
FedGraphNN: A Federated Learning Platform for Graph Neural Networks with MLOps Support. An earlier research version was accepted at the ICLR 2021 DPML and MLSys 2021 GNNSys workshops.
This repository contains the code for the paper entitled "The learning costs of Federated Learning in constrained scenarios"
A script for training ConvNeXtV2 on the CIFAR-10 dataset using the FSDP technique for distributed training.
Federated Learning experiments using the Intel OpenFL framework with diverse machine learning models on image and tabular datasets, applicable to domains such as medicine and banking.
CD-GraB is a distributed gradient balancing framework that finds a distributed data permutation with provably better convergence guarantees than Distributed Random Reshuffling (D-RR). https://arxiv.org/pdf/2302.00845.pdf
A collaborative machine learning approach that trains a model to classify a person as a smoker or non-smoker based on user data. Training is distributed, with model updates transmitted securely in homomorphically encrypted form to a central cloud location, where an Amazon EC2 instance aggregates them.