# distributed-optimization

Here are 43 public repositories matching this topic...

We present a set of all-reduce compatible gradient compression algorithms that significantly reduce communication overhead while maintaining the performance of vanilla SGD. We empirically evaluate the compression methods by training deep neural networks on the CIFAR-10 dataset. (A minimal sketch of the all-reduce-compatible compression idea follows this entry.)

  • Updated Nov 14, 2021
  • Python
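The repository's own algorithms are not reproduced here; the sketch below only illustrates, under assumed details, why "all-reduce compatible" matters: if every worker sparsifies its gradient with the same seeded random mask (random-k with a shared seed), the compressed gradients keep matching indices and can be averaged elementwise with a plain all-reduce, with no index exchange. All function names and the toy data are hypothetical.

```python
# Minimal sketch of an all-reduce compatible compressor (random-k with a shared
# seed). NOT the repository's method -- an illustration only, using numpy to
# simulate what torch.distributed.all_reduce would do across workers.
import numpy as np

def randk_compress(grad, k, seed):
    """Keep k coordinates chosen by a seeded RNG. All workers pass the same seed,
    so their sparse gradients share indices and can be summed elementwise."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(grad.size, size=k, replace=False)
    sparse = np.zeros_like(grad)
    sparse[idx] = grad[idx] * (grad.size / k)  # rescale to keep the estimate unbiased
    return sparse

# Simulate one step on 4 workers with a toy 32-dimensional gradient (hypothetical data).
step, k = 0, 8
grads = [np.random.default_rng(w).standard_normal(32) for w in range(4)]
compressed = [randk_compress(g, k, seed=step) for g in grads]  # shared seed = step index
allreduced = np.mean(compressed, axis=0)  # elementwise mean works because indices match
print(allreduced.nonzero()[0])            # only the k shared coordinates are nonzero
```

In contrast, a per-worker top-k selection would pick different indices on each worker and would need an all-gather of indices and values rather than a single all-reduce, which is the overhead this class of methods avoids.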
