
Regularizing Neural Networks via Minimizing Hyperspherical Energy

By Rongmei Lin, Weiyang Liu, Zhen Liu, Chen Feng, Zhiding Yu, James Rehg, Li Xiong, Le Song

License

CoMHE is released under the MIT License (refer to the LICENSE file for details).

Contents

  1. Introduction
  2. Citation
  3. Short Video Introduction
  4. Requirements
  5. Usage
  6. Results

Introduction

Inspired by the Thomson problem in physics, where the distribution of multiple mutually repelling electrons on a unit sphere can be modeled by minimizing a potential energy, hyperspherical energy minimization has demonstrated its potential in regularizing neural networks and improving their generalization. See our previous work, MHE, for an in-depth introduction.

Here we propose compressive minimum hyperspherical energy (CoMHE) as a more effective regularization for neural networks than the original MHE. Specifically, CoMHE uses projection mappings to reduce the dimensionality of neurons and minimizes their hyperspherical energy in the projected space. Depending on the design of the projection mapping, we consider several well-performing variants (a rough sketch is given at the end of this section). You are welcome to try CoMHE in your own work!

CoMHE was accepted to CVPR 2020, and the full paper is available on arXiv and here.
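
As a rough illustration of the idea above, here is a minimal NumPy sketch of random-projection CoMHE. It reflects our reading of the description, not the repository's TensorFlow implementation; the projection dimension, number of projections, and the inverse-distance (Riesz s=1) energy are illustrative choices. Neurons are rows of a weight matrix; they are projected with a few random Gaussian maps, normalized onto the unit sphere, and the pairwise energy is averaged over projections.

import numpy as np

def hyperspherical_energy(w, eps=1e-8):
    # sum of inverse pairwise distances between the unit-normalized rows of w
    w = w / (np.linalg.norm(w, axis=1, keepdims=True) + eps)
    d = np.linalg.norm(w[:, None, :] - w[None, :, :], axis=-1)
    iu = np.triu_indices(w.shape[0], k=1)
    return np.sum(1.0 / (d[iu] + eps))

def comhe_energy(w, proj_dim=8, num_proj=4, seed=0):
    # average the energy over a few random Gaussian projections to proj_dim dimensions
    rng = np.random.RandomState(seed)
    projections = [rng.randn(w.shape[1], proj_dim) for _ in range(num_proj)]
    return np.mean([hyperspherical_energy(w @ p) for p in projections])

w = np.random.randn(64, 256)   # e.g. 64 neurons (filters), each with 256 weights
print(comhe_energy(w))         # this value, scaled by a weight, would be added to the task loss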

Citation

If you find our work useful in your research, please consider citing:

@InProceedings{Lin20CoMHE,
    title={Regularizing Neural Networks via Minimizing Hyperspherical Energy},
    author={Lin, Rongmei and Liu, Weiyang and Liu, Zhen and Feng, Chen and Yu, Zhiding 
     and Rehg, James M. and Xiong, Li and Song, Le},
    booktitle={CVPR},
    year={2020}
}

Short Video Introduction

We also provide a short video introduction to help interested readers quickly go over our work and understand the essence of CoMHE. Please click the figure below to watch the YouTube video.

[Video thumbnail: DCNet_talk]

Requirements

  1. Python 3.6
  2. TensorFlow 1.14.0

Usage

Part 1: Clone the repository

git clone https://github.com/rmlin/CoMHE.git

Part 2: Download the official CIFAR-100 training and testing data (python version)

wget https://www.cs.toronto.edu/~kriz/cifar-100-python.tar.gz
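
After downloading, the archive needs to be extracted before training. Below is a minimal Python sketch for unpacking and sanity-checking the data; where train.py expects the extracted files to live is an assumption on our part, so check the data-path settings in each train.py.

import pickle
import tarfile

with tarfile.open("cifar-100-python.tar.gz") as tar:
    tar.extractall()                       # creates the cifar-100-python/ folder

with open("cifar-100-python/train", "rb") as f:
    train = pickle.load(f, encoding="latin1")

print(train["data"].shape)                 # (50000, 3072): 32x32x3 images, flattened
print(len(train["fine_labels"]))           # 50000 labels over 100 fine classes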

Part 3: Train and test with the following commands, each run from the corresponding folder.

# run random projection CoMHE
cd random_projection
python train.py
# run angle-preserving projection CoMHE
cd ../angle_projection
python train.py
# run adversarial projection CoMHE
cd ../adversarial_projection
python train.py

To change the hyperparameter settings of a CoMHE variant (e.g., random-projection CoMHE), refer to the corresponding train.py for the available input arguments, such as the projection dimension and the number of projections. The sketch below shows where these hyperparameters typically enter.
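
For orientation only, here is a hedged TensorFlow 1.x sketch of how such hyperparameters (projection dimension, number of projections, regularization weight) might be wired into training. The toy model, the variable names, and the constants are hypothetical placeholders, not the arguments actually exposed by train.py.

import tensorflow as tf

PROJ_DIM, NUM_PROJ, COMHE_WEIGHT = 8, 4, 1.0   # the hyperparameters in question

def comhe_penalty(kernel):
    # random-projection CoMHE energy of a conv kernel, one row per output neuron
    w = tf.transpose(tf.reshape(kernel, [-1, kernel.shape.as_list()[-1]]))
    terms = []
    for i in range(NUM_PROJ):
        p = tf.random.normal([w.shape.as_list()[1], PROJ_DIM], seed=i)
        v = tf.nn.l2_normalize(tf.matmul(w, p), axis=1)
        dist = tf.sqrt(tf.maximum(2.0 - 2.0 * tf.matmul(v, v, transpose_b=True), 1e-8))
        mask = 1.0 - tf.eye(tf.shape(v)[0])   # exclude self-distances
        terms.append(tf.reduce_sum(mask / (dist + 1e-8)))
    return tf.add_n(terms) / float(NUM_PROJ)

images = tf.placeholder(tf.float32, [None, 32, 32, 3])
labels = tf.placeholder(tf.int64, [None])
h = tf.layers.conv2d(images, 64, 3, activation=tf.nn.relu, name="conv1")
logits = tf.layers.dense(tf.reduce_mean(h, axis=[1, 2]), 100)

cross_entropy = tf.reduce_mean(
    tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels, logits=logits))
kernels = [v for v in tf.trainable_variables() if "conv" in v.name and "kernel" in v.name]
total_loss = cross_entropy + COMHE_WEIGHT * tf.add_n([comhe_penalty(k) for k in kernels])
train_op = tf.train.MomentumOptimizer(0.1, 0.9).minimize(total_loss)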

Results

The expected training and testing dynamics (loss and accuracy) can be found in the corresponding log folders:

  • Random projection CoMHE: log
  • Angle-preserving projection CoMHE: log
  • Adversarial projection CoMHE: log

Contact
