Sharpen Focus: Learning With Attention Separability and Consistency

An unofficial PyTorch implementation.

The current results don't indicate any significant improvement over the baseline. The implementation may contain errors, so please be cautious.

I'll be adding some visualisations to verify the attention maps.
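As a rough illustration of the kind of check planned here, a low-resolution attention map can be min-max normalised and upsampled to the input resolution before being overlaid on the image. This is a minimal NumPy sketch; the function name and the 8×8 map size are illustrative assumptions, not part of this repository:

```python
import numpy as np

def prepare_attention_overlay(att_map, out_size=32):
    """Min-max normalise a low-res attention map and upsample it
    (nearest-neighbour) to the input resolution for overlaying."""
    a = att_map.astype(np.float64)
    a = (a - a.min()) / (a.max() - a.min() + 1e-8)  # scale to [0, 1]
    scale = out_size // a.shape[0]
    return np.kron(a, np.ones((scale, scale)))      # repeat each cell scale x scale

rng = np.random.default_rng(0)
overlay = prepare_attention_overlay(rng.random((8, 8)), out_size=32)
print(overlay.shape)  # (32, 32)
```

The resulting array can be passed to e.g. `matplotlib`'s `imshow` with an alpha value on top of the input image.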

Results

| Model     | CIFAR-10 | CIFAR-100 | STL-10 |
|-----------|----------|-----------|--------|
| ResNet-18 | 94.25    | 73.32     | 81.85  |
| SFocus-18 | 94.16    | 71.30     | 82.77  |

Requirements

PyTorch 1.2

Usage

python main.py --dataset cifar10 --batch-size 128 --prefix run0 --epochs 350 --milestones 75 150 225 300
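The `--milestones` flag suggests a step-decay learning-rate schedule, where the learning rate is reduced at each milestone epoch. A minimal sketch of that behaviour in plain Python, assuming a base learning rate of 0.1 and a decay factor of 0.1 (both values are assumptions, not confirmed defaults of `main.py`):

```python
def lr_at_epoch(epoch, base_lr=0.1, milestones=(75, 150, 225, 300), gamma=0.1):
    """Step decay: multiply the base LR by gamma once per milestone passed."""
    return base_lr * gamma ** sum(epoch >= m for m in milestones)

print(lr_at_epoch(0))    # 0.1 (no milestone passed yet)
print(lr_at_epoch(100))  # decayed once, after the first milestone
```

In PyTorch this corresponds to `torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[75, 150, 225, 300], gamma=0.1)`.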

Acknowledgement

This work is built upon the following repositories:

  1. CBAM
  2. GAIN

Bibtex

Paper

@InProceedings{Wang_2019_ICCV,
author = {Wang, Lezi and Wu, Ziyan and Karanam, Srikrishna and Peng, Kuan-Chuan and Singh, Rajat Vikram and Liu, Bo and Metaxas, Dimitris N.},
title = {Sharpen Focus: Learning With Attention Separability and Consistency},
booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
month = {October},
year = {2019}
} 

Code

@misc{sfocus,
  author = {Singh, Aditya},
  title = {Sharpen Focus: Learning With Attention Separability and Consistency},
  year = {2020},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/MacroMayhem/SharpenFocus-Pytorch}}
}
