This is the PyTorch implementation of Double Attention Network, NIPS 2018

nguyenvo09/Double-Attention-Network

Double-Attention-Network

This is the PyTorch implementation of "A^2-Nets: Double Attention Networks", Y. Chen et al., NIPS 2018.

It can be used as an additional block when building models. Currently, the output tensor has shape (B, c_n, H, W); the original shape (B, c, H, W) can be restored with a single line of PyTorch code.
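For instance, assuming the block's output `y` has c_n channels, a 1x1 convolution is one such single line that projects it back to the original c channels (a sketch under assumed shapes, not the repository's code; the variable names here are illustrative):

```python
import torch
import torch.nn as nn

# Hypothetical sizes: original channels c, block output channels c_n.
B, c, c_n, H, W = 2, 64, 32, 8, 8
y = torch.randn(B, c_n, H, W)  # stand-in for the block's output

# One line: a 1x1 convolution maps (B, c_n, H, W) back to (B, c, H, W).
x = nn.Conv2d(c_n, c, kernel_size=1)(y)
print(tuple(x.shape))  # (2, 64, 8, 8)
```

A learned 1x1 convolution is one choice; if c_n was produced by flattening, a `view`/`reshape` call would be the non-parametric alternative.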

Layer architecture

  • Two attention steps (see the architecture figure in the repository)
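The two attention steps, feature gathering followed by feature distribution, can be sketched as a small PyTorch module. This is a minimal illustration following the paper's formulation and notation (c_m, c_n), not the repository's implementation; the class and attribute names are assumptions. Note that in this paper-style sketch the output carries c_m channels, whereas the repository reports an output shape of (B, c_n, H, W), so its channel naming may differ:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DoubleAttentionSketch(nn.Module):
    """Illustrative A^2 block: two attention steps over 1x1-conv projections."""
    def __init__(self, in_channels, c_m, c_n):
        super().__init__()
        self.conv_a = nn.Conv2d(in_channels, c_m, kernel_size=1)  # feature maps
        self.conv_b = nn.Conv2d(in_channels, c_n, kernel_size=1)  # attention maps
        self.conv_v = nn.Conv2d(in_channels, c_n, kernel_size=1)  # attention vectors

    def forward(self, x):
        b, _, h, w = x.shape
        feat = self.conv_a(x).view(b, -1, h * w)                     # (B, c_m, HW)
        attn = F.softmax(self.conv_b(x).view(b, -1, h * w), dim=-1)  # (B, c_n, HW)
        vecs = F.softmax(self.conv_v(x).view(b, -1, h * w), dim=1)   # (B, c_n, HW)
        # Step 1: feature gathering -> global descriptors (B, c_m, c_n)
        gathered = torch.bmm(feat, attn.transpose(1, 2))
        # Step 2: feature distribution back to every location -> (B, c_m, HW)
        z = torch.bmm(gathered, vecs)
        return z.view(b, -1, h, w)

x = torch.randn(2, 64, 8, 8)
z = DoubleAttentionSketch(in_channels=64, c_m=32, c_n=16)(x)
print(tuple(z.shape))  # (2, 32, 8, 8)
```

A final 1x1 convolution (or a reshape) can then restore the original channel count, matching the one-line reconstruction mentioned above.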
