Feature pyramid attention (FPA) modules #167

Open
daniel-j-h opened this issue May 30, 2019 · 0 comments

Comments

@daniel-j-h
Collaborator

We should look into the feature pyramid attention (FPA) module for pixel-precise attention on the segmentation features extracted from our ResNet encoder.

Pyramid Attention Network for Semantic Segmentation
https://arxiv.org/abs/1805.10180

[Image fpa-0: network overview, from https://arxiv.org/abs/1805.10180 Figure 2]

[Image fpa-1: FPA module, from https://arxiv.org/abs/1805.10180 Figure 3]
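To make the idea concrete, here is a rough PyTorch sketch of an FPA-style module: a global-pooling branch for context, a 1x1 main branch, and a U-shaped pyramid branch whose attention map multiplies the main branch. Channel counts, the single-channel attention map, and the use of max pooling plus bilinear upsampling are our simplifications, not the paper's exact configuration (the paper uses 7x7/5x5/3x3 convs in the pyramid):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class FPA(nn.Module):
    """Simplified feature pyramid attention, loosely after arXiv:1805.10180 Fig. 3."""

    def __init__(self, in_ch, out_ch):
        super().__init__()
        # global pooling branch: context vector broadcast back onto the map
        self.glob = nn.Conv2d(in_ch, out_ch, kernel_size=1)
        # main branch: 1x1 conv on the input features
        self.main = nn.Conv2d(in_ch, out_ch, kernel_size=1)
        # pyramid branch: larger kernels at coarser scales, as in the paper
        self.down1 = nn.Conv2d(in_ch, 1, kernel_size=7, padding=3)
        self.down2 = nn.Conv2d(1, 1, kernel_size=5, padding=2)
        self.down3 = nn.Conv2d(1, 1, kernel_size=3, padding=1)

    def forward(self, x):
        h, w = x.shape[2:]

        # global context vector, shape (N, out_ch, 1, 1)
        g = self.glob(F.adaptive_avg_pool2d(x, 1))

        # build the pyramid attention map coarse-to-fine
        p1 = self.down1(F.max_pool2d(x, 2))
        p2 = self.down2(F.max_pool2d(p1, 2))
        p3 = self.down3(F.max_pool2d(p2, 2))

        up = lambda t, ref: F.interpolate(t, size=ref, mode="bilinear", align_corners=False)
        p = up(p3, p2.shape[2:]) + p2
        p = up(p, p1.shape[2:]) + p1
        p = up(p, (h, w))

        # attend to the main branch, then add the global context
        return self.main(x) * p + g
```

For a ResNet encoder this would sit on the coarsest feature map, e.g. `FPA(2048, 256)` on the last stage's output.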

The global attention upsample (GAU) modules could be interesting to try out in place of our simple decoder modules, too.
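For reference, a minimal GAU-style decoder block might look like the following sketch: global context from the high-level features gates the low-level features before the skip addition. The sigmoid gating and channel sizes are our assumptions; the paper describes a 1x1 conv with batch norm on the pooled high-level features rather than this exact gate:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GAU(nn.Module):
    """Simplified global attention upsample block, loosely after arXiv:1805.10180."""

    def __init__(self, low_ch, high_ch, out_ch):
        super().__init__()
        self.low = nn.Conv2d(low_ch, out_ch, kernel_size=3, padding=1)
        self.high = nn.Conv2d(high_ch, out_ch, kernel_size=1)
        self.gate = nn.Conv2d(high_ch, out_ch, kernel_size=1)

    def forward(self, low, high):
        # global context of the high-level map gates the low-level features
        g = torch.sigmoid(self.gate(F.adaptive_avg_pool2d(high, 1)))
        low = self.low(low) * g

        # upsample the high-level branch and fuse via addition
        high = F.interpolate(self.high(high), size=low.shape[2:],
                             mode="bilinear", align_corners=False)
        return low + high
```

In a decoder this would replace a plain upsample-and-concat step, taking one encoder skip connection (`low`) and the previous decoder output (`high`) per stage.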
