
STN for only Attention Mechanism (Isotropic Scaling) #28

Open
akshath123 opened this issue Mar 26, 2020 · 0 comments

Comments

@akshath123

I was searching for STN implementations on GitHub and came across yours. I have a few questions about using an STN purely as an attention mechanism, with a fixed isotropic scale (say 0.5) and a localization network that predicts only the translation parameters (tx and ty).

Queries:

  1. Will your Spatial Transformer module work unchanged in this setting?
  2. Should the localization network predict only the two parameters (tx and ty)?
  3. I expand theta from shape (2,) to (2, 3) with the following steps:
    a. [tx, ty] * [0, 0, 1] --> [[0, 0, tx], [0, 0, ty]]
    b. [[0, 0, tx], [0, 0, ty]] + [[0.5, 0, 0], [0, 0.5, 0]] --> [[0.5, 0, tx], [0, 0.5, ty]]
    The resulting theta is then fed to the spatial transformer. Is this construction still differentiable, and will it work?
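For reference, here is a minimal PyTorch sketch of the two-step theta construction described in query 3. The localization head (`self.loc`) and its layer sizes are hypothetical placeholders; only the fixed scale 0.5 and the theta construction come from the question. Since theta is built from the predicted (tx, ty) with element-wise multiplication and addition of constants, gradients flow back to the localization network, so it remains differentiable:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TranslationOnlySTN(nn.Module):
    """STN as an attention mechanism: fixed isotropic scale, learned (tx, ty)."""

    def __init__(self, scale=0.5, in_channels=3):
        super().__init__()
        self.scale = scale
        # Hypothetical localization network that predicts only (tx, ty).
        self.loc = nn.Sequential(
            nn.AdaptiveAvgPool2d(4),
            nn.Flatten(),
            nn.Linear(in_channels * 4 * 4, 32),
            nn.ReLU(),
            nn.Linear(32, 2),  # -> (tx, ty)
        )

    def forward(self, x):
        n = x.size(0)
        txy = self.loc(x)  # (N, 2)
        # Step a: [tx, ty] * [0, 0, 1] -> [[0, 0, tx], [0, 0, ty]]
        base = txy.unsqueeze(2) * torch.tensor([0., 0., 1.], device=x.device)
        # Step b: add the fixed-scale matrix [[0.5, 0, 0], [0, 0.5, 0]]
        fixed = torch.tensor(
            [[self.scale, 0., 0.], [0., self.scale, 0.]], device=x.device
        ).expand(n, -1, -1)
        theta = base + fixed  # (N, 2, 3); differentiable w.r.t. txy
        grid = F.affine_grid(theta, x.size(), align_corners=False)
        return F.grid_sample(x, grid, align_corners=False)
```

Both steps are simple tensor arithmetic on the localization output, so autograd tracks them; `F.affine_grid` and `F.grid_sample` are themselves differentiable with respect to theta.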