
Rotation #80

Open
adamas-v opened this issue Mar 18, 2022 · 3 comments

Comments

@adamas-v

Nice work, but when I visualized the HPatches matches, SuperPoint failed in all rotation cases. Maybe rotation is not included in the training augmentation?

@ghost

ghost commented Oct 1, 2022

when I visualized the HPatches matches, SuperPoint failed in all rotation cases
Me too. In my worst case, the HPatches point matching produced only two match lines.
I don't know why; perhaps some steps are missing from this code.

@adamas-v
Author

adamas-v commented Oct 1, 2022

when I visualized the HPatches matches, SuperPoint failed in all rotation cases
Me too. In my worst case, the HPatches point matching produced only two match lines.
I don't know why; perhaps some steps are missing from this code.

That's because the pre-trained model was trained with very few rotation samples, so it lacks robustness to rotation.
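
For anyone hitting the same failure, here is a minimal sketch of adding in-plane rotation to the homographic training augmentation, assuming an OpenCV/NumPy pipeline. The function names and the max_angle_deg parameter are illustrative, not names used in this repo:

```python
import numpy as np
import cv2


def sample_rotation_homography(h, w, max_angle_deg=90.0, rng=None):
    """Sample a random in-plane rotation about the image center as a 3x3 homography.

    Illustrative only: the names and config knobs here are not from this repo.
    """
    rng = rng or np.random.default_rng()
    angle = float(rng.uniform(-max_angle_deg, max_angle_deg))
    # 2x3 rotation matrix about the image center, lifted to a 3x3 homography.
    M = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), angle, 1.0)
    return np.vstack([M, [0.0, 0.0, 1.0]]).astype(np.float32)


def augment_with_rotation(image, keypoints_xy, max_angle_deg=90.0):
    """Warp an image and its (N, 2) keypoints with a sampled rotation homography."""
    h, w = image.shape[:2]
    H = sample_rotation_homography(h, w, max_angle_deg)
    warped = cv2.warpPerspective(image, H, (w, h))
    pts = cv2.perspectiveTransform(
        keypoints_xy.reshape(-1, 1, 2).astype(np.float32), H)
    return warped, pts.reshape(-1, 2), H
```

In practice this rotation component would be composed with the existing translation/scale/perspective parts of the sampled homography rather than applied on its own.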

@ghost

ghost commented Oct 1, 2022

That's because the pre-trained model was trained with very few rotation samples, so it lacks robustness to rotation.

Oh, I see. To deal with this problem, it would be better to increase the number of rotation samples used to train the pre-trained model.
This code also doesn't mention the step of training MagicPoint on MS-COCO after Homographic Adaptation; that may be related to this problem as well.
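
For context, the training recipe in the SuperPoint paper is roughly: (1) train MagicPoint on Synthetic Shapes, (2) run Homographic Adaptation on MS-COCO to export pseudo-ground-truth keypoints, (3) train the full SuperPoint network on those labels (optionally iterating). Below is a rough sketch of step (2), with placeholder names since this repo's interfaces may differ; if the homography sampler in this step includes rotations, the exported labels and the resulting model become more rotation-robust:

```python
import numpy as np
import cv2


def homographic_adaptation(image, detect_heatmap, sample_homography, num_h=100):
    """Aggregate detections over random homographies (Homographic Adaptation).

    `detect_heatmap(img) -> HxW score map` and `sample_homography(h, w) -> 3x3`
    are placeholders for the repo's MagicPoint detector and homography sampler.
    """
    h, w = image.shape[:2]
    acc = detect_heatmap(image).astype(np.float32)    # identity warp first
    count = np.ones_like(acc)
    for _ in range(num_h - 1):
        H = sample_homography(h, w)                   # include rotations here
        warped = cv2.warpPerspective(image, H, (w, h))
        heat = detect_heatmap(warped).astype(np.float32)
        # Warp the heatmap and a validity mask back into the original frame.
        H_inv = np.linalg.inv(H)
        acc += cv2.warpPerspective(heat, H_inv, (w, h))
        count += cv2.warpPerspective(np.ones_like(heat), H_inv, (w, h))
    return acc / np.maximum(count, 1e-6)              # averaged pseudo-labels
```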
