
About Binomial loss hyper-parameter alpha & beta #50

Open
DJacobJiang opened this issue Jul 13, 2020 · 4 comments

@DJacobJiang

How do I select appropriate alpha and beta in Binomial loss for different datasets? Thanks a lot!

DJacobJiang (Author) commented Jul 13, 2020

I also changed the backbone from BN-Inception to ResNet50 and ResNet152. The code works, but the top R@1 only reached 78.61 and 80.03 on Cars196. Could you please give me some suggestions on adjusting the parameters or the network structure?
Thank you so much!

bnu-wangxun (Owner) commented

You can grid-search alpha and beta in Binomial loss for each dataset.
For small datasets such as Cars196 and CUB200, tricks like freezing BN can improve performance a lot.
By the way, how many embedding dimensions do you use for ResNet50 and ResNet152?
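For reference, here is a minimal NumPy sketch of a binomial deviance loss with the alpha/beta weighting discussed above. The default values and the margin are illustrative placeholders, not the repo's actual settings; in practice alpha and beta would be chosen by the grid search suggested here, scored by validation R@1 rather than by training loss.

```python
import numpy as np

def binomial_deviance_loss(sim, labels, alpha=2.0, beta=2.0, margin=0.5):
    """Binomial deviance over a pairwise similarity matrix.

    sim:    (n, n) cosine similarities of an embedding batch
    labels: (n,) integer class ids
    alpha weights the penalty on positive pairs, beta on negatives.
    """
    pos = labels[:, None] == labels[None, :]   # same-class pair mask
    np.fill_diagonal(pos, False)               # drop self-pairs
    neg = labels[:, None] != labels[None, :]
    # positives are pulled above the margin, negatives pushed below it
    pos_loss = np.log1p(np.exp(-alpha * (sim[pos] - margin)))
    neg_loss = np.log1p(np.exp(beta * (sim[neg] - margin)))
    return (pos_loss.sum() + neg_loss.sum()) / sim.shape[0]

# sanity check: well-separated embeddings should score lower than collapsed ones
labels = np.array([0, 0, 1, 1])
good = np.array([[1, 0], [1, 0], [0, 1], [0, 1]], dtype=float)
collapsed = np.ones((4, 2)) / np.sqrt(2.0)
loss_good = binomial_deviance_loss(good @ good.T, labels)
loss_bad = binomial_deviance_loss(collapsed @ collapsed.T, labels)
```

A grid search would then loop `(alpha, beta)` over something like `itertools.product([2, 10, 40], [0.5, 2, 10])`, train (or fine-tune) with each pair, and keep the one with the best validation R@1.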

DJacobJiang (Author) commented

Thank you for your reply!
In the experiments, I used 256, 512, and 1024 as the embedding size, respectively.
512 dimensions seems best, but R@1 is still limited to around 80.

bnu-wangxun (Owner) commented

This code was built 3 years ago, and I have since open-sourced new repos for MS loss and XBM:

repo-1: https://github.com/MalongTech/research-ms-loss

repo-2: https://github.com/MalongTech/research-xbm

You can reimplement Binomial loss following the MS loss in repo-1, and the results may be better.
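As a rough idea of what "following the MS loss" could mean, the sketch below rewrites the binomial-style pair loss in the per-anchor log-sum-exp form that MS loss uses, aggregating each anchor's positives and negatives before the log. This is a hedged NumPy illustration, not the code from repo-1, and the parameter defaults are placeholders.

```python
import numpy as np

def binomial_loss_ms_form(sim, labels, alpha=2.0, beta=2.0, margin=0.5):
    """Pair loss in the per-anchor log-sum-exp form used by MS loss.

    For each anchor i, positives are aggregated inside one log1p term
    scaled by 1/alpha, negatives inside another scaled by 1/beta.
    """
    n = sim.shape[0]
    total = 0.0
    for i in range(n):
        pos = labels == labels[i]
        pos[i] = False                          # exclude the anchor itself
        neg = labels != labels[i]
        if pos.any():
            total += np.log1p(np.exp(-alpha * (sim[i, pos] - margin)).sum()) / alpha
        if neg.any():
            total += np.log1p(np.exp(beta * (sim[i, neg] - margin)).sum()) / beta
    return total / n

# same sanity check as before: separated embeddings should score lower
labels = np.array([0, 0, 1, 1])
good = np.array([[1, 0], [1, 0], [0, 1], [0, 1]], dtype=float)
collapsed = np.ones((4, 2)) / np.sqrt(2.0)
loss_good = binomial_loss_ms_form(good @ good.T, labels)
loss_bad = binomial_loss_ms_form(collapsed @ collapsed.T, labels)
```

The per-anchor aggregation is the key structural difference from the plain per-pair sum: hard positives and hard negatives dominate each anchor's term, which is part of what the MS loss paper exploits.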
