
Accuracy on MSCOCO and CIFAR-10 #17

Open
tyler-tan opened this issue May 2, 2023 · 2 comments

Comments

@tyler-tan

Hi, thank you for your great work on TSA.
I have a question I would like to ask. When I run the Varying-Way Five-Shot and Five-Way One-Shot scenarios with the URL parameter file you provided, the accuracy on the MSCOCO and CIFAR-10 datasets is more than 5% lower than the numbers you report, while the accuracy on the other datasets differs from yours by only 1-2%. Have you performed any additional operations on the MSCOCO and CIFAR-10 datasets?
Looking forward to your answer.

@WeiHongLee
Collaborator

Hi,

thank you for your question. I think it may be due to noise during meta-testing, e.g. the optimization and sampling can introduce noise that affects the results. It can also be caused by the problem reported in google-research/meta-dataset#54 in the meta-dataset repo. I recommend using our code and re-running the experiments for all methods. If you want to reproduce the results shown in the paper, you can set shuffle_buffer_size=0 in the reader file, which was the issue reported in google-research/meta-dataset#54 for the original meta-dataset, and make sure you use the shuffled datasets as mentioned in that issue.

In our paper, all methods are evaluated under the same setting (shuffle_buffer_size=0 with shuffled datasets) for the 5-shot and 1-shot settings. The ranking would remain the same, though the results can be slightly affected by setting shuffle_buffer_size=1000.
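For anyone unsure why shuffle_buffer_size changes the sampled episodes at all: a tf.data-style shuffle buffer only reorders elements within a sliding window of that size, so buffer size 0 (or 1) leaves the stream order untouched while a larger buffer perturbs it and therefore changes which examples land in each episode. This is not code from the TSA or meta-dataset repos, just a toy sketch of the mechanism:

```python
import random

def shuffle_buffer(stream, buffer_size, seed=0):
    """Mimic a tf.data-style shuffle: hold up to `buffer_size` elements
    and emit a randomly chosen one as each new element arrives.
    A buffer size of 0 or 1 preserves the original stream order."""
    rng = random.Random(seed)
    buf = []
    for item in stream:
        if buffer_size <= 1:
            yield item          # no buffering: deterministic order
            continue
        buf.append(item)
        if len(buf) >= buffer_size:
            yield buf.pop(rng.randrange(len(buf)))
    while buf:                  # drain the remaining buffered elements
        yield buf.pop(rng.randrange(len(buf)))

examples = list(range(10))
print(list(shuffle_buffer(examples, 0)))   # same order as the input
print(list(shuffle_buffer(examples, 5)))   # generally reordered
```

So with shuffle_buffer_size=0 two runs over the same (pre-shuffled) dataset files see identical example order, which is why that setting makes paper results reproducible.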

Best!

@tyler-tan
Author

Hi,
Thank you for your answer; it helped me a lot.
