
Question about training epoch? #6

Open
gaow0007 opened this issue Apr 12, 2019 · 2 comments

@gaow0007

I have a question about the number of training epochs. Why do you train for 600-2000 epochs to validate the superiority of your method? That epoch count seems very large; I usually train on these tiny datasets for only about 200 epochs. Is there a reason for this setting?

Best

@alexmlamb
Collaborator

We achieved better results when running for more epochs (for both manifold mixup and our baselines), but we definitely saw an improvement with manifold mixup over input mixup at 600 epochs.

I think it helps even with fewer epochs (<600), but I don't recall whether I've actually run that experiment.

@vikasverma1077
Owner

Another way to think about this is that Manifold Mixup is a stronger regularizer, and hence you need more epochs when training with it.
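
For readers landing here: below is a minimal, illustrative PyTorch sketch of a single Manifold Mixup training step, to make the "stronger regularizer" point concrete. It mixes hidden states at a randomly chosen layer and uses the matching interpolated loss; the names (`manifold_mixup_step`, `blocks`, `classifier`) are hypothetical and do not match this repo's actual code.

```python
import random
import torch
import torch.nn.functional as F

def manifold_mixup_step(blocks, classifier, x, y, alpha=2.0):
    """One forward pass that mixes hidden representations at a random layer.

    blocks:     list of nn.Module stages the network is split into
    classifier: final nn.Module mapping the last hidden state to logits
    """
    # Mixing coefficient lam ~ Beta(alpha, alpha), as in mixup
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    # Random pairing of examples within the batch
    perm = torch.randperm(x.size(0), device=x.device)
    # Layer at which to mix; 0 reduces to plain input mixup
    mix_layer = random.randrange(len(blocks) + 1)

    h = x
    for k, block in enumerate(blocks):
        if k == mix_layer:
            h = lam * h + (1.0 - lam) * h[perm]  # mix hidden states
        h = block(h)
    if mix_layer == len(blocks):                 # mixing after the last block
        h = lam * h + (1.0 - lam) * h[perm]

    logits = classifier(h)
    # Loss is the same convex combination of the two sets of targets
    return lam * F.cross_entropy(logits, y) + (1.0 - lam) * F.cross_entropy(logits, y[perm])
```

Every update is computed on interpolated activations with an interpolated loss, so each individual example contributes a softer training signal than in plain ERM; that is one way to read the observation above that more epochs help.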
