
Duplicating hyperparameters when training a FactorVAE #74

Open
SantiagoJN opened this issue Mar 9, 2023 · 0 comments
Hi!

I've been playing a little bit with the code (congratulations on the work, by the way 😄), and I've seen that when training a FactorVAE model, both the batch size and the number of epochs are doubled:

disentangling-vae/main.py, lines 191 to 194 at f045219:

```python
if args.loss == "factor":
    logger.info("FactorVae needs 2 batches per iteration. To replicate this behavior while being consistent, we double the batch size and the the number of epochs.")
    args.batch_size *= 2
    args.epochs *= 2
```

Does anybody know the reason behind this operation? I've reviewed the original paper, but I couldn't find anything related to it.
Thanks for the help!
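For context, here is my understanding of the bookkeeping, with hypothetical numbers and assuming each FactorVAE iteration splits the doubled batch into two halves (one for the VAE update, one for the discriminator) — this is a sketch, not the repo's actual code:

```python
# Hypothetical illustration of why doubling both batch size and epochs
# keeps the total number of gradient updates unchanged, assuming each
# FactorVAE iteration consumes two half-batches.
n_samples = 10000
batch_size, epochs = 64, 100

# Standard VAE: one batch per iteration.
updates_vae = (n_samples // batch_size) * epochs

# FactorVAE with doubled batch size: each loaded batch is split into two
# halves, so there are half as many iterations per epoch. Doubling the
# number of epochs restores the original number of gradient updates.
updates_factor = (n_samples // (2 * batch_size)) * (2 * epochs)

assert updates_vae == updates_factor
```

If that reading is right, the doubling is just a way to feed the loss two independent half-batches per step while keeping the effective batch size and update count consistent with the other losses — but I'd appreciate confirmation.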
