Having 2 optimizers #14

Open
JoaoLages opened this issue Aug 22, 2018 · 3 comments
@JoaoLages

Hi there! Thank you for making this implementation open-source!
I have one question, though: although you have only one backward step, you use two optimizers. Shouldn't you combine both models' parameters and use a single optimizer?

@Sandeep42
Contributor

In hindsight, I would have used a single optimiser, something like this:

optim.Adam(list(model1.parameters()) + list(model2.parameters()))

At the time, I was new to PyTorch and didn't know this. You can go ahead and use one optimiser for much cleaner code.
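
In full, that single-optimizer setup looks roughly like the sketch below; the two module definitions are just placeholders, not the actual models from this repo.

import torch.nn as nn
import torch.optim as optim

# Placeholder modules standing in for the two models trained in this repo.
model1 = nn.Linear(128, 64)
model2 = nn.Linear(64, 10)

# One Adam optimizer over both models' parameters, instead of one optimizer per model.
optimizer = optim.Adam(list(model1.parameters()) + list(model2.parameters()), lr=1e-3)

# The training step then reduces to a single zero_grad / backward / step sequence:
# optimizer.zero_grad(); loss.backward(); optimizer.step()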

@JoaoLages
Author

Thanks for your reply. That is what I am doing. Nevertheless, the loss seems to decrease much faster with two optimizers than with one. What might be the reason for this?

Moreover, I have changed the optimizer to Adam but haven't been able to get a BCE loss lower than ~0.255 for a multi-label classification problem. Any suggestions?

@JoaoLages
Author

Never mind, I had a typo; two optimizers vs. one optimizer produce more or less the same results, it seems. Still having the loss problem, though.
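
(For anyone hitting the same plateau: a typical multi-label BCE setup looks roughly like the sketch below; the batch size and label count are illustrative. One common pitfall is applying a sigmoid before BCEWithLogitsLoss, which already expects raw logits.)

import torch
import torch.nn as nn

batch_size, num_labels = 8, 5                      # illustrative sizes
logits = torch.randn(batch_size, num_labels)       # raw model outputs, no sigmoid applied
targets = torch.randint(0, 2, (batch_size, num_labels)).float()

# BCEWithLogitsLoss fuses the sigmoid with the binary cross-entropy in one
# numerically stable call, so the model must output raw logits, not probabilities.
criterion = nn.BCEWithLogitsLoss()
loss = criterion(logits, targets)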
