

Adding the LBFGS optimizer from PyTorch #79

Open
e-eight opened this issue Jun 9, 2021 · 12 comments · May be fixed by #81
Labels
new feature Feature request to work on

Comments

@e-eight

e-eight commented Jun 9, 2021

Hi,

I am trying to use the BaggingRegressor model with shallow estimators on a small dataset, for which the LBFGS optimizer usually gives good results with a single estimator. However, I see that the LBFGS optimizer from PyTorch is not included in torchensemble's accepted list of optimizers. Would it be possible to add the LBFGS optimizer to the accepted list, or is there another way I can use it with torchensemble for my work?

Thanks

@xuyxu
Member

xuyxu commented Jun 10, 2021

Hi @e-eight, thanks for reporting! Could you provide an example of how to use the LBFGS optimizer, for example:

for batch_idx, (data, target) in enumerate(dataloader):
    # Here is your code

According to this introduction, it looks like using the LBFGS optimizer is different from other optimizers.

@xuyxu xuyxu added the new feature Feature request to work on label Jun 10, 2021
@e-eight
Author

e-eight commented Jun 10, 2021

You can try it this way:

for batch_idx, (data, target) in enumerate(dataloader):

    # Code for sampling with replacement for bagging, or the corresponding
    # code for other models.

    # Optimization: LBFGS needs a closure that re-evaluates the model and
    # returns the loss, since it may call the closure several times per step.
    def closure():
        if torch.is_grad_enabled():
            optimizer.zero_grad()
        sampling_output = estimator(*sampling_data)
        loss = criterion(sampling_output, sampling_target)
        if loss.requires_grad:
            loss.backward()
        return loss

    loss = optimizer.step(closure)  # step() returns the loss from the closure

    # `loss` can now be used for monitoring; there is no need to run a
    # separate forward pass.

This way of optimizing should work with both LBFGS and other optimizers, such as Adam; at least it has worked for me with single estimators. You can find more details on the LBFGS optimizer here.
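To make the pattern concrete, here is a self-contained sketch of the closure-based loop on toy data (the model and data below are made up for illustration; the bagging/sampling code from the snippet above is omitted):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy 1-D linear regression data (hypothetical stand-ins for the
# `sampling_data` / `sampling_target` in the snippet above).
data = torch.linspace(-1, 1, 64).unsqueeze(1)
target = 3.0 * data + 0.5

estimator = nn.Linear(1, 1)
criterion = nn.MSELoss()
optimizer = torch.optim.LBFGS(estimator.parameters(), lr=1.0)

def closure():
    # LBFGS may evaluate this closure several times per step(), so it
    # must recompute the loss and gradients from scratch on each call.
    if torch.is_grad_enabled():
        optimizer.zero_grad()
    output = estimator(data)
    loss = criterion(output, target)
    if loss.requires_grad:
        loss.backward()
    return loss

for _ in range(5):
    loss = optimizer.step(closure)  # step() returns the closure's loss
```

On this convex problem the fit converges quickly; the same loop body also works with optimizers like Adam if `optimizer.step(closure)` is replaced by the usual `closure(); optimizer.step()` sequence.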

@xuyxu
Member

xuyxu commented Jun 10, 2021

Thanks for your explanation. After reading the introduction, I think there should be no problem supporting the LBFGS optimizer. Wondering if you would be interested in working on this feature request ;-)

@e-eight
Author

e-eight commented Jun 10, 2021

Sure, I will be happy to work on it! I will get started then, and comment here if I run into any problems.

@xuyxu
Member

xuyxu commented Jun 10, 2021

Glad to hear that 😄. Here are some instructions on what to do next:

  • Add your contribution to CHANGELOG.rst
  • Add the LBFGS optimizer to the set_optimizer method in torchensemble/utils/set_module.py
  • Update the docstring __set_optimizer_doc in torchensemble/_constants.py
  • Try to modify the training loop in torchensemble/fusion.py to see if the LBFGS optimizer and the other optimizers all work as expected. After that, we can modify the other ensembles similarly.
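For reference, extending a name-based optimizer dispatcher to accept LBFGS might look roughly like this (a hypothetical sketch; the actual signature and contents of set_optimizer in torchensemble/utils/set_module.py may differ):

```python
import torch

def set_optimizer(estimator, optimizer_name, **kwargs):
    """Return a torch optimizer for `estimator` chosen by name.

    Hypothetical sketch of a name-based dispatcher; not the real
    torchensemble implementation.
    """
    torch_optimizers = {
        "Adam": torch.optim.Adam,
        "SGD": torch.optim.SGD,
        "RMSprop": torch.optim.RMSprop,
        "LBFGS": torch.optim.LBFGS,  # the newly accepted optimizer
    }
    if optimizer_name not in torch_optimizers:
        raise NotImplementedError(
            f"Unsupported optimizer: {optimizer_name}"
        )
    return torch_optimizers[optimizer_name](estimator.parameters(), **kwargs)

# Usage: extra keyword arguments are forwarded to the optimizer constructor.
opt = set_optimizer(torch.nn.Linear(2, 1), "LBFGS", lr=0.5)
```

The training-loop change is the larger part of the work, since LBFGS needs the closure-based `optimizer.step(closure)` call shown earlier, while the other optimizers use the plain `loss.backward(); optimizer.step()` sequence.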

Feel free to ask me anything in this issue or your pull request.

@xuyxu
Member

xuyxu commented Jun 10, 2021

@all-contributors please add @e-eight for code

@allcontributors
Contributor

@xuyxu

I've put up a pull request to add @e-eight! 🎉

@xuyxu
Member

xuyxu commented Jun 10, 2021

@e-eight it would be better to open a PR on your own.

@e-eight
Author

e-eight commented Jun 12, 2021

I have written the code, but I am not sure of the best way to test it. I was thinking of running the Year Prediction example in the examples folder, but with the LBFGS optimizer. Do you have any suggestions? Thanks!

@xuyxu
Member

xuyxu commented Jun 13, 2021

Hi @e-eight, I am not sure I understand your problem correctly. Perhaps you could open a pull request based on your current code, and we can discuss it there. For now there is no need to pass all the checks; simply upload your code so that I can take a look and better understand your problem ;-)

@e-eight
Author

e-eight commented Jun 15, 2021

Added pull request #81.

@xuyxu
Member

xuyxu commented Jun 15, 2021

Thanks @e-eight for your PR. I am kind of busy for the next couple of days; I will get back to you soon.

@xuyxu xuyxu linked a pull request Jun 22, 2021 that will close this issue