
[FEATURE] Support for minimum learning rate #671

Open
tmostak opened this issue Apr 19, 2024 · 1 comment
Labels
type/feature Feature request

Comments


tmostak commented Apr 19, 2024

🚀 Feature

Support for specifying a minimum learning rate.

Motivation

In the research literature, a minimum learning rate is often set when fine-tuning a model with a cosine or linear schedule, preventing the learning rate from decaying to 0 by the end of training. It would be very helpful if this could be supported in the LLM Studio UX.
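For illustration, here is a minimal sketch of what such a scheduler could look like, built on PyTorch's `LambdaLR`. The function name `cosine_schedule_with_min_lr` and the `min_lr_ratio` parameter are hypothetical, not an existing transformers or LLM Studio API:

```python
import math

import torch
from torch.optim import AdamW
from torch.optim.lr_scheduler import LambdaLR


def cosine_schedule_with_min_lr(optimizer, num_warmup_steps, num_training_steps, min_lr_ratio=0.1):
    """Cosine decay from the base LR down to min_lr_ratio * base_lr instead of 0."""

    def lr_lambda(step):
        # Linear warmup from 0 up to the base LR.
        if step < num_warmup_steps:
            return step / max(1, num_warmup_steps)
        # Cosine decay, rescaled so the multiplier bottoms out at min_lr_ratio.
        progress = (step - num_warmup_steps) / max(1, num_training_steps - num_warmup_steps)
        cosine = 0.5 * (1.0 + math.cos(math.pi * min(1.0, progress)))
        return min_lr_ratio + (1.0 - min_lr_ratio) * cosine

    return LambdaLR(optimizer, lr_lambda)


# Usage: the LR anneals from 1e-4 down to a 1e-5 floor (10%) rather than to 0.
model = torch.nn.Linear(8, 8)
optimizer = AdamW(model.parameters(), lr=1e-4)
scheduler = cosine_schedule_with_min_lr(optimizer, num_warmup_steps=100, num_training_steps=1000)
```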

tmostak added the type/feature Feature request label Apr 19, 2024
pascal-pfeiffer (Collaborator) commented

There is actually an open issue for that at transformers: huggingface/transformers#28441
Once merged there, we can quickly integrate it into H2O LLM Studio, as we are already using the transformers schedulers.
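For context, the transformers schedulers mentioned here are created roughly like this today, with no minimum-LR floor; this is a sketch, and the exact wiring inside LLM Studio may differ:

```python
import torch
from torch.optim import AdamW
from transformers import get_cosine_schedule_with_warmup

model = torch.nn.Linear(8, 8)
optimizer = AdamW(model.parameters(), lr=1e-4)
# The LR decays to 0 by num_training_steps; there is currently no minimum-LR argument.
scheduler = get_cosine_schedule_with_warmup(
    optimizer, num_warmup_steps=100, num_training_steps=1000
)
```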
