🚀 Feature
Support for specification of a minimum learning rate
Motivation
In the research literature, a minimum learning rate is often set when fine-tuning a model with a cosine or linear schedule, preventing the learning rate from decaying to 0 by the end of training. It would be very helpful if this could be supported in the LLM Studio UX.
There is actually an open issue for that at transformers: huggingface/transformers#28441
Once merged there, we can quickly integrate it into H2O LLM Studio, as we are already using the transformers schedulers.