-
As per the title, I would like to know if there is a way to do this, as it would speed up training time. 🙏
Replies: 2 comments
-
If you are talking about universal pretrained models, the answer is no. Universal pretrained models aren't that useful, because they are bound to dictionaries, and you cannot modify most of the config keys either. You can reuse your former checkpoints as your private pretrained models, but you cannot change the dictionary or add/remove most functionalities. To use it, increase What do you expect pretrained models to do?
-
Finetuning will be supported in #108. It will be merged soon and formally released in v2.1.0.