Azure ML define transformer path from a data asset/datastore within the config.cfg file #13367
Unanswered
EY4L asked this question in Help: Coding & Implementations
Replies: 0 comments
I am attempting to train a transformer-based model in Azure ML. Currently, the roberta-base weights are downloaded from scratch on every job run.
Has anyone found an approach for defining paths in the config.cfg file so that a training job can read the transformer weights from a datastore, a data asset, or the model registry instead?
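For what it's worth, one sketch of the config side, assuming this is a spaCy `config.cfg` with the usual spacy-transformers setup: declare a path variable and interpolate it into the transformer's `name`, which Hugging Face `from_pretrained` will happily resolve as a local directory. The variable name `transformer_weights` here is a hypothetical example, not anything spaCy mandates.

```ini
# Sketch of a spaCy config.cfg fragment (assumes spacy-transformers).
# The actual weights directory is injected at run time via a CLI override,
# so the config itself stays environment-agnostic.
[paths]
transformer_weights = null

[components.transformer.model]
@architectures = "spacy-transformers.TransformerModel.v3"
# Interpolated at load time; may be "roberta-base" locally or a mounted
# directory in the cloud job.
name = ${paths.transformer_weights}
```

The Azure ML job's command could then pass the mounted data-asset path through, e.g. `python -m spacy train config.cfg --paths.transformer_weights ${{inputs.roberta_dir}}`, where `roberta_dir` is a hypothetical `uri_folder` input bound to a data asset or datastore folder containing the saved roberta-base files.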
Alternatively, I would appreciate any other solutions that work within a cloud-based training system, or a general approach that could be taken to solve this issue.
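One general alternative, independent of the config file: point the Hugging Face cache at storage that persists across jobs (e.g. a mounted datastore), so the download happens once and later runs hit the cache. A minimal sketch, assuming the mount path arrives via an environment variable; `HF_CACHE_DIR` and the `/mnt/...` default are hypothetical names for illustration.

```python
import os

# Hypothetical path where an Azure ML data asset or datastore folder is
# mounted into the job container; the real value would come from the
# job's input binding.
hf_cache_dir = os.environ.get("HF_CACHE_DIR", "/mnt/azureml/hf-cache")

# Redirect the Hugging Face cache to the mounted storage *before* any
# transformers/spaCy import, so roberta-base downloaded by one run is
# reused by every subsequent run (assumes the mount is writable).
os.environ["HF_HOME"] = hf_cache_dir
os.environ["TRANSFORMERS_CACHE"] = os.path.join(hf_cache_dir, "transformers")
```

This keeps the training script unchanged; only the job definition needs to mount the storage and set the variable.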
Thank you very much.