Hi! I am attempting to implement the GEM model on my laptop. I want to extract the embedding layer of the GEM model so that it can be reused across several datasets with a varying number of tasks. I am having trouble writing the code to get the embedding layer of the model (not inference with the model, but the step before). I was wondering whether this has been implemented anywhere.
I previously used GEM to fine-tune a lipophilicity prediction task; this is the GEM code I collated at the time: https://github.com/fanxiaoyu0/GEM
Hi karthikjetty,
As for the embedding layer, you can get it from the model you have saved. To be more specific, first load the parameter dict with paddle.load(param_path), then filter the params for keys containing the 'embedding' string, like this: emb_param = [param[k] for k in param.keys() if 'embedding' in k]. Then you can do what you want with the embedding layer. Hope this is helpful to you.
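A minimal sketch of that filtering step, assuming the checkpoint loads into a dict mapping parameter names to tensors (the usual shape of a paddle.load result). The path "model.pdparams" and the parameter names below are hypothetical stand-ins; a plain dict mimics the loaded checkpoint so the filtering logic itself runs without Paddle installed:

```python
# Hypothetical stand-in for a loaded GEM checkpoint. With Paddle installed
# you would instead do:
#   import paddle
#   params = paddle.load("model.pdparams")   # "model.pdparams" is a placeholder path
params = {
    "encoder.embedding.weight": [0.1, 0.2],  # embedding parameter (kept)
    "encoder.layer0.weight": [0.3],          # non-embedding parameter (dropped)
    "readout.embedding.bias": [0.0],         # embedding parameter (kept)
}

# Keep only parameters whose name contains 'embedding'; using a dict
# comprehension preserves the name -> tensor mapping, which is handy
# if you later want to load these weights into a new model.
emb_params = {k: v for k, v in params.items() if "embedding" in k}

print(sorted(emb_params))
```

Keeping the names (rather than a bare list of tensors) makes it easier to copy the filtered weights into a fresh model's state dict for a different dataset.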