
Get Embedding Layer #249

Open
karthikjetty opened this issue Feb 13, 2023 · 2 comments

Comments

@karthikjetty

Hi! I am currently attempting to implement the GEM model on my laptop. I am trying to extract the embedding layer of the GEM model so that it can be reused across several datasets with varying numbers of tasks. I am having trouble writing the code to get the embedding layer of the model (not an inference of the model, but the step before). I was wondering if this is implemented anywhere.

@fanxiaoyu0

I previously used GEM to fine-tune the lipophilicity prediction task, and this is the GEM code I previously collated. https://github.com/fanxiaoyu0/GEM

@Noisyntrain
Collaborator

Hi karthikjetty,
As for the embedding layer, you can get it from the model you have saved. To be more specific, you can first get the parameter dict with paddle.load(param_path), then filter the parameters by keys containing the 'embedding' string, like this:
emb_param = [param[k] for k in param.keys() if 'embedding' in k]. Then you can do what you want with the embedding layer. Hope this is helpful to you.
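To make the suggestion above concrete, here is a minimal sketch of the key-filtering step. The checkpoint keys and `param_path` are hypothetical placeholders, and a toy dict stands in for the checkpoint so the filtering logic can run without PaddlePaddle installed; with a real GEM checkpoint you would obtain `param` via `paddle.load(param_path)` instead.

```python
def extract_embedding_params(param):
    """Return the sub-dict of parameters whose keys mention 'embedding'."""
    return {k: v for k, v in param.items() if 'embedding' in k}

# In real use with a saved GEM model:
#   import paddle
#   param = paddle.load(param_path)
# Toy stand-in for the loaded parameter dict (key names are made up):
param = {
    'atom_embedding.weight': [[0.1, 0.2]],
    'bond_embedding.weight': [[0.3, 0.4]],
    'graph_pool.linear.weight': [[0.5]],
}

emb_param = extract_embedding_params(param)
print(sorted(emb_param.keys()))
# -> ['atom_embedding.weight', 'bond_embedding.weight']
```

Keeping the result as a dict (rather than a bare list of tensors) preserves the key names, which you need later if you want to load the embedding weights into a new model with `set_state_dict`.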
