
Unequal dimensions between new_accumulated_hidden and the MLP matrix in the classifier #29

Open
pengluyaoyao opened this issue Oct 27, 2020 · 2 comments

Comments

@pengluyaoyao

When I run PPLM with both bow and discrim enabled ('technology' and 'sentiment', respectively), new_accumulated_hidden.shape[1] is 768 but the emb_size in the MLP is 1024. The dimensions are incompatible in the matmul inside pplm_classification_head, so when the loss for the perturbed text is computed I get:

RuntimeError: size mismatch, m1: [1 x 768], m2: [1024 x 5]

Please correct me if I missed something. Thank you very much for your help.

@dathath
Contributor

dathath commented Nov 29, 2020

I suspect this might have to do with using a GPT-2 model of the wrong size....
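The mismatch can be reproduced outside of PPLM with a minimal PyTorch sketch (this is not PPLM's actual code): the classifier head here is a single Linear layer whose input width is 1024, matching a discriminator trained on GPT-2 medium's 1024-dim hidden states, while the hidden state being fed in comes from the small GPT-2 model, whose hidden size is 768.

```python
import torch

# GPT-2 hidden sizes: gpt2 (small) = 768, gpt2-medium = 1024.
# Hypothetical stand-in for the classifier head: a Linear layer with
# input width 1024 (emb_size) and 5 output classes.
mlp = torch.nn.Linear(1024, 5)

# Hidden state shaped like the small GPT-2 model's output (width 768):
new_accumulated_hidden = torch.randn(1, 768)

try:
    mlp(new_accumulated_hidden)  # 768 != 1024 -> shape error, as in the issue
except RuntimeError as e:
    print("RuntimeError:", e)

# Feeding activations of the matching width (1024) succeeds:
logits = mlp(torch.randn(1, 1024))
print(logits.shape)  # torch.Size([1, 5])
```

So the fix is to make the two widths agree: either run PPLM with the GPT-2 variant the discriminator was trained on, or retrain the discriminator against the hidden size of the model actually being used.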

@pengluyaoyao
Author

So I may need to train a sentiment classifier with a shape compatible with the GPT-2 model I am using. Thanks for your reply!
