
Questions regarding dimensions of tokens #1

Open
WeiLi9811 opened this issue Jul 22, 2021 · 0 comments

@WeiLi9811

Hi, Jihye!
Thanks so much for open-sourcing your code!
But I keep hitting the same error after running the first two cells. Here's my problem: after running
import torch
from torchtext.legacy import data

TEXT = data.Field(tokenize = 'spacy',
                  tokenizer_language = 'en_core_web_sm')

LABEL = data.LabelField()

TEXT.build_vocab(train_data,
                 vectors = "glove.6B.100d",
                 unk_init = torch.Tensor.normal_)

LABEL.build_vocab(train_data)

I got the error below:
RuntimeError: Vector for token b'u-21s' has 96 dimensions, but previously read vectors have 100 dimensions. All vectors must have the same number of dimensions.

Is there any possible solution for this problem?
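For context: an error like this usually means the GloVe text file on disk has rows with differing column counts, which most often comes from a truncated or corrupted download in the `.vector_cache` directory; deleting that cache and letting torchtext re-download the vectors is a common fix. Below is a minimal sketch (not part of the original code; the file path is hypothetical) that scans a word-vector text file and reports rows whose dimension count differs from the first row, so you can confirm whether the cached file is damaged:

```python
def check_vector_dims(path):
    """Scan a word-vector text file (one token + floats per line) and
    report lines whose dimension count differs from the first line,
    which usually indicates a corrupted or truncated download."""
    expected = None
    bad = []
    # Read as bytes: some GloVe tokens (like b'u-21s' above) may not be
    # valid UTF-8, and torchtext itself reports them as byte strings.
    with open(path, "rb") as f:
        for lineno, line in enumerate(f, 1):
            parts = line.rstrip().split(b" ")
            dims = len(parts) - 1  # first field is the token itself
            if expected is None:
                expected = dims
            elif dims != expected:
                bad.append((lineno, parts[0], dims))
    return expected, bad
```

If this reports any mismatched rows for `glove.6B.100d.txt` in your `.vector_cache`, deleting the cached file and the accompanying `.pt` file and re-running `TEXT.build_vocab(...)` should trigger a fresh download.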
