
Question about using clue/roberta_chinese_pair_tiny in transformers #23

Open
zy614582280 opened this issue Nov 30, 2021 · 1 comment

Comments

@zy614582280

I'm using roberta_chinese_pair_tiny through transformers, and I get the following warnings:

  1. You are using a model of type roberta to instantiate a model of type bert. This is not supported for all configurations of models and can yield errors.
  2. Some weights of the model checkpoint at clue/roberta_chinese_pair_tiny were not used when initializing BertModel: ['cls.seq_relationship.weight', 'cls.predictions.transform.LayerNorm.bias', 'cls.predictions.bias', 'cls.predictions.transform.dense.bias', 'cls.seq_relationship.bias', 'cls.predictions.decoder.bias', 'cls.predictions.decoder.weight', 'cls.predictions.transform.dense.weight', 'cls.predictions.transform.LayerNorm.weight']

My code is as follows:

from transformers import BertModel

# Load the CLUE RoBERTa-tiny checkpoint with the BERT model class
bert = BertModel.from_pretrained("clue/roberta_chinese_pair_tiny")

@brightmart
Member

I think these warnings can be treated as informational only. In any case, the parameters of the final (task-specific) layer are something you need to train yourself.
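
If it helps, here is a minimal fine-tuning sketch (my own example, not from this repo) illustrating that point: the pre-training head weights listed in the warning (cls.predictions.*, cls.seq_relationship.*) are simply dropped, the encoder weights load normally, and the downstream classification head is freshly initialized and has to be trained by you. It assumes the checkpoint ships a standard vocab.txt so BertTokenizer works, and num_labels=2 is just a placeholder for your task.

from transformers import BertTokenizer, BertForSequenceClassification

# Encoder weights come from the checkpoint; the classification head is new.
tokenizer = BertTokenizer.from_pretrained("clue/roberta_chinese_pair_tiny")
model = BertForSequenceClassification.from_pretrained(
    "clue/roberta_chinese_pair_tiny",
    num_labels=2,  # placeholder: set to your task's label count
)

# Quick forward pass on a sentence pair; logits come from the untrained head.
inputs = tokenizer("今天天气不错", "今天天气很好", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # torch.Size([1, 2])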
