
The performance on CoNLL03 is 77% when I changed the epochs to 15. #3

Open
manliu1225 opened this issue Jul 20, 2020 · 1 comment

@manliu1225

Hi, here is an issue: if I use the original parameters, the performance is quite low. Even after I increased the epochs to 15, the F1 on CoNLL03 is only 77%.

@DanqingZ

DanqingZ commented Aug 6, 2020

I found https://github.com/liuyukid/transformers-ner/blob/master/models/bert_ner.py#L110-L111

There, the author changes the labels marked -100 to 0 and uses the attention mask as the mask for the CRF. However, this keeps the sub-tokens that are not the first sub-token of a word, and all of those sub-tokens now carry label 0. I think this is noise.
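To illustrate the point above, here is a minimal sketch (plain Python, hypothetical `crf_masks` helper and toy label ids, not the repo's actual code). After WordPiece tokenization, only the first sub-token of each word gets a real label; continuation sub-tokens are set to -100 so the loss ignores them. If the CRF mask is built from the attention mask, those continuation sub-tokens stay in and, once their -100 labels are overwritten with 0, look like extra label-0 tokens; masking on `label != -100` instead keeps only the first sub-tokens.

```python
IGNORE_INDEX = -100  # label used for sub-tokens that should be ignored

def crf_masks(label_ids, attention_mask):
    """Return (attention-based mask, label-based mask) for a CRF layer."""
    # Attention-based mask: keeps every real sub-token, including the
    # continuation sub-tokens whose labels were overwritten with 0 (noise).
    attn_based = [m == 1 for m in attention_mask]
    # Label-based mask: keeps only positions that carry a real label,
    # i.e. the first sub-token of each word.
    label_based = [l != IGNORE_INDEX for l in label_ids]
    return attn_based, label_based

# Toy example: "Washington is" -> ["Wash", "##ing", "##ton", "is"];
# only "Wash" carries B-LOC (1), "is" carries O (0).
labels = [1, IGNORE_INDEX, IGNORE_INDEX, 0]
attn = [1, 1, 1, 1]

attn_mask, lbl_mask = crf_masks(labels, attn)
print(attn_mask)  # [True, True, True, True]   -- feeds the noise to the CRF
print(lbl_mask)   # [True, False, False, True] -- first sub-tokens only
```

With the label-based mask, the two continuation sub-tokens drop out of the CRF transition scoring instead of appearing as spurious label-0 observations.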
