EMNLP-2019-SciBERT: A Pretrained Language Model for Scientific Text #361

Summary:

A study applying BERT pretraining to the scientific-publication domain.

Resource:

Paper information:

Notes:

Model Graph:

Result:

Thoughts:

If you want to apply BERT to a specific domain, this paper is basically the template to follow (see the loading sketch below).
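
Not from the paper's own code, but as a quick reference for the "template" point above: a minimal sketch of loading the released SciBERT checkpoint (`allenai/scibert_scivocab_uncased`) through Hugging Face Transformers and using it as a drop-in BERT encoder. The example sentence and the [CLS]-pooling choice are purely illustrative.

```python
# Minimal sketch: use SciBERT as a drop-in BERT encoder via Hugging Face Transformers.
# "allenai/scibert_scivocab_uncased" is the published checkpoint; the rest is illustrative.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "allenai/scibert_scivocab_uncased"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)
model.eval()

sentence = "The glomerular filtration rate was measured in all patients."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# outputs.last_hidden_state: (batch, seq_len, 768) contextual token embeddings.
# Take the [CLS] vector as a simple sentence representation.
cls_embedding = outputs.last_hidden_state[:, 0, :]
print(cls_embedding.shape)  # torch.Size([1, 768])
```

For domain adaptation on top of this, the usual route is to fine-tune the encoder on the downstream task (NER, classification, etc.), exactly as one would with vanilla BERT.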

Next Reading:
