
ICLR-2020-StructBERT: Incorporating Language Structures into Pre-training for Deep Language Understanding #360

Summary:

The paper adds two auxiliary objectives to BERT's pre-training: a word-level objective that learns word order within a sentence, and a sentence-level objective that learns the ordering relationship between different sentences.
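For context, below is a minimal sketch of how training examples for the two structural objectives might be constructed. The function names, the `shuffle_ratio` parameter, and the data layout are illustrative assumptions, not the paper's code; the paper shuffles short spans (trigrams, K = 3) of unmasked tokens and predicts their original order, and uses a 3-way sentence-pair label (next / previous / random other document).

```python
import random

def word_structural_example(tokens, k=3, shuffle_ratio=0.05):
    """Word structural objective (sketch): shuffle random K-token
    spans and keep the original spans as reconstruction targets.
    shuffle_ratio is an illustrative parameter."""
    tokens = list(tokens)
    targets = []
    n_spans = max(1, int(len(tokens) * shuffle_ratio))
    for _ in range(n_spans):
        if len(tokens) < k:
            break
        start = random.randrange(len(tokens) - k + 1)
        original = tokens[start:start + k]
        shuffled = original[:]
        random.shuffle(shuffled)
        tokens[start:start + k] = shuffled
        targets.append((start, original))  # model must restore this order
    return tokens, targets

def sentence_structural_example(doc, corpus):
    """Sentence structural objective (sketch): pair a sentence with
    (0) the next sentence, (1) the previous sentence, or (2) a
    sentence from a randomly chosen other document.
    Assumes doc has at least 3 sentences."""
    i = random.randrange(1, len(doc) - 1)  # needs a neighbor on each side
    label = random.randrange(3)
    if label == 0:
        s2 = doc[i + 1]
    elif label == 1:
        s2 = doc[i - 1]
    else:
        s2 = random.choice(random.choice(corpus))
    return doc[i], s2, label

# Illustrative usage
tokens = "the quick brown fox jumps over the lazy dog".split()
shuffled, targets = word_structural_example(tokens, shuffle_ratio=0.2)
```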

Resource:

  • pdf
  • code
  • paper-with-code

Paper information:

  • Author:
  • Dataset:
  • Keywords:

Notes:

Model Graph:

Result:

Thoughts:

By design, this model's contribution to similarity tasks is likely to be small.

Next Reading:
