
Why is the Segment Embedding number only 3? #106

Open
UTimeStrange opened this issue Nov 21, 2023 · 1 comment

Comments

UTimeStrange commented Nov 21, 2023

import torch.nn as nn

class SegmentEmbedding(nn.Embedding):
    def __init__(self, embed_size=512):
        super().__init__(3, embed_size, padding_idx=0)

This is the source code. The first index is reserved for padding, so only 2 segments are supported. Why does BERT support only 2 segments?

@songyandong

Because two sentences are fed in at once, we need to distinguish which tokens belong to the first sentence and which belong to the second; adding padding gives exactly three indices.
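
For reference, here is a minimal sketch of how such a segment label tensor could be built and passed through the embedding, assuming the common convention that 1 marks sentence A tokens, 2 marks sentence B tokens, and 0 marks padding (the variable names and lengths below are illustrative, not from this repo):

import torch
import torch.nn as nn

class SegmentEmbedding(nn.Embedding):
    def __init__(self, embed_size=512):
        # 3 indices: 0 = padding, 1 = sentence A, 2 = sentence B
        super().__init__(3, embed_size, padding_idx=0)

# Hypothetical sentence pair of lengths 4 and 3, padded to length 10.
len_a, len_b, seq_len = 4, 3, 10
segment_label = torch.tensor(
    [1] * len_a + [2] * len_b + [0] * (seq_len - len_a - len_b)
).unsqueeze(0)                      # shape: (batch=1, seq_len)

embedding = SegmentEmbedding(embed_size=512)
out = embedding(segment_label)      # shape: (1, 10, 512)
# The row for index 0 (padding) is initialized to zeros and receives no
# gradient, so padded positions contribute no segment information.
print(out.shape)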
