TransformerEncoderLayer #36
Hi,
thanks! What is the meaning of [attn:1:32:4, dynamic:default:32:4]? Could you show some details about this list?
As I mentioned in my last reply, you can find more details on these two params in the get_layer method of the TransformerEncoderLayer module: lite-transformer/fairseq/models/transformer_multibranch_v2.py, lines 617 to 645 at de9631c.
You can find more details about the MultiheadAttention module in lite-transformer/fairseq/modules/multihead_attention.py, lines 15 to 27 at de9631c.
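For reference, here is a minimal sketch of how a branch-type string such as attn:1:32:4 could be decoded. The field order (layer type, kernel size, embed dim, num heads) is my reading of the examples in this thread, and parse_branch_type is a hypothetical helper, not part of lite-transformer; check both against get_layer in transformer_multibranch_v2.py:

def parse_branch_type(spec):
    # Splits a spec like "attn:1:32:4" or "dynamic:default:32:4"
    # into its four colon-separated fields.
    layer_type, kernel, embed_dim, num_heads = spec.split(':')
    return {
        'layer_type': layer_type,   # "attn", "lightweight", "dynamic", ...
        'kernel_size': kernel,      # "default" presumably falls back to the model's kernel-size list
        'embed_dim': int(embed_dim),
        'num_heads': int(num_heads),
    }

print(parse_branch_type('attn:1:32:4'))
print(parse_branch_type('dynamic:default:32:4'))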
Thanks a lot! The line for layer_type in args.encoder_branch_type: appears in the encoder layer class. As you said, args.encoder_branch_type == [attn:1:160:4, lightweight:default:160:4], but this leads to some errors. How should I understand it?
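One possible cause of the error, assuming the value was assigned as a Python literal: each branch spec is a plain string, so bare tokens like [attn:1:160:4, ...] are not valid Python. A minimal sketch of what the parsed arguments would look like:

from argparse import Namespace

# Illustrative only: the specs must be quoted strings; writing the tokens
# unquoted would raise a SyntaxError before training even starts.
args = Namespace(encoder_branch_type=['attn:1:160:4', 'lightweight:default:160:4'])

for layer_type in args.encoder_branch_type:
    print(layer_type.split(':'))   # ['attn', '1', '160', '4'], then ['lightweight', 'default', '160', '4']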
Hello, in the file transformer_multibranch_v2.py, the TransformerEncoderLayer class contains the following code:

if args.encoder_branch_type is None:  # default=None?
    self.self_attn = MultiheadAttention(
        self.embed_dim, args.encoder_attention_heads,
        dropout=args.attention_dropout, self_attention=True,
    )
else:
    layers = []
    embed_dims = []
    heads = []
    num_types = len(args.encoder_branch_type)

I just wonder: does reaching the else branch mean that args.encoder_branch_type equals True?
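For reference, the guard in the quoted code is an identity check against None, not a truthiness test, so any non-None value takes the else branch. A minimal illustration:

# The quoted code tests "is None", not bool(...).
branch_type = None
print(branch_type is None)   # True  -> plain MultiheadAttention path

branch_type = ['attn:1:32:4', 'dynamic:default:32:4']
print(branch_type is None)   # False -> multi-branch path is taken
print(bool(branch_type))     # True, but that is not what the code checks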