mlxu995/multihead-LDSA
multi-head LDSA (Local Dense Synthesizer Attention)

References

[1] Tay, Yi, et al. "Synthesizer: Rethinking Self-Attention in Transformer Models." arXiv preprint arXiv:2005.00743 (2020).

[2] Xu, Menglong, Shengqiang Li, and Xiao-Lei Zhang. "Transformer-based End-to-End Speech Recognition with Local Dense Synthesizer Attention." Proc. ICASSP 2021, pp. 5899-5903. DOI: 10.1109/ICASSP39728.2021.9414353.
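Since the repository ships without a description, here is a minimal single-head NumPy sketch of the idea behind the references: dense synthesizer attention [1] predicts attention logits directly from each token with a small MLP (no query-key dot products), and the local variant [2] restricts those logits to a window of neighboring positions. All function names, weight shapes, and the `context` window size below are illustrative assumptions, not the repository's actual API.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def local_dense_synthesizer_attention(X, W1, b1, W2, b2, Wv, context=3):
    """Single-head LDSA sketch (all shapes illustrative).

    X:  (T, d) input sequence.
    W1: (d, h), W2: (h, context) -- two-layer MLP that synthesizes
        `context` attention logits per position from the token itself,
        as in dense Synthesizer attention [1].
    Wv: (d, d) value projection.
    The logits are placed on a local window of width `context` centred
    on each position, following the local variant of [2].
    """
    T, d = X.shape
    # Synthesize local attention logits directly from each token: (T, context).
    logits = np.tanh(X @ W1 + b1) @ W2 + b2
    half = context // 2
    # Scatter the local logits into a full (T, T) score matrix, masking
    # out-of-window positions with -inf so softmax assigns them zero weight.
    scores = np.full((T, T), -np.inf)
    for t in range(T):
        for k in range(context):
            j = t - half + k
            if 0 <= j < T:
                scores[t, j] = logits[t, k]
    A = softmax(scores, axis=-1)   # (T, T) row-stochastic attention weights
    return A @ (X @ Wv)            # (T, d) attended values
```

A multi-head version would run several such synthesizer heads in parallel on projected sub-spaces of `X` and concatenate their outputs, exactly as in standard multi-head attention.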
