keras-self-attention-layer

A simple implementation of a self-attention layer with the Frobenius norm penalty. The layer produces a flattened sentence embedding matrix for sentence representation learning tasks.

Based on the paper "A Structured Self-Attentive Sentence Embedding": https://arxiv.org/abs/1703.03130
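A minimal sketch of what such a layer can look like with tf.keras (TensorFlow 2.x). The layer name SelfAttention, the hyperparameters da, r, and penalty, and the surrounding toy model are illustrative assumptions, not the repository's actual API. Given encoder states H of shape (batch, n, d), the layer computes the attention matrix A = softmax(Ws2^T tanh(Ws1^T H^T)), adds the penalty ||A A^T - I||_F^2 via add_loss, and returns the flattened embedding matrix A H of shape (batch, r * d), as described in the paper.

```python
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers


class SelfAttention(layers.Layer):
    """Structured self-attention with a Frobenius norm penalty (Lin et al., 2017).

    Illustrative sketch, not the repository's actual code. Takes encoder
    hidden states H of shape (batch, n, d), computes an attention matrix A
    of shape (batch, r, n), and returns A @ H flattened to (batch, r * d).
    """

    def __init__(self, da=64, r=4, penalty=1.0, **kwargs):
        super().__init__(**kwargs)
        self.da = da            # hidden size of the attention MLP (d_a in the paper)
        self.r = r              # number of attention hops (rows of A)
        self.penalty = penalty  # weight of the Frobenius norm penalty

    def build(self, input_shape):
        self.d = int(input_shape[-1])
        self.Ws1 = self.add_weight(name="Ws1", shape=(self.d, self.da),
                                   initializer="glorot_uniform")
        self.Ws2 = self.add_weight(name="Ws2", shape=(self.da, self.r),
                                   initializer="glorot_uniform")

    def call(self, H):
        # A = softmax(Ws2^T tanh(Ws1^T H^T)), shape (batch, r, n)
        scores = tf.matmul(tf.tanh(tf.matmul(H, self.Ws1)), self.Ws2)  # (batch, n, r)
        A = tf.nn.softmax(tf.transpose(scores, perm=[0, 2, 1]), axis=-1)

        # Frobenius norm penalty ||A A^T - I||_F^2 encourages diverse attention hops
        AAt = tf.matmul(A, A, transpose_b=True)                         # (batch, r, r)
        I = tf.eye(self.r, batch_shape=[tf.shape(A)[0]])
        self.add_loss(self.penalty * tf.reduce_mean(
            tf.reduce_sum(tf.square(AAt - I), axis=[1, 2])))

        # M = A H, then flatten into a single sentence embedding vector
        M = tf.matmul(A, H)                                             # (batch, r, d)
        return tf.reshape(M, (tf.shape(M)[0], self.r * self.d))
```

A hypothetical usage example on top of a BiLSTM encoder, as in the paper's setup:

```python
inputs = keras.Input(shape=(50,), dtype="int32")
x = layers.Embedding(10000, 128)(inputs)
x = layers.Bidirectional(layers.LSTM(64, return_sequences=True))(x)  # (batch, 50, 128)
x = SelfAttention(da=64, r=4)(x)                                     # (batch, 4 * 128)
outputs = layers.Dense(1, activation="sigmoid")(x)
model = keras.Model(inputs, outputs)
```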
