iwasa-kosui/keras-easy-attention
Keras Attention

With this project, you can easily use a GRU-based attention model in Keras.

Example

Document classification example. The full training script is at https://github.com/KilledByNLP/keras-easy-attention/blob/master/train.py

```python
from keras.layers import Input, Dense, Embedding
from keras.models import Model

from models import AttentionGRU

N_MAX_WORDS = 100     # maximum sequence length (in tokens)
N_DICTIONARY = 10000  # vocabulary size
N_EMBED_DIM = 256     # embedding dimension
N_CLASSES = 3         # number of output classes

# Integer-encoded token sequences of length N_MAX_WORDS
x = Input((N_MAX_WORDS,))
e = Embedding(output_dim=N_EMBED_DIM,
              input_dim=N_DICTIONARY,
              input_length=N_MAX_WORDS,
              trainable=True)(x)
# Attention-weighted GRU encoding of the embedded sequence
g = AttentionGRU(input_shape=(None, N_MAX_WORDS, N_EMBED_DIM))(e)
o = Dense(N_CLASSES, activation='softmax')(g)
model = Model(inputs=x, outputs=o)
```
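The internals of the repository's `AttentionGRU` layer are not shown here. As a rough illustration of the kind of additive attention pooling such a layer typically computes over GRU hidden states, here is a minimal NumPy sketch; the function names, weight shapes, and parameters below are hypothetical, not the library's API.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(h, w, v):
    """Additive attention pooling over a sequence of hidden states.

    h: (timesteps, hidden_dim) GRU hidden state at each timestep
    w: (hidden_dim, attn_dim) projection weights (hypothetical)
    v: (attn_dim,) scoring vector (hypothetical)
    Returns a (hidden_dim,) context vector: a weighted sum of the
    hidden states, with weights given by normalized attention scores.
    """
    scores = np.tanh(h @ w) @ v   # (timesteps,) unnormalized scores
    alpha = softmax(scores)       # attention weights, sum to 1
    return alpha @ h              # weighted sum of hidden states

rng = np.random.default_rng(0)
h = rng.standard_normal((100, 128))  # e.g. 100 timesteps, 128 GRU units
w = rng.standard_normal((128, 64))
v = rng.standard_normal(64)
context = attention_pool(h, w, v)    # context.shape == (128,)
```

In a trained layer, `w` and `v` are learned parameters, so the model learns which timesteps to emphasize when producing the fixed-size representation fed to the classifier.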
