
How to use ELMo in Keras version? #227

Open
LucasJau opened this issue Dec 31, 2019 · 1 comment

Comments

LucasJau commented Dec 31, 2019

For some reason, I have to use ELMo on the Keras platform. When I try to write a Keras layer for ELMo, I notice that when the input to `BidirectionalLanguageModel` is a `keras.layers.Input`, the whole program gets stuck, but if the input is a `tf.placeholder`, the code runs through successfully; however, a Keras model doesn't allow a non-`Input` layer as its input. How can I fix this? Please help me.

The layer is implemented like this:

```python
from keras import backend as K
from keras.layers import Layer
from bilm import BidirectionalLanguageModel, weight_layers


class ElmoEmbeddingLayer(Layer):

    def __init__(self, config, **kwargs):
        self.dimensions = 200
        self.options_file = config.options_file
        self.weights_file = config.weights_file
        self.token_embedding_file = config.token_embedding_file
        # Builds the bilm ops from the pretrained options/weights files.
        self.bilm = BidirectionalLanguageModel(
            self.options_file,
            self.weights_file,
            use_character_inputs=False,
            embedding_weight_file=self.token_embedding_file,
            max_batch_size=1024
        )
        super(ElmoEmbeddingLayer, self).__init__(**kwargs)

    def build(self, input_shape):
        super(ElmoEmbeddingLayer, self).build(input_shape)

    def call(self, x, mask=None):
        # bilm(x) returns ops for all biLM layers; weight_layers collapses
        # them into one weighted representation ('weighted_op').
        context_embeddings_op = self.bilm(x)
        elmo_embedding = weight_layers('elmo_output', context_embeddings_op, l2_coef=0.0)
        return elmo_embedding['weighted_op']

    def compute_mask(self, inputs, mask=None):
        # Mask out padding tokens (id 0).
        return K.not_equal(inputs, 0)

    def compute_output_shape(self, input_shape):
        return (input_shape[0], input_shape[1], self.dimensions)
```

And when I use it, I just do:

```python
elmo_model = ElmoEmbeddingLayer(self.data_config)

tmp = tf.placeholder(tf.int32, shape=(None, None))
char_ids = Input(batch_shape=(None, None), dtype='int32', name='input_ids')

elmo_embeddings = elmo_model(char_ids)
lstm_output_1 = Bidirectional(LSTM(units=_char_lstm_size, return_sequences=True))(elmo_embeddings)
```
By using "char_ids", it will be stuck; if I use "tmp", the model can't be formed for regarding non-Input layers as the input of model.

pinesnow72 commented

Did you solve this issue? I am also trying to figure out how to make a Keras layer for ELMo, but I am stuck like you.
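
One direction I have seen suggested, if switching from bilm-tf to the TF-Hub ELMo module is acceptable: wrap the hub module in a `Lambda` layer so everything stays a regular Keras graph. A rough sketch, assuming TF 1.x with `tensorflow_hub` installed (the module takes raw sentence strings and tokenizes internally):

```python
import tensorflow as tf
import tensorflow_hub as hub
from keras import backend as K
from keras.layers import Input, Lambda
from keras.models import Model

sess = tf.Session()
K.set_session(sess)

# TF-Hub ELMo module; its "default" signature takes a 1-D string tensor
# of whole sentences.
elmo_module = hub.Module("https://tfhub.dev/google/elmo/2", trainable=False)
sess.run(tf.global_variables_initializer())
sess.run(tf.tables_initializer())

def elmo_fn(x):
    # "elmo" is the weighted sum of the three biLM layers,
    # shape (batch, time, 1024).
    return elmo_module(tf.squeeze(tf.cast(x, tf.string), axis=1),
                       signature="default", as_dict=True)["elmo"]

text_input = Input(shape=(1,), dtype="string")
elmo_embeddings = Lambda(elmo_fn, output_shape=(None, 1024))(text_input)
model = Model(inputs=text_input, outputs=elmo_embeddings)
```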
