
Building custom model over the final embedding layer

See original GitHub issue

BERT generates 768-dimensional embeddings for each token. I am trying to build a multi-class classification model on top of this. My assumption is that the output of the Encoder-12-FeedForward-Norm layer, of shape (None, seq_length, 768), would give these embeddings. This is what I am trying:

from keras_bert import load_trained_model_from_checkpoint
from keras.layers import Bidirectional, LSTM, GlobalMaxPool1D, Dense, Dropout
from keras.models import Model

model = load_trained_model_from_checkpoint(config_path, checkpoint_path, training=True, seq_len=seq_len)

# layers[-9] is assumed to be Encoder-12-FeedForward-Norm, shape (None, seq_len, 768)
new_out = Bidirectional(LSTM(50, return_sequences=True,
                             dropout=0.1,
                             recurrent_dropout=0.1))(model.layers[-9].output)
new_out = GlobalMaxPool1D()(new_out)
new_out = Dense(50, activation='relu')(new_out)
new_out = Dropout(0.1)(new_out)
new_out = Dense(6, activation='sigmoid')(new_out)

# the first two inputs are the token ids and segment ids
newModel = Model(model.inputs[:2], new_out)

I get the following error for new_out = GlobalMaxPool1D()(new_out):

TypeError: Layer global_max_pooling1d_11 does not support masking, but was passed an input_mask: Tensor("Encoder-12-FeedForward-Add/All:0", shape=(?, 128), dtype=bool)

I am not sure how masking is involved if I am just using the output of the encoder.
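
Judging by the error message, the mask originates upstream: Encoder-12-FeedForward-Add produces an input_mask, and mask-aware layers such as the Bidirectional LSTM propagate it to whatever comes next. The same failure can be reproduced without BERT at all (a minimal sketch, Keras 2 API assumed, with a plain Masking layer standing in for the encoder):

from keras.layers import Input, Masking, LSTM, GlobalMaxPooling1D

inp = Input(shape=(128, 768))
x = Masking(mask_value=0.0)(inp)        # creates a (batch, 128) boolean mask
x = LSTM(50, return_sequences=True)(x)  # LSTM supports masking and passes it on
out = GlobalMaxPooling1D()(x)           # raises the same "does not support masking" TypeError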

The paper mentions that the output corresponding to just the first [CLS] token should be used for classification. On trying this:

new_out = Lambda(lambda x: x[:, 0, :])(model.layers[-9].output)  # keep only the [CLS] position

the model trains (although with poor results).
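
For reference, the complete [CLS]-based head might look like the sketch below (a hypothetical completion of the snippet above; cls_model and the binary_crossentropy loss are assumptions, the latter matching the six independent sigmoid outputs):

from keras.layers import Lambda, Dense, Dropout
from keras.models import Model

# Keep only the [CLS] position of the encoder output: (batch, 768).
# Slicing through Lambda drops the mask, so the layers below never see
# one; this is consistent with this variant training without the error.
cls_vec = Lambda(lambda x: x[:, 0, :])(model.layers[-9].output)
cls_vec = Dense(50, activation='relu')(cls_vec)
cls_vec = Dropout(0.1)(cls_vec)
cls_out = Dense(6, activation='sigmoid')(cls_vec)

cls_model = Model(model.inputs[:2], cls_out)
cls_model.compile(optimizer='adam', loss='binary_crossentropy')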

How can the pre-loaded model be used for classification?

Issue Analytics

  • State: closed
  • Created: 5 years ago
  • Comments: 6 (2 by maintainers)

Top GitHub Comments

3 reactions
CyberZHG commented, Mar 13, 2019

I forgot to return a None mask in MaskedGlobalMaxPool1D. I’ve fixed it and made a release.
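
In code terms, the fix being described is a one-method change (a sketch of the idea, not the exact commit):

def compute_mask(self, inputs, mask=None):
    # Pooling collapses the time axis, so no per-timestep mask survives;
    # returning None keeps downstream Dense/Dropout layers mask-free.
    return None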

2 reactions
CyberZHG commented, Mar 13, 2019

#7 Sentence Embedding

GlobalMaxPool1D doesn't support masking. The following is a modification that suits this case:

https://github.com/CyberZHG/keras-bert/blob/b7ecdc34637435a849695b1b1a4ebdc4da842832/keras_bert/layers/pooling.py#L5-L21
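
The linked file is the canonical implementation; roughly, such a layer declares that it supports masking, pushes masked timesteps to a very low value before taking the max, and stops the mask from propagating further (a sketch against the Keras 2 Layer API, not a verbatim copy of the file):

from keras import backend as K
from keras.layers import Layer

class MaskedGlobalMaxPool1D(Layer):
    """Global max pooling over time that tolerates an input mask."""

    def __init__(self, **kwargs):
        super(MaskedGlobalMaxPool1D, self).__init__(**kwargs)
        self.supports_masking = True  # accept the mask instead of raising

    def compute_mask(self, inputs, mask=None):
        return None  # the mask is consumed here, nothing flows downstream

    def compute_output_shape(self, input_shape):
        return input_shape[:-2] + (input_shape[-1],)

    def call(self, inputs, mask=None):
        if mask is not None:
            # Push masked positions far below any real activation so the
            # max is taken only over unmasked timesteps.
            mask = K.cast(mask, K.floatx())
            inputs -= K.expand_dims((1.0 - mask) * 1e6, axis=-1)
        return K.max(inputs, axis=-2)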

I’ve added a demo for sentence embedding with pooling:

https://github.com/CyberZHG/keras-bert/blob/02c7eb20c56b02ed74226d056354658c247ab52c/demo/load_model/load_and_pool.py#L20-L39
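
With such a layer in place, the model from the question needs only one substitution (a sketch, reusing model and the imports from the question plus the MaskedGlobalMaxPool1D above):

new_out = Bidirectional(LSTM(50, return_sequences=True,
                             dropout=0.1,
                             recurrent_dropout=0.1))(model.layers[-9].output)
new_out = MaskedGlobalMaxPool1D()(new_out)  # consumes the mask instead of raising
new_out = Dense(50, activation='relu')(new_out)
new_out = Dropout(0.1)(new_out)
new_out = Dense(6, activation='sigmoid')(new_out)
newModel = Model(model.inputs[:2], new_out)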

