Is there a way to put Elmo as a Keras layer and integrate it into a Keras model?
I used my own corpus to train the ELMo model provided here. I wonder if there is a way to put ELMo as a Keras layer and integrate it into a Keras model. If yes, could you please provide an example similar to usage_character.py? Thank you very much.
Issue Analytics
- State:
- Created 5 years ago
- Comments: 16
Top Results From Across the Web

Elmo Embeddings in Keras with TensorFlow hub
With a few fixes, it's easy to integrate a TensorFlow Hub model with Keras! ELMo embeddings, developed at Allen NLP, are one of...

Using ELMo with Keras, how to correctly input the training set ...
Using this ELMo, the embedding is integrated as a layer following the input layer, so the input layer actually takes strings.

How to Use ELMo Word Vectors for Spam Classification
... a tutorial on how to use TensorFlow Hub to get the ELMo word vectors module into Keras. This is an example of how...

ELMo Embeddings in Keras - Sujay S Kumar
In this blog post, I will be demonstrating how to use ELMo Embeddings in Keras. Pre-trained ELMo embeddings are freely available as a ...

Word embeddings | Text - TensorFlow
You will train your own word embeddings using a simple Keras model for a sentiment classification task, and then visualize them in the...
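Several of the results above describe the same pattern: load the TF-Hub ELMo module and wrap it in a Keras Lambda layer behind a string-typed input. A minimal sketch of that pattern, assuming the TF 1.x `hub.Module` API, the `tfhub.dev/google/elmo/2` URL, and network access (the layer sizes and helper names here are illustrative assumptions, not from the original thread):

```python
import numpy as np

def to_string_batch(sentences):
    # The string input layer expects a (batch, 1) array of raw text,
    # one sentence per row.
    return np.array(sentences, dtype=object)[:, None]

def make_elmo_model(elmo_url="https://tfhub.dev/google/elmo/2"):
    # TF imports are local so the sketch can be read without TF installed;
    # this follows the TF 1.x hub.Module API, not TF 2.x.
    import tensorflow as tf
    import tensorflow_hub as hub
    from tensorflow.keras import layers, Model

    elmo = hub.Module(elmo_url, trainable=False)

    def elmo_embedding(x):
        # x is a (batch, 1) tensor of strings; the "default" signature
        # returns a 1024-d mean-pooled sentence embedding per string.
        return elmo(tf.squeeze(tf.cast(x, tf.string), axis=1),
                    signature="default", as_dict=True)["default"]

    inp = layers.Input(shape=(1,), dtype="string")
    emb = layers.Lambda(elmo_embedding, output_shape=(1024,))(inp)
    out = layers.Dense(1, activation="sigmoid")(emb)
    return Model(inp, out)
```

The model would then be fed with `to_string_batch(["some sentence", ...])` rather than integer token ids, since tokenization happens inside the hub module.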
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
I added some examples here:
- https://github.com/UKPLab/elmo-bilstm-cnn-crf
- https://github.com/UKPLab/elmo-bilstm-cnn-crf/blob/master/Keras_ELMo_Tutorial.ipynb
There I used ELMo to train a Keras network (for sentence classification and sequence tagging).
In my first experiments I used the ELMo layer as a trainable embedding layer. However, this creates quite a large computational overhead, because the ELMo embeddings must be recomputed on every pass over the data. Computing the embeddings once and keeping them fixed gave a big speed-up at training time without reducing accuracy.
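The "compute the embeddings once and fix them" approach described above can be sketched as a simple caching step: run ELMo over the corpus a single time, stack the vectors, and train an ordinary Keras model on the cached arrays. Here `embed_fn` is a hypothetical stand-in for whatever ELMo call you use (e.g. a wrapped TF-Hub module); the batching and names are assumptions for illustration:

```python
import numpy as np

def cache_embeddings(sentences, embed_fn, batch_size=32):
    # embed_fn maps a list of strings to an (n, dim) array, e.g. a
    # TF-Hub ELMo call; batching keeps GPU/host memory bounded.
    chunks = []
    for i in range(0, len(sentences), batch_size):
        chunks.append(embed_fn(sentences[i:i + batch_size]))
    # Stack all batches into one (num_sentences, dim) array that can be
    # saved to disk and reused across training runs.
    return np.vstack(chunks)
```

A downstream classifier then trains directly on the cached array (e.g. a Dense head on the fixed vectors), so ELMo is never invoked again during training.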
Hi - just a small note that I’ve updated the example in https://github.com/strongio/keras-elmo/blob/master/Elmo Keras.ipynb to allow trainable weights. Hope that helps 😃