pretrained_word_embeddings.py: inconsistency between `set_weights(weights)` and the provided weights at layer "embedding_1"
I downloaded the raw code of pretrained_word_embeddings.py to try it out, but I can't run it: the provided weights are rejected by `set_weights`. Here is what I see; is there any way to fix this bug?
```
Using TensorFlow backend.
Indexing word vectors.
Found 400000 word vectors.
Processing text dataset
Found 19997 texts.
Found 214873 unique tokens.
Shape of data tensor: (19997, 1000)
Shape of label tensor: (19997, 20)
Preparing embedding matrix.
Training model.
Traceback (most recent call last):
  File "[the path]/pretrained_word_embeddings.py", line 126, in <module>
    embedded_sequences = embedding_layer(sequence_input)
  File "/usr/local/lib/python3.5/dist-packages/keras/engine/topology.py", line 543, in __call__
    self.build(input_shapes[0])
  File "/usr/local/lib/python3.5/dist-packages/keras/layers/embeddings.py", line 101, in build
    self.set_weights(self.initial_weights)
  File "/usr/local/lib/python3.5/dist-packages/keras/engine/topology.py", line 966, in set_weights
    str(weights)[:50] + '...')
ValueError: You called `set_weights(weights)` on layer "embedding_1" with a weight list of length 1, but the layer was expecting 0 weights. Provided weights: [array([[ 0. , 0. , 0. , ...
```
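For context, the example builds its Embedding layer by passing the pretrained matrix via `weights=` while also freezing the layer with `trainable=False`. Below is a minimal sketch of that pattern with stand-in values (the real script derives `num_words`, `embedding_matrix`, and the sequence length from the 20 Newsgroups texts and the GloVe vectors); under Keras 1.1.2/1.2.0 the error above appears later, when the layer is called on an input and built:

```python
import numpy as np
from keras.layers import Embedding

# Stand-in values; in the example these come from the tokenizer and GloVe vectors.
num_words = 20000
EMBEDDING_DIM = 100
MAX_SEQUENCE_LENGTH = 1000
embedding_matrix = np.zeros((num_words, EMBEDDING_DIM))

# Pretrained weights are passed via `weights=` while the layer is simultaneously
# frozen with `trainable=False`; this is the combination that triggers the
# ValueError when the layer is built in the affected Keras versions.
embedding_layer = Embedding(num_words,
                            EMBEDDING_DIM,
                            weights=[embedding_matrix],
                            input_length=MAX_SEQUENCE_LENGTH,
                            trainable=False)
```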
Removing `trainable=False` from the Embedding layer and adding `model.layers[1].trainable = False` before the model is compiled worked for me. It turns out my problem wasn't specific to 1.2.0 or 1.1.2: I was passing both `weights=` and `trainable=False` to an Embedding layer, and they are incompatible. A sketch of that workaround is shown below.