
Pretrained_word_embeddings.py: inconsistency in `set_weights(weights)` vs Provided weight at layer "embedding_1"

See original GitHub issue

I downloaded the raw code of pretrained_word_embeddings.py to try it out. I can't run it, because the provided weights don't match what the layer expects. Here is what I see; is there any way to fix this bug?

Using TensorFlow backend.
Indexing word vectors.
Found 400000 word vectors.
Processing text dataset
Found 19997 texts.
Found 214873 unique tokens.
Shape of data tensor: (19997, 1000)
Shape of label tensor: (19997, 20)
Preparing embedding matrix.
Training model.
Traceback (most recent call last):
  File "[the path]/pretrained_word_embeddings.py", line 126, in <module>
    embedded_sequences = embedding_layer(sequence_input)
  File "/usr/local/lib/python3.5/dist-packages/keras/engine/topology.py", line 543, in __call__
    self.build(input_shapes[0])
  File "/usr/local/lib/python3.5/dist-packages/keras/layers/embeddings.py", line 101, in build
    self.set_weights(self.initial_weights)
  File "/usr/local/lib/python3.5/dist-packages/keras/engine/topology.py", line 966, in set_weights
    str(weights)[:50] + '…')
ValueError: You called set_weights(weights) on layer "embedding_1" with a weight list of length 1, but the layer was expecting 0 weights. Provided weights: [array([[ 0. , 0. , 0. , …

Issue Analytics

  • State: closed
  • Created 7 years ago
  • Comments: 14

Top GitHub Comments

19 reactions
SiWorgan commented, Jan 3, 2017

Removing trainable=False from the Embedding layer's constructor, and instead adding model.layers[1].trainable = False before the model is compiled, worked for me.
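A version-robust sketch of that workaround (toy sizes; nothing below comes from the original script, and the layer index in your own model may differ): create the Embedding layer without trainable=False, load the pretrained matrix into it, and only then freeze it. On the affected Keras 1.x releases, the same idea was expressed as weights=[embedding_matrix] in the constructor plus model.layers[1].trainable = False before compile.

```python
import numpy as np
from keras.layers import Embedding

# Toy stand-ins for the GloVe matrix used by the original script.
num_words, embedding_dim = 100, 8
embedding_matrix = np.random.rand(num_words, embedding_dim).astype("float32")

# 1. Build the layer WITHOUT trainable=False in the constructor.
layer = Embedding(num_words, embedding_dim)
layer.build((None,))                   # instantiate the weight variable
layer.set_weights([embedding_matrix])  # load the pretrained vectors

# 2. Freeze the layer only after the weights are in place
#    (in a full model: model.layers[i].trainable = False before compile).
layer.trainable = False
```

Freezing after construction sidesteps the ordering problem entirely: the weight variable exists by the time the pretrained matrix is assigned.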

2 reactions
danielhers commented, Dec 30, 2016

Turns out my problem wasn’t specifically with 1.2.0 or 1.1.2. I was passing both weights= and trainable=False to an Embedding layer, and they are incompatible.
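The mismatch is easy to see in miniature. The class below is a hypothetical, much-simplified imitation of the old set_weights() length check, not actual Keras code: with trainable=False, the buggy versions registered zero weight variables on the layer, so the single pretrained matrix passed via weights= had nothing to be matched against.

```python
class ToyEmbedding:
    """Hypothetical simplification of the buggy Keras 1.x behaviour."""

    def __init__(self, trainable=True):
        # With trainable=False, the layer ended up registering no weight
        # variables at all, instead of one non-trainable variable.
        self._weight_vars = ["embedding_matrix"] if trainable else []

    def set_weights(self, weights):
        # Mirrors the length check that raised the ValueError in the traceback.
        if len(weights) != len(self._weight_vars):
            raise ValueError(
                f"You called set_weights(weights) on the layer with a weight "
                f"list of length {len(weights)}, but the layer was expecting "
                f"{len(self._weight_vars)} weights."
            )
        return True

# Works: one registered variable, one provided matrix.
ToyEmbedding(trainable=True).set_weights([[0.0, 0.1]])

# Fails like the traceback above: 1 provided, 0 expected.
try:
    ToyEmbedding(trainable=False).set_weights([[0.0, 0.1]])
except ValueError as err:
    print(err)
```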


Top Results From Across the Web

Developers - Pretrained_word_embeddings.py: inconsistency in ...
I have downloaded the the raw code on pretrained_word_embeddings.py to try out. I can't run it since the provided weight is not the...
Keras embedding layer set_weights() error on google colab
ValueError : You called set_weights(weights) on layer "embedding" with a weight list of length 500, but the layer was expecting 1 weights.
Layer weight initializers - Keras
Initializers define the way to set the initial random weights of Keras layers. ... Mean of the random values to generate. stddev: a...
Transfer learning and fine-tuning | TensorFlow Core
Freezing layers: understanding the trainable attribute. Layers & models have three weight attributes: weights is the list of all weights ...
Weight labels and features—ArcGIS Pro | Documentation
Label weights and feature weights are set on the Weight Ranking dialog box. ... being placed on a given layer of point or...
