
setWeights fails with Layer weight shape not compatible error

See original GitHub issue

TensorFlow.js version

1.2.2

Node version

v11.13.0

Describe the problem or feature request

Attempting to copy weights from one network to another fails with the following error:

(node:16764) UnhandledPromiseRejectionWarning: Error: Layer weight shape 128 not compatible with provided weight shape 3,3,128,256

Code to reproduce the bug / link to feature request

Encountered this error while attempting to run the example snake-dqn app. Error is thrown from the following method in dqn.js:

export function copyWeights(destNetwork, srcNetwork) {
  destNetwork.setWeights(srcNetwork.getWeights());
}

Traced the error to tfjs-layers/src/engine/topology.js in the setWeights method.

It appears that the list of weights returned by getWeights() is in a different order than the list in this.weights:

weights contains:

  0 -> 3,3,2,128     conv2d_Conv2D1/kernel
  1 -> 128           conv2d_Conv2D1/bias
  2 -> 128           batch_normalization_BatchNormalization1/gamma
  3 -> 128           batch_normalization_BatchNormalization1/beta
  4 -> 3,3,128,256   conv2d_Conv2D2/kernel
  5 -> 256           conv2d_Conv2D2/bias
  6 -> 256           batch_normalization_BatchNormalization2/gamma
  7 -> 256           batch_normalization_BatchNormalization2/beta
  8 -> 3,3,256,256   conv2d_Conv2D3/kernel
  9 -> 256           conv2d_Conv2D3/bias
 10 -> 2304,100      dense_Dense1/kernel
 11 -> 100           dense_Dense1/bias
 12 -> 100,3         dense_Dense2/kernel
 13 -> 3             dense_Dense2/bias
 14 -> 128           batch_normalization_BatchNormalization1/moving_mean
 15 -> 128           batch_normalization_BatchNormalization1/moving_variance
 16 -> 256           batch_normalization_BatchNormalization2/moving_mean
 17 -> 256           batch_normalization_BatchNormalization2/moving_variance

paramValues contains:

  0 -> 3,3,2,128     conv2d_Conv2D4/kernel
  1 -> 128           conv2d_Conv2D4/bias
  2 -> 128           batch_normalization_BatchNormalization3/gamma
  3 -> 128           batch_normalization_BatchNormalization3/beta
  4 -> 128           batch_normalization_BatchNormalization3/moving_mean
  5 -> 128           batch_normalization_BatchNormalization3/moving_variance
  6 -> 3,3,128,256   conv2d_Conv2D5/kernel
  7 -> 256           conv2d_Conv2D5/bias
  8 -> 256           batch_normalization_BatchNormalization4/gamma
  9 -> 256           batch_normalization_BatchNormalization4/beta
 10 -> 256           batch_normalization_BatchNormalization4/moving_mean
 11 -> 256           batch_normalization_BatchNormalization4/moving_variance
 12 -> 3,3,256,256   conv2d_Conv2D6/kernel
 13 -> 256           conv2d_Conv2D6/bias
 14 -> 2304,100      dense_Dense3/kernel
 15 -> 100           dense_Dense3/bias
 16 -> 100,3         dense_Dense4/kernel
 17 -> 3             dense_Dense4/bias
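Given the two orderings above, one conceivable workaround is to reorder the source weights to match the destination's expected order before calling setWeights. The sketch below is a plain-JavaScript illustration (no tfjs dependency); the normalizeName/reorderWeights helpers are hypothetical, not part of the TensorFlow.js API, and the data is an abbreviated version of the dumps above.

```javascript
// Normalize a weight name by stripping the per-model layer counter, so
// "conv2d_Conv2D1/kernel" and "conv2d_Conv2D4/kernel" compare equal.
function normalizeName(name) {
  const [layer, param] = name.split('/');
  return layer.replace(/\d+$/, '') + '/' + param;
}

// Reorder source weights into the destination model's expected order.
// Weights sharing a normalized name are consumed FIFO, which preserves
// their relative (layer) order.
function reorderWeights(srcWeights, dstNames) {
  const queues = new Map();
  for (const w of srcWeights) {
    const key = normalizeName(w.name);
    if (!queues.has(key)) queues.set(key, []);
    queues.get(key).push(w);
  }
  return dstNames.map(name => queues.get(normalizeName(name)).shift());
}

// Abbreviated data from the report: the source (trainable model) lists the
// batch-norm moving stats at the end, while the destination expects them
// right after gamma/beta.
const src = [
  {name: 'conv2d_Conv2D1/kernel', shape: [3, 3, 2, 128]},
  {name: 'conv2d_Conv2D1/bias', shape: [128]},
  {name: 'batch_normalization_BatchNormalization1/gamma', shape: [128]},
  {name: 'batch_normalization_BatchNormalization1/beta', shape: [128]},
  {name: 'conv2d_Conv2D2/kernel', shape: [3, 3, 128, 256]},
  {name: 'conv2d_Conv2D2/bias', shape: [256]},
  {name: 'batch_normalization_BatchNormalization1/moving_mean', shape: [128]},
  {name: 'batch_normalization_BatchNormalization1/moving_variance', shape: [128]},
];
const dstNames = [
  'conv2d_Conv2D4/kernel',
  'conv2d_Conv2D4/bias',
  'batch_normalization_BatchNormalization3/gamma',
  'batch_normalization_BatchNormalization3/beta',
  'batch_normalization_BatchNormalization3/moving_mean',
  'batch_normalization_BatchNormalization3/moving_variance',
  'conv2d_Conv2D5/kernel',
  'conv2d_Conv2D5/bias',
];
```

After reordering, position 4 holds the 128-element moving_mean instead of the 3,3,128,256 kernel — exactly the pair that collides in the reported error.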

Issue Analytics

  • State: closed
  • Created 4 years ago
  • Comments: 5

Top GitHub Comments

1 reaction
jkovacs-dev commented, Aug 12, 2019

I agree, but I’m not the maintainer. After some debugging, I traced it down to the fact that models which are marked as untrainable return their weights in a different order than trainable models. You can fix the example code by removing the call which marks the model as untrainable.
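The behavior described above (frozen models returning weights in a different order than trainable ones) can be mocked without tfjs. This is only an illustration of the claimed ordering rule, not the actual tfjs-layers code; the layer and weight names are made up.

```javascript
// Mock of the ordering rule: a trainable model returns its trainable
// weights first and appends the non-trainable ones (e.g. batch-norm
// moving stats); a fully frozen model returns everything in plain
// layer order, since no weight is trainable.
function collectWeights(layers, modelTrainable) {
  if (modelTrainable) {
    const trainable = [];
    const nonTrainable = [];
    for (const layer of layers) {
      for (const w of layer.weights) {
        (w.trainable ? trainable : nonTrainable).push(w);
      }
    }
    return trainable.concat(nonTrainable);
  }
  return layers.flatMap(layer => layer.weights);
}

// Conv -> BatchNorm -> Conv, with the batch-norm moving stats marked
// non-trainable, as in the snake-dqn network.
const mockLayers = [
  {weights: [
    {name: 'conv1/kernel', trainable: true},
    {name: 'conv1/bias', trainable: true},
  ]},
  {weights: [
    {name: 'bn/gamma', trainable: true},
    {name: 'bn/beta', trainable: true},
    {name: 'bn/moving_mean', trainable: false},
    {name: 'bn/moving_variance', trainable: false},
  ]},
  {weights: [
    {name: 'conv2/kernel', trainable: true},
    {name: 'conv2/bias', trainable: true},
  ]},
];
```

With these mock layers, the trainable model yields conv2/kernel at index 4 while the frozen model yields bn/moving_mean there — the same kind of positional clash that setWeights rejects.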

I was thinking of submitting a PR for the tfjs-examples repository so that at least the example code works out of the box.

On Aug 11, 2019, at 8:45 pm, Steven Weaver notifications@github.com wrote:

Hi! I’m not sure how this isn’t considered a bug. The example doesn’t run… and there is an entire chapter dedicated to it in the upcoming release of Deep Learning with JavaScript.


1 reaction
rthadur commented, Jul 18, 2019

This question is better asked on StackOverflow since it is not a bug or feature request. There is also a larger community that reads questions there.
