
Pruning not working for tf.keras.Batchnorm

See original GitHub issue

Describe the bug

ValueError: Please initialize Prune with a supported layer. Layers should either be a PrunableLayer instance, or should be supported by the PruneRegistry. You passed: <class 'tensorflow.python.keras.layers.normalization.BatchNormalization'>

System information

TensorFlow installed from (source or binary): binary

TensorFlow version: 2.1.0

TensorFlow Model Optimization version: 0.2.1

Python version: 3.5.6
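
For context, here is a minimal sketch (not from the original report) of the kind of call that triggers this error on the versions above, assuming the usual prune_low_magnitude entry point:

    import tensorflow as tf
    import tensorflow_model_optimization as tfmot

    # A small model containing a BatchNormalization layer.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(16, activation="relu", input_shape=(8,)),
        tf.keras.layers.BatchNormalization(),
        tf.keras.layers.Dense(1),
    ])

    # On the affected versions this raises the ValueError above, because the
    # BatchNormalization class in use is not listed in the PruneRegistry.
    pruned_model = tfmot.sparsity.keras.prune_low_magnitude(model)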

Issue Analytics

  • State: closed
  • Created 4 years ago
  • Comments: 13 (6 by maintainers)

Top GitHub Comments

2 reactions
jkparuchuri commented, Apr 30, 2020

@Craftsman381 A workaround is to add "tf.compat.v1.keras.layers.BatchNormalization: []" to the prune registry at https://github.com/tensorflow/model-optimization/blob/master/tensorflow_model_optimization/python/core/sparsity/keras/prune_registry.py#L82 and rebuild from source.
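
For reference, a sketch of how that registry entry could be added at runtime instead of rebuilding from source; it touches the internal _LAYERS_WEIGHTS_MAP attribute from the linked file, so treat it as an untested assumption that may break across versions:

    import tensorflow as tf
    from tensorflow_model_optimization.python.core.sparsity.keras import prune_registry

    # Register BatchNormalization with an empty prunable-weight list, i.e.
    # "supported, but nothing to prune". Relies on the private
    # _LAYERS_WEIGHTS_MAP referenced in the comment above.
    prune_registry.PruneRegistry._LAYERS_WEIGHTS_MAP[
        tf.compat.v1.keras.layers.BatchNormalization] = []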

1 reaction
jkparuchuri commented, Jan 19, 2020

@alanchiao I was able to resolve the issue. It's "tf.compat.v1.keras.layers.BatchNormalization: []"; "keras" was missing from that line earlier.
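
If patching or rebuilding the library is not an option, an alternative sketch (not from this thread) is to wrap only the supported layers with prune_low_magnitude via tf.keras.models.clone_model, leaving BatchNormalization untouched:

    import tensorflow as tf
    import tensorflow_model_optimization as tfmot

    def apply_pruning_to_dense(layer):
        # Wrap only Dense layers for pruning; BatchNormalization and anything
        # else the registry does not support is returned unchanged.
        if isinstance(layer, tf.keras.layers.Dense):
            return tfmot.sparsity.keras.prune_low_magnitude(layer)
        return layer

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(16, activation="relu", input_shape=(8,)),
        tf.keras.layers.BatchNormalization(),
        tf.keras.layers.Dense(1),
    ])

    pruned_model = tf.keras.models.clone_model(
        model, clone_function=apply_pruning_to_dense)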

Read more comments on GitHub >

Top Results From Across the Web

tf.keras.layers.BatchNormalization | TensorFlow v2.11.0
Batch normalization applies a transformation that maintains the mean output close to 0 and the output standard deviation close to 1.

How to prune certain weights (rather than freeze a layer) with ...
So how do I prune these connections (freeze their weights to 0 during training) in Keras? Do I need to write my own layers...

Model Pruning in Deep Neural Networks Using the ...
To address this problem, one common solution is to add regularization terms to the model. Another consists in reducing the complexity of the...

The Batch Normalization layer of Keras is broken - Datumbox
The problem with the current implementation of Keras is that when a BN layer is frozen, it continues to use the mini-batch statistics...

YOLOv4 — TAO Toolkit 4.0 documentation
Note: YOLOv4 does not support loading a pruned QAT model and retraining it with QAT ... The meanings of the above parameters are the same...
