Are all Keras layers supported for weight pruning?
Hi, thanks for a very convenient package!
Are layers like LSTM, ConvLSTM2D and TimeDistributed supported for weight pruning? In the other issue, someone had a problem with the TimeDistributed layer.
If they are not supported out of the box, would inheriting from PrunableLayer and implementing get_prunable_weights() (perhaps returning an empty list) fix the issue?
Thanks!

Hi Anuar,
We should be supporting all built-in Keras layers out of the box, though in the implementation we need to add them one by one (see this). Keras wrappers are a bit different: as you can see in the other issue, they aren't supported yet (Bidirectional and TimeDistributed).
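For built-in layers that are already registered, pruning should just work end to end. A minimal sketch, assuming LSTM is in the pruning registry of your tfmot version (layer sizes here are arbitrary):

```python
import tensorflow as tf
import tensorflow_model_optimization as tfmot

# Built-in layers in the pruning registry can be wrapped directly,
# individually or as a whole model.
model = tf.keras.Sequential([
    tf.keras.layers.LSTM(32, input_shape=(20, 8)),
    tf.keras.layers.Dense(10),
])
pruned_model = tfmot.sparsity.keras.prune_low_magnitude(model)

# Training then needs the pruning callback:
#   pruned_model.fit(..., callbacks=[tfmot.sparsity.keras.UpdatePruningStep()])
```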
As you suggested, anything missing (either a custom Keras layer or a built-in one that hasn't been added yet) can inherit from PrunableLayer and implement get_prunable_weights() (as suggested here).
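A minimal sketch of that pattern, following the tfmot docs (the class name and the choice of Dense are illustrative):

```python
import tensorflow as tf
import tensorflow_model_optimization as tfmot

# Illustrative: make a layer prunable by mixing in PrunableLayer
# and listing the weights the pruner may modify.
class MyPrunableDense(tf.keras.layers.Dense,
                      tfmot.sparsity.keras.PrunableLayer):

    def get_prunable_weights(self):
        # Prune only the kernel; return [] instead to opt the
        # layer out of pruning entirely, as you suggested.
        return [self.kernel]

layer = tfmot.sparsity.keras.prune_low_magnitude(MyPrunableDense(16))
```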
What Sanjay said is also correct.
Based on the _update_mask function in pruning_impl.py, any weight tensor should be prunable: the function simply sorts the weights by magnitude and updates the mask starting from the smallest values. I don't see why this logic should be restricted to a particular layer.
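To make that concrete, here is a hedged re-implementation of the idea (not the actual _update_mask code): the mask depends only on the tensor's values, never on which layer owns it.

```python
import tensorflow as tf

def magnitude_mask(weights: tf.Tensor, sparsity: float) -> tf.Tensor:
    """Keep the (1 - sparsity) fraction of largest-magnitude weights."""
    flat = tf.reshape(tf.abs(weights), [-1])
    n = tf.size(flat)
    # Number of weights to keep (at least one).
    k = tf.maximum(
        tf.cast(tf.round(tf.cast(n, tf.float32) * (1.0 - sparsity)), tf.int32),
        1)
    # Threshold is the k-th largest magnitude.
    threshold = tf.sort(flat, direction="DESCENDING")[k - 1]
    # 1.0 where a weight survives, 0.0 where it is pruned.
    return tf.cast(tf.abs(weights) >= threshold, weights.dtype)
```

Nothing in this computation is layer-specific, which is why the wrapper approach generalizes to any tensor returned from get_prunable_weights().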