
Are all keras layers supported for weight pruning?

See original GitHub issue

Hi, thanks for a very convenient package! Are layers like LSTM, ConvLSTM2D and TimeDistributed supported for weight pruning? In another issue, someone had a problem with the TimeDistributed layer. If they are not supported out of the box, would inheriting from PrunableLayer and implementing get_prunable_weights() (perhaps returning an empty list) fix the issue?

Thanks!
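For reference, a minimal sketch of the pattern the question describes. The real base class is tfmot's PrunableLayer (in tensorflow_model_optimization.sparsity.keras); a stand-in abstract class is defined here so the snippet runs without TensorFlow installed, and the layer names are hypothetical, not part of tfmot:

```python
# Sketch of the PrunableLayer opt-in/opt-out pattern from the question.
# The real base class lives in tensorflow_model_optimization.sparsity.keras;
# this stand-in mirrors only its get_prunable_weights() contract.
import abc


class PrunableLayer(abc.ABC):  # stand-in for tfmot.sparsity.keras.PrunableLayer
    @abc.abstractmethod
    def get_prunable_weights(self):
        """Return the list of weight tensors the pruning wrapper may sparsify."""


class NoPruneLSTM(PrunableLayer):  # hypothetical name
    """A layer that opts out of pruning entirely."""

    def get_prunable_weights(self):
        # Empty list: the pruning wrapper leaves every weight dense.
        return []


class PrunableTimeDistributed(PrunableLayer):  # hypothetical name
    """Expose only the kernel for pruning; keep the bias dense."""

    def __init__(self, kernel, bias):
        self.kernel, self.bias = kernel, bias

    def get_prunable_weights(self):
        return [self.kernel]
```

Returning an empty list makes the pruning wrapper a no-op for that layer, which is one way to get an unsupported layer through the wrapper without errors.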

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Comments: 5 (2 by maintainers)

Top GitHub Comments

2 reactions
alanchiao commented, Jul 31, 2019

Hi Anuar,

We should be supporting all built-in Keras layers out of the box, though in the implementation we need to add them one by one (see this). Keras wrappers are a bit different, and as you can see in the other issue, they aren’t supported yet (Bidirectional and TimeDistributed).

As you suggested, anything missing (either a custom Keras layer or something that hasn’t been added yet) can inherit from PrunableLayer (as suggested here).

What Sanjay said is also correct.

1 reaction
s36srini commented, Jul 25, 2019

Based on the _update_mask function in pruning_impl.py, any weight tensor should be prunable, as it simply sorts the weights and updates the mask based on the smallest values. I don’t see why this logic should be restricted to a particular layer.
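The magnitude-based mask update described above can be sketched in plain Python. This is an illustration of the idea, not tfmot's actual _update_mask implementation (which operates on TensorFlow tensors):

```python
# Sketch of magnitude pruning: sort weights by absolute value and zero out
# the smallest fraction. Note the logic never looks at the layer type, only
# at the flat list of weight values, which is the point made above.
def update_mask(weights, sparsity):
    """Return a 0/1 mask keeping the (1 - sparsity) largest-magnitude weights."""
    n_prune = int(len(weights) * sparsity)
    # Indices ordered from smallest to largest magnitude.
    order = sorted(range(len(weights)), key=lambda i: abs(weights[i]))
    pruned = set(order[:n_prune])
    return [0.0 if i in pruned else 1.0 for i in range(len(weights))]


w = [0.5, -0.05, 2.0, 0.01, -1.2, 0.3]
mask = update_mask(w, sparsity=0.5)
# The smallest-magnitude half (0.01, -0.05, 0.3) is masked to zero.
```

Applying the mask elementwise to the weight tensor (weights * mask) yields the pruned layer.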

Read more comments on GitHub >

Top Results From Across the Web

  • Pruning comprehensive guide - Model optimization: Prune whole model (Sequential and Functional); prune some layers ... Welcome to the comprehensive guide for Keras weight pruning.
  • Weight Pruning with Keras: In this blog, we will be understanding the concept of weight pruning with Keras. Basically, weight pruning is a model optimization technique. ...
  • How to prune certain weights (rather than freeze a layer): So how to prune these connections (freeze their weights to 0 during training) in Keras? Do I need to writing my own layers...
  • Keras layers API: Layers are the basic building blocks of neural networks in Keras. ... call method) and some state, held in TensorFlow variables (the layer's...
  • Making Neural Networks Smaller for Better Deployment: This is important to take into account as most deep learning frameworks, including Keras, don't support sparse weight layers.
