Can we prune a pre-trained model like VGG16 using this optimization library?
I tried to create a model like this:
```python
from tensorflow.keras.applications import VGG16
from tensorflow.keras.layers import Dense, Dropout, Flatten
from tensorflow.keras.models import Model, Sequential

def Vgg16():
    vgg16 = VGG16(include_top=False,
                  weights='imagenet',
                  input_shape=(32, 32, 3))
    top_model = Sequential()
    top_model.add(Flatten(input_shape=vgg16.output_shape[1:]))
    top_model.add(Dense(512, activation='relu'))
    top_model.add(Dropout(0.5))
    top_model.add(Dense(256, activation='relu'))
    top_model.add(Dropout(0.5))
    top_model.add(Dense(10, activation='sigmoid'))
    model = Model(vgg16.input, top_model(vgg16.output))
    return model
```
and when I call
```python
from tensorflow_model_optimization.sparsity import keras as sparsity

new_pruning_params = {
    'pruning_schedule': sparsity.PolynomialDecay(initial_sparsity=0.5,
                                                 final_sparsity=0.80,
                                                 begin_step=0,
                                                 end_step=end_step,
                                                 frequency=100)
}
pruned_model = sparsity.prune_low_magnitude(loaded_model, **new_pruning_params)
```
it raises the following error:
```
Please initialize `Prune` with a supported layer. Layers should either be a `PrunableLayer` instance, or should be supported by the PruneRegistry. You passed: <class 'tensorflow.python.keras.engine.sequential.Sequential'>
```
Issue Analytics
- Created: 4 years ago
- Comments: 13 (4 by maintainers)

In general, yes, you can.
There are some caveats, e.g. the lack of subclassed-model support, and of support for nesting models within models, as in both examples (tejalal@ and Cospel@). Created https://github.com/tensorflow/model-optimization/issues/155 in light of this, to make subclassed support better.
Hi everyone 😃 I have a similar issue with pruning nested models: even if I apply the pruning wrappers per layer inside all the nested Functional API models, they don't prune.
Is this expected behaviour for nested models? I would think that if any layer in a model has the pruning wrapper, it would be pruned when the pruning callback runs during training. Unfortunately, this does not happen: everything that is not nested (and has a pruning wrapper) does prune, while anything inside a nested model does not.
I can also confirm that if I create a model with no nested models at all, then everything I set to prune does in fact prune the way it should.
Side note: my nested model is a pretrained VGG16 from Keras, and I apply pruning wrappers to each layer within the nested model.
If anyone has a solution or workaround, that would seriously be very helpful. Thank you.