
shouldn't model.trainable=False freeze weights under the model?

See original GitHub issue

I am trying to freeze the pre-trained VGG16's layers (`conv_base` below) and add new layers on top of them for feature extraction. I expect to get the same prediction results from `conv_base` before (`ret1`) and after (`ret2`) fitting the model, but I don't. Is this the wrong way to check weight freezing?

import numpy as np
from keras import applications, models, layers

# load VGG16 and set it to untrainable
conv_base = applications.VGG16(weights='imagenet', include_top=False,
                               input_shape=[150, 150, 3])
conv_base.trainable = False

# result before model fit
ret1 = conv_base.predict(np.ones([1, 150, 150, 3]))

# add layers on top of VGG16 and compile a model
model = models.Sequential()
model.add(conv_base)
model.add(layers.Flatten())
model.add(layers.Dense(10, activation='relu'))
model.add(layers.Dense(1, activation='sigmoid'))
model.compile('rmsprop', 'binary_crossentropy', ['accuracy'])

# fit the model
model.fit_generator(train_generator, 100,
                    validation_data=validation_generator,
                    validation_steps=50)

# result after model fit
ret2 = conv_base.predict(np.ones([1, 150, 150, 3]))

# hoped this would be True, but it is not
np.array_equal(ret1, ret2)
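Comparing predictions before and after fitting is a reasonable smoke test, but comparing the weight arrays themselves is more direct. A minimal sketch of such a check, using only numpy with synthetic arrays standing in for what `layer.get_weights()` would return (the helper name `weights_unchanged` is invented for illustration):

```python
import numpy as np

def weights_unchanged(before, after, tol=1e-7):
    """Compare two lists of weight arrays (e.g. snapshots from
    layer.get_weights()) taken before and after training.
    Returns True only if every array is unchanged within `tol`."""
    if len(before) != len(after):
        return False
    return all(np.allclose(b, a, atol=tol) for b, a in zip(before, after))

# Synthetic arrays standing in for conv_base weight snapshots.
w_before = [np.ones((3, 3)), np.zeros(4)]
w_frozen = [w.copy() for w in w_before]   # frozen layer: identical copies
w_tuned  = [w + 0.01 for w in w_before]   # fine-tuned layer: shifted values

print(weights_unchanged(w_before, w_frozen))  # True
print(weights_unchanged(w_before, w_tuned))   # False
```

Snapshotting `conv_base.get_weights()` before `fit` and comparing after it avoids any floating-point noise that prediction pipelines can introduce.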

Issue Analytics

  • State: closed
  • Created: 6 years ago
  • Reactions: 1
  • Comments: 8 (5 by maintainers)

Top GitHub Comments

2 reactions
anujgupta82 commented, Jun 26, 2018

If you set model.trainable = False, shouldn't it make layer.trainable False for all layers?

conv_base_model = VGG16(weights='imagenet', input_shape=(150, 150, 3), include_top=False)

conv_base_model.trainable = False

for layer in conv_base_model.layers:
    print(layer.name, layer.trainable)

I am still getting True for all layers.

Am I missing something?
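In the Keras versions in use at the time, a model's `trainable` attribute was a flag on the container consulted when the model was compiled; it did not rewrite each sublayer's own `trainable` attribute, which is why the loop still prints True. The distinction can be mimicked with a toy container (a sketch of the attribute behaviour only, not Keras internals; the class names are invented for illustration):

```python
class ToyLayer:
    """Stand-in for a Keras layer: carries its own trainable flag."""
    def __init__(self, name):
        self.name = name
        self.trainable = True

class ToyModel:
    """Stand-in for a Keras model: its own trainable flag is an
    independent attribute, not automatically pushed to its layers."""
    def __init__(self, layers):
        self.layers = layers
        self.trainable = True

    def freeze_all(self):
        # Explicit propagation, mirroring the common Keras idiom:
        #   for layer in conv_base_model.layers: layer.trainable = False
        self.trainable = False
        for layer in self.layers:
            layer.trainable = False

m = ToyModel([ToyLayer('block1_conv1'), ToyLayer('block1_conv2')])

m.trainable = False                        # sets the container flag only
print([l.trainable for l in m.layers])     # [True, True] -- not propagated

m.freeze_all()                             # propagates to every sublayer
print([l.trainable for l in m.layers])     # [False, False]
```

Setting `layer.trainable = False` on each sublayer individually (before compiling) is the explicit way to make the per-layer flags report what you expect.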


