Batch norm not freezed in train?

See original GitHub issue

I have a problem with training. The model gives good results on train data and absolutely trash on test data, even when the train and test data are identical. I tried different learning rates and backbones, and keep seeing this strange behavior.

I suppose the batchnorms are not frozen at train/eval time even with the freeze_bn flag, because calling model.train() or model.eval() changes the batchnorm behaviour. I think we can easily improve that with

...
        # Remember whether BN layers should stay frozen
        self.bn_freezed = freeze_bn
        if freeze_bn:
            self.freeze_bn()

    def forward(self, input):
        # Re-freeze BN on every forward pass, so a later call to
        # model.train() cannot switch the BN layers back to training mode
        if self.bn_freezed:
            self.freeze_bn()
        x, low_level_feat = self.backbone(input)
        x = self.aspp(x)
        x = self.decoder(x, low_level_feat)
        x = F.interpolate(x, size=input.size()[2:], mode='bilinear', align_corners=True)

        return x
...

Maybe you know a more beautiful way to freeze BN without freezing the other layers; it would be cool if you shared it.
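One arguably cleaner pattern (a minimal sketch using a hypothetical toy Net class, not the repo's DeepLab) is to override train() instead of re-freezing inside forward(), so that any later model.train() call automatically puts the BatchNorm layers back into eval mode:

import torch
import torch.nn as nn

class Net(nn.Module):
    # Hypothetical stand-in for the repo's DeepLab; only the BN-freezing
    # pattern matters here.
    def __init__(self, freeze_bn=True):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, 3, padding=1)
        self.bn = nn.BatchNorm2d(8)
        self.bn_freezed = freeze_bn

    def train(self, mode=True):
        # Toggle train/eval as usual, then force BN layers back to eval mode
        # so their running statistics never update during training.
        super().train(mode)
        if self.bn_freezed:
            for m in self.modules():
                if isinstance(m, nn.BatchNorm2d):
                    m.eval()
        return self

    def forward(self, x):
        return self.bn(self.conv(x))

model = Net(freeze_bn=True)
model.train()             # BN stays in eval mode
print(model.bn.training)  # False

With this pattern the freeze survives every model.train()/model.eval() toggle, and only the BN layers are affected; all other layers still train normally.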

Issue Analytics

  • State: closed
  • Created 4 years ago
  • Comments: 6 (1 by maintainers)

Top GitHub Comments

1 reaction
vazyavazya commented, Dec 3, 2019

@jaemin93 yes, that is how batchnorm works, but when you use the flag freeze_bn == True it is not actually frozen in the original code in this repo: it becomes unfrozen again if you call model.train(), and the statistics get shifted. When I use my code above, it improves quality on evaluation, because I have a small batch size and I want to train with the batchnorms completely frozen.
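To make the failure mode concrete, here is a minimal sketch (a toy Sequential model, not the repo's DeepLab) showing that model.train() switches a previously frozen BN layer back to training mode, so its running statistics drift on the very next forward pass:

import torch
import torch.nn as nn

model = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.BatchNorm2d(8))
bn = model[1]

bn.eval()                        # "freeze" BN: stop updating running stats
print(bn.training)               # False

model.train()                    # the usual call at the start of a training loop
print(bn.training)               # True -> the BN layer is unfrozen again

before = bn.running_mean.clone()
model(torch.randn(2, 3, 16, 16))
print(torch.allclose(before, bn.running_mean))  # False: statistics shifted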

0 reactions
jaemin93 commented, Dec 10, 2019

@jfzhang95 Thank you!


Top Results From Across the Web

How to freeze BN layers while training the rest of network ...
I have a network that consists of batch normalization (BN) layers and other layers (convolution, FC, dropout, etc)
Why are BatchNorm layers set to trainable in a frozen model
Here is my understanding. During transfer learning, first thing we did in fastai is freeze the backbone and only train the custom head....
python - Why it's necessary to frozen all inner state of a Batch ...
Therefore, if batch normalization is not frozen, the network will learn new batch normalization parameters (gamma and beta in the batch ...
Batchnorm and freezing layers/finetuning : r/computervision
Hello guys! I was wondering, how you handle Batchnorm layers when finetuning/freezing part of the network. Do you set them in train mode...
To freeze or not, batch normalisation in ResNet when transfer ...
Few layers such as Batch Normalization (BN) layers shouldn't be froze because, the mean and variance of the dataset will be hardly matching...
