BatchNormalization's Running Stats are Accumulated in ImageNet Linear Evaluation


Hi,

Thanks for the nice paper and clear code!

I found that the models are set to .train() in clf_linear.py. As a result, the running averages (i.e., the buffers) of the BatchNormalization layers are updated while training on the ImageNet dataset (through the forward calls), so the backbone does not appear to be fully frozen. Is this a deliberate design choice for this fine-tuning task?

Best, Hao
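
For reference, here is a minimal sketch of how a linear evaluation with a truly frozen backbone is often set up in PyTorch: the backbone stays in eval() mode so BN layers use their stored running statistics instead of accumulating new ones, and only the linear head is trained. This is an illustrative example with made-up shapes, not the repository's actual clf_linear.py:

```python
import torch
import torch.nn as nn
import torchvision.models as models

backbone = models.resnet50()            # pretrained weights would be loaded here
backbone.fc = nn.Identity()             # strip the original classifier head
head = nn.Linear(2048, 1000)            # linear probe trained on frozen features

for p in backbone.parameters():
    p.requires_grad = False             # no gradient updates to the backbone

backbone.eval()                         # BN uses running stats and does NOT update them
head.train()

optimizer = torch.optim.SGD(head.parameters(), lr=0.1, momentum=0.9)
criterion = nn.CrossEntropyLoss()

images = torch.randn(8, 3, 224, 224)    # dummy batch standing in for ImageNet data
labels = torch.randint(0, 1000, (8,))

with torch.no_grad():
    feats = backbone(images)            # frozen features; BN buffers stay untouched
loss = criterion(head(feats), labels)
loss.backward()
optimizer.step()
```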

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 6 (4 by maintainers)

Top GitHub Comments

2 reactions
airsplay commented, Oct 26, 2020

Thanks for the prompt reply. I am glad that none of the experiments are affected by this XD.

I asked the question because the (now-fixed) adapted BN reminded me of an old paper, so I was not sure whether this is a common practice for domain adaptation (from COCO --> ImageNet).

(Haha. I certainly trust these experiments even without the history, but the remaining commits are definitely more convincing to everyone on earth.)
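
For context, the "adapted BN" idea mentioned above amounts to re-estimating BN running statistics on target-domain data with no weight updates (in the spirit of AdaBN-style domain adaptation). A hedged sketch of what that looks like in PyTorch; the function, loader, and device names are placeholders, not code from this repository:

```python
import torch

def adapt_bn_stats(model, target_loader, device="cpu"):
    """Refresh BatchNorm running_mean / running_var on target-domain batches.
    Only BN buffers change; model parameters are untouched."""
    model.train()                     # train mode so BN updates its running stats
    with torch.no_grad():             # no gradients, hence no parameter updates
        for images, _ in target_loader:
            model(images.to(device))  # the forward pass alone accumulates the stats
    model.eval()                      # switch back to using the adapted stats
    return model
```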

1 reaction
airsplay commented, Oct 26, 2020

Great. I closed this issue. Thanks for all these answers and the awesome codebase/paper.


Top Results From Across the Web

Delving into the Estimation Shift of Batch Normalization in a ...
We define the estimation shift magnitude of BN to quantitatively measure the difference between its estimated population statistics and the expected ones. Our …

Representative Batch Normalization With Feature Calibration
We apply these statistics to calibrate the centering and scaling operations of the BatchNorm layer. The running mean values …

The Effect of Batch Normalization on Deep Convolutional ...
Batch normalization is a recently popularized method for accelerating the training of deep feed-forward neural networks. Apart from speed improvements, …

Batch normalization in 3 levels of understanding
BN relies on the batch's first and second statistical moments (mean and variance) to normalize hidden-layer activations. The output values are then …

Revisiting Batch Normalization for Training Low-Latency Deep ...
To address this training issue in SNNs, we revisit Batch Normalization (BN) and propose a temporal Batch Normalization Through Time (BNTT) technique.
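
To tie these snippets back to the issue above, here is a small illustrative sketch (the momentum value matches PyTorch's default, and the tensor shapes are made up) of the exponential-moving-average update a BatchNorm layer applies to its running statistics on every forward pass in train mode; this is exactly the accumulation the issue is about:

```python
import torch

momentum = 0.1                          # PyTorch's default BatchNorm momentum
running_mean = torch.zeros(4)           # per-channel buffers (4 channels for illustration)
running_var = torch.ones(4)

x = torch.randn(32, 4)                  # one batch of activations, shape (N, C)
batch_mean = x.mean(dim=0)
batch_var = x.var(dim=0, unbiased=True)

# Each forward pass in train mode nudges the buffers toward the current batch:
running_mean = (1 - momentum) * running_mean + momentum * batch_mean
running_var = (1 - momentum) * running_var + momentum * batch_var

# In eval mode, the layer normalizes with these (possibly shifted) running stats:
y = (x - running_mean) / torch.sqrt(running_var + 1e-5)
```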
