
Performance of batch_size > 1

See original GitHub issue

Theoretically, batch_size > 1 should work; in practice, however, performance appears to degrade. I've looked at the data generator and the loss functions, but everything appears to be fine.

I'm not sure where the performance degradation comes from; perhaps a fresh set of eyes can help uncover the issue. My intuition is that the problem lies in the loss function, or perhaps in a deeper issue in Keras / TensorFlow.

By extension, this also breaks multi-GPU support, since that requires batch_size > 1.
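One common way a loss function can behave differently at batch_size > 1 (a hypothetical illustration, not the repository's actual loss code) is normalizing by the wrong quantity: if per-anchor losses are summed over the whole batch rather than averaged per image, the effective gradient scale grows with batch size. A minimal numpy sketch of the difference:

```python
import numpy as np

def loss_sum(per_anchor_losses):
    # Sums over every anchor in the whole batch: scales with batch_size.
    return per_anchor_losses.sum()

def loss_per_image(per_anchor_losses, batch_size):
    # Normalizes by the number of images: stable across batch sizes.
    return per_anchor_losses.sum() / batch_size

rng = np.random.default_rng(0)
one = rng.random((1, 100))    # 1 image, 100 anchor losses
four = np.tile(one, (4, 1))   # the same image repeated 4 times

# The summed loss is 4x larger for the batch of 4 identical images,
# while the per-image loss is unchanged.
print(loss_sum(one), loss_sum(four))
print(loss_per_image(one, 1), loss_per_image(four, 4))
```

If the training loss behaves like `loss_sum` while the learning rate is kept fixed, larger batches effectively take larger gradient steps, which can look like degraded convergence.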

@awilliamson I think you ran some tests on this, right? Do you still have them stored somewhere? Can you share them?

Issue Analytics

  • State: closed
  • Created: 6 years ago
  • Comments: 25 (17 by maintainers)

Top GitHub Comments

3 reactions
sorinpanduru commented, Mar 27, 2018

Hi! I can fire up training with batch size 16 today. Will post here if I see anything odd. Thanks!

1 reaction
yhenon commented, Apr 20, 2018

Did you scale the number of steps with the batch size?

Indeed I did. I ensured that the same number of images were trained on, to have a fair comparison.

Other than that, does this conclude that batch_size > 1 is working?

Yeah, I believe so, and I suggest we close this.

If someone has issues with batch_size > 1, they can reopen this issue.

Read more comments on GitHub >

Top Results From Across the Web

  • How to Control the Stability of Training Neural Networks With ...
    Batch size controls the accuracy of the estimate of the error gradient when training neural networks. Batch, Stochastic, and Minibatch gradient ...

  • Deep Learning Performance 1 Batch Size, Epochs ... - SROSE
    Batch size controls the accuracy of the estimate of the error gradient when training neural networks. There is a tension between batch size ...

  • The Challenge of Batch Size 1: Groq Adds Responsiveness to ...
    However, small batch sizes and batch size 1 introduce a number of performance and responsiveness complexities to machine learning applications ...

  • Why Mini-Batch Size Is Better Than One Single ... - Baeldung
    With a batch size of 27000, we obtained the greatest loss and smallest accuracy after ten epochs. This shows the effect of using ...

  • How does Batch Size impact your model learning - Medium
    Batch Size is among the important hyperparameters in Machine Learning. ... This makes it pretty clear that increasing batch size lowers performance.
