Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking at while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

I’m trying to get some insight into batch sizes and whether the performance I’m seeing is expected. It seems that I can’t set batch sizes much larger than about 32 with my dual Titan Xs. My understanding is that DataParallel splits that batch of 32 across the two GPUs, for an effective batch size of 16 per GPU per step. The model I’m training is all defaults: 4 LSTM layers with 400 hidden units each. That is a fair amount different from many of the DeepSpeech 2 configurations in the paper, but I’ve seen references to them using batch sizes of 512 spread over 8 Titan Xs, which implies that whatever system they’re running supports batches of 64 per GPU. It seems to me we should be able to get closer to that number unless I’m missing something. Any thoughts?
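For reference, the per-GPU arithmetic in the question can be sketched in plain Python. This mirrors the chunking semantics `nn.DataParallel` uses when it scatters a batch along dimension 0 (each replica gets a roughly equal slice); the function name is my own, not a PyTorch API:

```python
import math

def scatter_batch_sizes(batch_size, num_gpus):
    """Per-GPU chunk sizes, mirroring torch.chunk along dim 0:
    every chunk is ceil(batch_size / num_gpus), except a smaller
    final remainder when the batch does not divide evenly."""
    chunk = math.ceil(batch_size / num_gpus)
    sizes = []
    remaining = batch_size
    while remaining > 0:
        sizes.append(min(chunk, remaining))
        remaining -= chunk
    return sizes

print(scatter_batch_sizes(32, 2))   # [16, 16] -- the questioner's setup
print(scatter_batch_sizes(512, 8))  # [64, 64, 64, 64, 64, 64, 64, 64]
```

This is why a global batch of 32 over two GPUs means only 16 utterances resident per GPU, while the paper's 512-over-8 setup keeps 64 per GPU.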

Issue Analytics

  • State: closed
  • Created 6 years ago
  • Comments: 8 (4 by maintainers)

Top GitHub Comments

1 reaction
ryanleary commented, Apr 20, 2018

We filter the training data to under a certain length. If you look at the librispeech.py script, for example, you can see that there is a flag there for doing the filtering when the manifest is created.
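The filtering ryanleary describes can be sketched roughly as follows. This is illustrative only: the manifest layout (path, transcript, duration) and the `max_duration` threshold are my assumptions, not the exact flag or format used by librispeech.py:

```python
# Hypothetical manifest rows: (audio_path, transcript_path, duration_seconds).
def filter_manifest(rows, max_duration):
    """Keep only utterances at or below max_duration seconds,
    so long recordings don't blow up per-batch memory use."""
    return [row for row in rows if row[2] <= max_duration]

rows = [
    ("a.wav", "a.txt", 4.2),
    ("b.wav", "b.txt", 21.7),  # dropped: exceeds the cutoff
    ("c.wav", "c.txt", 9.9),
]
kept = filter_manifest(rows, max_duration=15.0)
```

Dropping the long tail of utterances matters for batch size because memory usage for an LSTM batch is driven by the longest sequence in the batch, not the average.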

0 reactions
haquynh1505 commented, Apr 20, 2018

@ryanleary: OK, I see. Thank you so much!

Read more comments on GitHub >

Top Results From Across the Web

What is batch size in neural network? - Cross Validated
The batch size defines the number of samples that will be propagated through the network. For instance, let's say you have 1050 training...
Read more >
Difference Between a Batch and an Epoch in a Neural Network
The batch size is a number of samples processed before the model is updated. The number of epochs is the number of complete...
Read more >
Batch Size in a Neural Network explained - deeplizard
Put simply, the batch size is the number of samples that will be passed through to the network at one time. Note that...
Read more >
Batch size (machine learning) | Radiology Reference Article
Batch size is a term used in machine learning and refers to the number of training examples utilized in one iteration. The batch...
Read more >
How does Batch Size impact your model learning - Medium
Batch Size is among the important hyperparameters in Machine Learning. It is the hyperparameter that defines the number of samples to work through...
Read more >
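The batch-size and epoch definitions in the results above can be made concrete with a little arithmetic. This sketch uses the 1050-sample figure from the first result with an assumed batch size of 100; it is my own worked example, not taken from any of the linked posts:

```python
import math

n_samples, batch_size = 1050, 100

# One weight update per batch, so updates per epoch is the
# number of batches needed to cover the whole training set.
iterations_per_epoch = math.ceil(n_samples / batch_size)
# The final batch holds whatever samples are left over.
last_batch_size = n_samples - (iterations_per_epoch - 1) * batch_size

print(iterations_per_epoch)  # 11
print(last_batch_size)       # 50
```

So one epoch here is 11 iterations: ten full batches of 100 plus one final batch of 50.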
