If I use only two GPUs for training, could the results be worse than the original training?
See original GitHub issue
To the best of my knowledge of this module, if I configure train.py like:
gpu_ids = 0,1
batch_size = 32
then it effectively works like batch_size = 64. (Is that right?)
But in your paper, batch_size = 128 with 4 GPUs, so I'm worried that training with two GPUs and batch_size = 32 could make a difference compared with the original training. What should I do if I want to train as in your original paper with only two GPUs?
Issue Analytics
- State:
- Created 3 years ago
- Comments: 5 (3 by maintainers)
Yes.
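Since two GPUs with batch_size = 32 give an effective batch of 64, one common way to match the paper's effective batch size of 128 is gradient accumulation: accumulate gradients over several forward/backward passes before each optimizer step. This is a hedged sketch, not code from the repository; the helper name and the assumption that batch_size is per GPU are mine, based on the "works like batch_size = 64" observation above.

```python
# Sketch: matching the paper's effective batch size (128) with fewer GPUs.
# Assumption (not from this issue): gradients are averaged over the batch,
# so effective batch = per_gpu_batch * num_gpus * accumulation_steps.

def accumulation_steps(target_batch, per_gpu_batch, num_gpus):
    """Gradient-accumulation steps needed to reach target_batch per update."""
    per_step = per_gpu_batch * num_gpus
    if target_batch % per_step:
        raise ValueError("target batch not divisible by per-step batch")
    return target_batch // per_step

# Paper: 128 total on 4 GPUs. With 2 GPUs and batch_size = 32 per GPU,
# accumulate gradients over 2 steps before calling optimizer.step().
print(accumulation_steps(128, 32, 2))  # -> 2
```

In a training loop this means calling loss.backward() on each of the 2 accumulated mini-batches (dividing the loss by the number of accumulation steps if it is mean-reduced) and only then stepping and zeroing the optimizer, so each update sees 128 samples' worth of gradient, as in the paper.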
It seems you have asked a bunch of questions recently. I'd suggest that trying to work through difficulties on your own before asking would do much more to strengthen your implementation and research skills (and it would help me too, since I cannot answer all of your questions). Also, a quick thank-you message when you get an answer (especially when asking many questions) goes a long way toward easing the fatigue of those answering.
I mean no offense; sorry if it came across that way. Feel free to ask if you have further questions. Good luck!