Is an 8 GB GPU enough for PyTorch training?
See original GitHub issue

Task (what are you trying to do/register?)
[please describe task here]
What have you tried
Please describe specifics of your approach // use of vxm
Details of experiments
Please carefully specify the details of your experiments. If you are training, what is the setup? What loss are you using? What does the convergence look like? If you are registering, please show example inputs and outputs, etc.
Issue Analytics
- Created: 3 years ago
- Comments: 9
Top Results From Across the Web
How much VRAM should I have for machine learning tasks?
4GB-8GB is more than enough. In the worst-case scenario, such as having to train BERT, you need 8GB-16GB of VRAM.
Running Stable Diffusion on Your GPU with Less Than 10Gb ...
Check VRAM usage, I'm guessing you don't have 8GB free, more like 5-6GB, since you have monitors connected. Also, you could try Visions...
How to know the exact GPU memory requirement for a certain ...
I run the segmentation model inference on two GPUS: a 4G memory GPU and a 8G memory GPU. And I set different fractions...
RTX 3060 12gb vs. 3060 Ti 8gb for deep learning : r/nvidia
Maybe a good rule of thumb is to buy the GPU that fits 85-90% of your use ... I was wondering in PyTorch...
How to Train a Very Large and Deep Model on One GPU?
If we look at a bigger model, say VGG-16, using a batch size of 128 will require about 14GB of global memory. The...
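Figures like the one quoted above can be reproduced with a back-of-the-envelope calculation: training memory is roughly weights + gradients + optimizer state (Adam keeps two extra per-parameter copies) + stored activations scaled by batch size. The sketch below is illustrative only; the parameter count matches VGG-16's published ~138M, but the per-image activation count and the optimizer-copy factor are assumptions, not measurements.

```python
def estimate_training_mem_gb(n_params, n_activations, batch_size,
                             bytes_per_elem=4, optimizer_copies=2):
    """Back-of-the-envelope GPU memory estimate (a rough sketch, not exact).

    Counts weights + gradients + optimizer state (e.g. Adam keeps 2 extra
    per-parameter copies), plus activations stored for the backward pass,
    which scale linearly with batch size.
    """
    weights = n_params * bytes_per_elem                       # model weights
    grads = n_params * bytes_per_elem                         # gradients
    opt_state = n_params * bytes_per_elem * optimizer_copies  # Adam m and v
    activations = n_activations * batch_size * bytes_per_elem
    return (weights + grads + opt_state + activations) / 1024**3

# VGG-16 has ~138M parameters; the per-image activation count of 15M values
# used here is an illustrative assumption.
print(round(estimate_training_mem_gb(138_000_000, 15_000_000, 128), 1))
```

Real usage is higher in practice (framework overhead, fragmentation, cuDNN workspaces), so an estimate anywhere near a card's capacity means the batch size should shrink.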
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
You’re the most patient writer I’ve ever met. I’ll go over the information you gave me right away. Thank you!
On 02/24/2021 21:42, Adrian Dalca wrote:
@klfxmu the instructions we have in the readme won’t determine if your code uses cpu or gpu, that’s up to the platform you use (e.g. keras) and the installation you have. For most people with tensorflow-gpu installed, those instructions will run voxelmorph on the gpu.
This probably has to do with your tensorflow/keras installation. So I would recommend reading into this. e.g.:
https://stackoverflow.com/questions/44829085/tensorflow-not-running-on-gpu
https://stackoverflow.com/questions/64467035/tensorflow-uses-cpu-instead-of-gpu
https://stackoverflow.com/questions/63016659/why-tensorflow-not-running-on-gpu-while-gpu-devices-are-identified-in-python
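A quick way to confirm which device TensorFlow will actually use is to query it directly. This is a minimal check using the TF 2.x API (the calls differ in TF 1.x):

```python
import tensorflow as tf

# An empty GPU list here means training will silently run on the CPU,
# usually because the CUDA/cuDNN libraries or driver were not found,
# or because a CPU-only TensorFlow build is installed.
print("TF version:      ", tf.__version__)
print("Built with CUDA: ", tf.test.is_built_with_cuda())
print("GPUs visible:    ", tf.config.list_physical_devices("GPU"))
```

If `is_built_with_cuda()` is False, the installed wheel cannot use a GPU at all; if it is True but the GPU list is empty, the CUDA runtime or driver setup is the usual suspect.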