Running with custom docker shm-size flag
See original GitHub issue.

Usually, when using the PyTorch `Dataset` and `DataLoader` classes, you need to run your container with this flag:

```
docker run --runtime=nvidia ... --shm-size 16G
```

More information about this. Is there any way I can run a Kaggle kernel with this flag?
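To see whether the flag took effect inside a container or kernel, you can query the size of the shared-memory mount directly. This is a minimal sketch, assuming a Linux environment with `/dev/shm` mounted (the function name is illustrative, not from the original issue):

```python
import os

def shm_size_bytes(path="/dev/shm"):
    """Return the total size of the shared-memory mount, in bytes."""
    st = os.statvfs(path)
    # f_frsize is the fragment size; f_blocks the total number of blocks.
    return st.f_frsize * st.f_blocks

print(f"/dev/shm: {shm_size_bytes() / 2**20:.0f} MiB")
```

If this prints something close to 64 MiB, the container is still running with Docker's default shared-memory size.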
Issue Analytics
- State:
- Created: 5 years ago
- Reactions: 7
- Comments: 7 (3 by maintainers)
Top Results From Across the Web

How to increase the size of the /dev/shm in docker container
If you're using docker-compose, you can set the your_service.shm_size value if you want your container to use that /dev/shm size when running…

Allow setting of shm-size for docker service create #26714
As I am using swarm mode and service, I cannot pass --shm-size=1g; it always takes the default value. Running individual containers is not…

Compose file version 3 reference - Docker Documentation
As with docker run, options specified in the Dockerfile, such as CMD, EXPOSE… Specify a custom container name, rather than…

Gitlab CI services to support shm_size (#4475) · Issues
The docker container will fail if the size is below 1 GB. I can see from the build log that the value is the default 64…

Docker driver - Nomad - HashiCorp Developer
image - The Docker image to run. The image may include a tag or custom URL and should include https:// if required. By…
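For the docker-compose route mentioned above, the size is set with the service-level `shm_size` key rather than a command-line flag. A minimal sketch of such a `docker-compose.yml`, where the service name and image are placeholders:

```yaml
services:
  trainer:                   # placeholder service name
    image: pytorch/pytorch   # placeholder image
    shm_size: "16gb"         # equivalent to: docker run --shm-size 16G
```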
We will release a fix for this issue next week. Thank you for your patience.
Any progress on this? This issue is the only thing stopping me from using Kaggle Kernels.