tpu_cores=8 not working
See original GitHub issue

🐛 Bug
After #2016 was fixed by PR #2033, the code runs perfectly on a single TPU core (or on one specific core), but it no longer works with 8 TPU cores. After training completes, it fails with RuntimeError: Cannot replicate if number of devices (1) is different from 8.
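The message suggests each spawned worker ends up seeing only one XLA device while replication is requested across eight. The toy sketch below is NOT torch_xla's actual code; it only mimics the device-count consistency check to show how that exact message arises (the replicate function and device names here are hypothetical):

```python
def replicate(visible_devices, requested_num_cores):
    """Toy stand-in for torch_xla's replication device-count check."""
    if len(visible_devices) != requested_num_cores:
        # Same wording as the reported error
        raise RuntimeError(
            f"Cannot replicate if number of devices ({len(visible_devices)}) "
            f"is different from {requested_num_cores}"
        )
    return [f"replica on {d}" for d in visible_devices]


# A worker that sees a single XLA device but is asked for 8 replicas
# reproduces the reported message:
try:
    replicate(["xla:0"], 8)
except RuntimeError as e:
    print(e)  # Cannot replicate if number of devices (1) is different from 8
```

With eight visible devices the same call succeeds, which is why single-core runs (one device, one requested core) never hit the check.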
To Reproduce
Expected behavior
Should train with 8 tpu cores with no error just like it works in case of a single core.
Environment
- pytorch/xla: nightly
- pytorch-lightning: master
- PyTorch Version (e.g., 1.0): 1.5
- OS (e.g., Linux): Linux
- How you installed PyTorch (conda, pip, source): pip
- Python version: 3.7
Issue Analytics
- State:
- Created 3 years ago
- Comments: 15 (13 by maintainers)

May we add a test for it so we can fix it later?
I am also getting an issue using TPUs on Google Colab. Not sure what to do or how to fix it; I assume it's part of the Lightning package causing these issues.