
UserWarning: torch.cuda.amp.GradScaler is enabled, but CUDA is not available. Disabling.

See original GitHub issue
C:\Users\balha\AppData\Roaming\Python\Python39\site-packages\deep_daze\clip.py:38: UserWarning: C:\Users\balha/.cache/clip\ViT-B-32.pt exists, but the SHA256 checksum does not match; re-downloading the file
  warnings.warn(f"{download_target} exists, but the SHA256 checksum does not match; re-downloading the file")
Downloading ViT-B-32.pt: 100%|██████████| 354M/354M [18:22<00:00, 321kiB/s]
C:\Users\balha\AppData\Roaming\Python\Python39\site-packages\torch\cuda\amp\grad_scaler.py:116: UserWarning: torch.cuda.amp.GradScaler is enabled, but CUDA is not available.  Disabling.
  warnings.warn("torch.cuda.amp.GradScaler is enabled, but CUDA is not available.  Disabling.")

I'm getting this error. How do I overcome it? I've got an i3 with integrated graphics. I may sound like a fool, but I just want to test it out...
:/

Issue Analytics

  • State: open
  • Created: 2 years ago
  • Comments: 7 (1 by maintainers)

Top GitHub Comments

2 reactions
TaylorBurnham commented, May 7, 2021

@69Vishal Check whether you have a CUDA device available for PyTorch to use. Open up a Python terminal and try this, or run the one-liner further down.

import torch
x = torch.cuda.get_device_name(0) if torch.cuda.is_available() else None
print(x)

One liner:

python -c "import torch; x = (torch.cuda.get_device_name(0) if torch.cuda.is_available() else None); print(x)"

This is the output for my Windows workstation for reference.

> python -c "import torch; x = (torch.cuda.get_device_name(0) if torch.cuda.is_available() else None); print(x)"
NVIDIA GeForce RTX 2070

If you have a CUDA device available it should print the device's name; otherwise PyTorch isn't seeing one. Depending on your OS, CUDA support may require special drivers that are not part of the standard packages and must be installed separately.

You can learn more by looking up your device name and "CUDA drivers"; in my case I'd google "RTX 2070 CUDA drivers."
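As a side note on the warning itself: `GradScaler` accepts an `enabled` flag, so training code you control can construct the scaler conditionally and avoid the warning on CPU-only machines. A minimal sketch of that general pattern (deep_daze builds its scaler internally, so this is not a deep_daze-specific fix):

```python
import torch
from torch.cuda.amp import GradScaler

# Enable mixed-precision gradient scaling only when a CUDA device is present;
# on a CPU-only machine the scaler is constructed disabled, so no warning fires.
scaler = GradScaler(enabled=torch.cuda.is_available())
print("scaler enabled:", scaler.is_enabled())
```

With `enabled=False` the scaler's `scale()`, `step()`, and `update()` calls become no-op pass-throughs, so the same training loop runs unchanged on CPU.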

0 reactions
adiv5 commented, Dec 2, 2021

The issue in my case was that my machine's CUDA Toolkit was version 10.1, but my PyTorch installation was built for CUDA Toolkit 11.3. Installing PyTorch built for CUDA 10.1 solved the issue.
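To spot the mismatch adiv5 describes, print `torch.version.cuda` (the CUDA version your PyTorch build expects; it is `None` for CPU-only builds) and compare it with the toolkit version reported by `nvcc --version`. A hedged sketch of that comparison (`cuda_versions_match` is a hypothetical helper for illustration, not a PyTorch API):

```python
def cuda_versions_match(built: str, installed: str) -> bool:
    """Compare major.minor of two CUDA version strings, e.g. '11.3' vs '10.1'."""
    major_minor = lambda v: tuple(int(p) for p in v.split(".")[:2])
    return major_minor(built) == major_minor(installed)

# The mismatch described in this comment: PyTorch built for 11.3, toolkit is 10.1
print(cuda_versions_match("11.3", "10.1"))  # False
# After reinstalling PyTorch built for CUDA 10.1
print(cuda_versions_match("10.1", "10.1"))  # True
```

In a real check you would pass `torch.version.cuda` as the first argument and the `nvcc` / `nvidia-smi` toolkit version as the second.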

Read more comments on GitHub >

Top Results From Across the Web

Pytorch says that CUDA is not available (on Ubuntu)
PyTorch doesn't use the system's CUDA library. When you install PyTorch using the precompiled binaries using either pip or conda it is ...
Read more >
Automatic Mixed Precision package - torch.amp - PyTorch
torch.amp provides convenience methods for mixed precision, ... GradScaler together, as shown in the CUDA Automatic Mixed Precision examples and CUDA ...
Read more >
Ray Train doesn't detect GPU
GradScaler is enabled, but CUDA is not available. Disabling. ... -packages/torch/cuda/amp/grad_scaler.py:116: UserWarning: torch.cuda.amp.
Read more >
User provided device_type of 'cuda', but CUDA is not ...
Hi, while training the model, I constantly get this warning: UserWarning: User provided device_type of 'cuda', but CUDA is unavailable. Disabling ...
Read more >
Four Shapes Classify Fasiai - Kaggle
Disabling ') /opt/conda/lib/python3.7/site-packages/torch/cuda/amp/grad_scaler.py:115: UserWarning: torch.cuda.amp.GradScaler is enabled, but CUDA is not ...
Read more >
