Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

No GPU/TPU found, falling back to CPU

See original GitHub issue

Here’s the full warning that I get (I do have a GPU):

>>> import diffrax
2022-03-24 16:30:19.350737: I external/org_tensorflow/tensorflow/compiler/xla/service/service.cc:170] XLA service 0x55795c0d4670 initialized for platform Interpreter (this does not guarantee that XLA will be used). Devices:
2022-03-24 16:30:19.350761: I external/org_tensorflow/tensorflow/compiler/xla/service/service.cc:178]   StreamExecutor device (0): Interpreter, <undefined>
2022-03-24 16:30:19.353414: I external/org_tensorflow/tensorflow/compiler/xla/pjrt/tfrt_cpu_pjrt_client.cc:169] TfrtCpuClient created.
2022-03-24 16:30:19.353886: I external/org_tensorflow/tensorflow/stream_executor/tpu/tpu_platform_interface.cc:74] No TPU platform found.
WARNING:absl:No GPU/TPU found, falling back to CPU. (Set TF_CPP_MIN_LOG_LEVEL=0 and rerun for more info.)

Edit: I installed diffrax from conda-forge.
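
A quick way to confirm which backend JAX has actually picked up is to list its devices. The session below is a minimal sketch; exact device names and output vary by machine and jax version, but a CPU-only fallback looks roughly like this:

>>> import jax
>>> jax.default_backend()   # 'cpu', 'gpu', or 'tpu'
'cpu'
>>> jax.devices()
[CpuDevice(id=0)]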

Issue Analytics

  • State: closed
  • Created: a year ago
  • Comments: 11 (2 by maintainers)

Top GitHub Comments

1 reaction
patrick-kidger commented, Mar 26, 2022

I’m going to close this issue as it’s JAX-related rather than Diffrax-related, but feel free to keep discussing things afterwards.

1 reaction
ma-sadeghi commented, Mar 24, 2022

@dhirschfeld @patrick-kidger Thank you guys for your quick reply! Indeed, it was a JAX issue. I created a fresh environment, installed jax and diffrax, also had to install cudatoolkit-dev from conda-forge, and finally I was able to get it working on the GPU.
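
For reference, those steps might look roughly like the following shell session. The environment name and the exact jaxlib wheel spec are assumptions (the version pin is taken from the package listing below), and the wheel index URL is the one JAX documented at the time, so treat this as a sketch rather than the exact commands used:

# fresh environment (name assumed) with the conda-forge packages mentioned above
mamba create -n sci -c conda-forge python=3.9 jax diffrax cudatoolkit-dev
mamba activate sci
# the CUDA-enabled jaxlib in the listing below came from PyPI, e.g.:
pip install "jaxlib==0.3.2+cuda11.cudnn82" -f https://storage.googleapis.com/jax-releases/jax_releases.html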

mamba list jax

# Name                    Version                   Build  Channel
jax                       0.3.4              pyhd8ed1ab_0    conda-forge
jaxlib                    0.3.2+cuda11.cudnn82          pypi_0    pypi
mamba list diffrax

# Name                    Version                   Build  Channel
diffrax                   0.0.5              pyhd8ed1ab_0    conda-forge
                  __    __    __    __
                 /  \  /  \  /  \  /  \
                /    \/    \/    \/    \
β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ/  /β–ˆβ–ˆ/  /β–ˆβ–ˆ/  /β–ˆβ–ˆ/  /β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ
              /  / \   / \   / \   / \  \____
             /  /   \_/   \_/   \_/   \    o \__,
            / _/                       \_____/  `
            |/
        β–ˆβ–ˆβ–ˆβ•—   β–ˆβ–ˆβ–ˆβ•— β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ•— β–ˆβ–ˆβ–ˆβ•—   β–ˆβ–ˆβ–ˆβ•—β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ•—  β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ•—
        β–ˆβ–ˆβ–ˆβ–ˆβ•— β–ˆβ–ˆβ–ˆβ–ˆβ•‘β–ˆβ–ˆβ•”β•β•β–ˆβ–ˆβ•—β–ˆβ–ˆβ–ˆβ–ˆβ•— β–ˆβ–ˆβ–ˆβ–ˆβ•‘β–ˆβ–ˆβ•”β•β•β–ˆβ–ˆβ•—β–ˆβ–ˆβ•”β•β•β–ˆβ–ˆβ•—
        β–ˆβ–ˆβ•”β–ˆβ–ˆβ–ˆβ–ˆβ•”β–ˆβ–ˆβ•‘β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ•‘β–ˆβ–ˆβ•”β–ˆβ–ˆβ–ˆβ–ˆβ•”β–ˆβ–ˆβ•‘β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ•”β•β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ•‘
        β–ˆβ–ˆβ•‘β•šβ–ˆβ–ˆβ•”β•β–ˆβ–ˆβ•‘β–ˆβ–ˆβ•”β•β•β–ˆβ–ˆβ•‘β–ˆβ–ˆβ•‘β•šβ–ˆβ–ˆβ•”β•β–ˆβ–ˆβ•‘β–ˆβ–ˆβ•”β•β•β–ˆβ–ˆβ•—β–ˆβ–ˆβ•”β•β•β–ˆβ–ˆβ•‘
        β–ˆβ–ˆβ•‘ β•šβ•β• β–ˆβ–ˆβ•‘β–ˆβ–ˆβ•‘  β–ˆβ–ˆβ•‘β–ˆβ–ˆβ•‘ β•šβ•β• β–ˆβ–ˆβ•‘β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ•”β•β–ˆβ–ˆβ•‘  β–ˆβ–ˆβ•‘
        β•šβ•β•     β•šβ•β•β•šβ•β•  β•šβ•β•β•šβ•β•     β•šβ•β•β•šβ•β•β•β•β•β• β•šβ•β•  β•šβ•β•

        mamba (0.19.1) supported by @QuantStack

        GitHub:  https://github.com/mamba-org/mamba
        Twitter: https://twitter.com/QuantStack

β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ


     active environment : sci
    active env location : /home/amin/mambaforge/envs/sci
            shell level : 2
       user config file : /home/amin/.condarc
 populated config files : /home/amin/mambaforge/.condarc
                          /home/amin/.condarc
          conda version : 4.11.0
    conda-build version : not installed
         python version : 3.9.6.final.0
       virtual packages : __cuda=11.4=0
                          __linux=5.13.0=0
                          __glibc=2.31=0
                          __unix=0=0
                          __archspec=1=x86_64
       base environment : /home/amin/mambaforge  (writable)
      conda av data dir : /home/amin/mambaforge/etc/conda
  conda av metadata url : None
           channel URLs : https://conda.anaconda.org/conda-forge/linux-64
                          https://conda.anaconda.org/conda-forge/noarch
          package cache : /home/amin/mambaforge/pkgs
                          /home/amin/.conda/pkgs
       envs directories : /home/amin/mambaforge/envs
                          /home/amin/.conda/envs
               platform : linux-64
             user-agent : conda/4.11.0 requests/2.26.0 CPython/3.9.6 Linux/5.13.0-35-generic ubuntu/20.04.4 glibc/2.31
                UID:GID : 1000:1000
             netrc file : None
           offline mode : False
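
With the CUDA-enabled jaxlib installed, the same device check from above should report the GPU backend rather than falling back to CPU. Roughly (the device repr varies across jax versions):

>>> import jax
>>> jax.default_backend()
'gpu'
>>> jax.devices()
[GpuDevice(id=0, process_index=0)]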
Read more comments on GitHub >

Top Results From Across the Web

WARNING - No GPU/TPU found, falling back to CPU. (Set ...
I am pretty sure that the box I am connecting to via ssh has 20 CPU cores and 2 GPU cores however the...
Read more >
Troubleshooting JAX - TPU - Google Cloud
Everything will be run on the TPU as long as JAX doesn't print "No GPU/TPU found, falling back to CPU." You can verify...
Read more >
Why is the TPU not recognized on my Google Cloud TPU VM ...
Am having the same problem, am getting TPU platform initialization failed: NOT_FOUND: No ba16c7433 device found. – Chris Flesher. Sep 19 at 11:...
Read more >
Code 10: Probabilistic Programming Languages
WARNING:absl:No GPU/TPU found, falling back to CPU. (Set TF_CPP_MIN_LOG_LEVEL=0 and rerun for more info.)
Read more >
trax-ml/community - Gitter
I was trying to use trax.fastmath for some vector computation and I got the warning - No GPU/TPU found, falling back to CPU....
Read more >
