
Disable warning for No GPU/TPU found

See original GitHub issue

Minor idea: it would be nice to be able to disable this warning, for ease of debugging:

WARNING:absl:No GPU/TPU found, falling back to CPU. (Set TF_CPP_MIN_LOG_LEVEL=0 and rerun for more info.)

The only place I found a reference to this was in this colab: https://colab.research.google.com/github/google/jax/blob/master/docs/notebooks/quickstart.ipynb#scrollTo=ehUS7s8xKkxK

# Execute this to consume & hide the GPU warning.
import jax.numpy as jnp
jnp.arange(10)

That doesn’t work outside of a notebook, though.

I tried export TF_CPP_MIN_LOG_LEVEL=3 in the shell and import os; os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3' in Python, but neither suppresses it. Any other ideas for how to suppress this warning?
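Spelled out, the in-process attempt looks like this (placing the assignment before any jax import is just a guess about when the variable is read):

import os
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'  # set before importing jax, in case the value is only read at import time

import jax.numpy as jnp
jnp.arange(10)  # the warning still shows up here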

Thanks so much!

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 14 (9 by maintainers)

Top GitHub Comments

6 reactions
mattjj commented on May 21, 2021

If you intend to use the CPU, that warning is a bit annoying!

Here are two ways to suppress it:

  1. set the shell environment variable JAX_PLATFORM_NAME=cpu
  2. near the top of your main file, write import jax; jax.config.update('jax_platform_name', 'cpu')

Either of those things will tell JAX that you intend to use the CPU, so it won’t warn about only having a CPU available.
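For concreteness, here is a minimal sketch of option 2 (the script name shown for option 1 is just a placeholder):

import jax
jax.config.update('jax_platform_name', 'cpu')  # declare up front that the CPU is the intended platform

import jax.numpy as jnp
print(jnp.arange(10))  # runs on the CPU without the "No GPU/TPU found" warning

# Option 1 is equivalent and needs no code changes; set the variable in the shell instead:
#   JAX_PLATFORM_NAME=cpu python your_script.py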

3 reactions
bwohlberg commented on Apr 14, 2022

Could you please consider adding a mechanism for suppressing the warning that does not also disable GPU usage? The warning makes it quite painful to write warning-free code for both CPU and GPU platforms, and it’s an absolute nightmare when using jax with ray because the warning gets spewed with every new process.

Or even better, just remove the warning. Given that jax is supported on CPU-only platforms, running without a GPU is not unusual, and it’s not clear that the warning serves any useful purpose.
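A possible stopgap, not from this thread and not an official mechanism: since the message appears to go through the Python logger named 'absl' (as the WARNING:absl: prefix suggests), a logging filter can drop just that record without forcing the CPU platform. This assumes the logger name and the "No GPU/TPU found" text are still accurate for your version:

import logging

class _NoGpuTpuWarning(logging.Filter):
    # Drop only the "No GPU/TPU found" record; every other absl message passes through.
    def filter(self, record):
        return 'No GPU/TPU found' not in record.getMessage()

logging.getLogger('absl').addFilter(_NoGpuTpuWarning())

import jax.numpy as jnp
jnp.arange(10)  # a GPU, if present, is still used; only the warning is filtered out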

Read more comments on GitHub.

Top Results From Across the Web

Trainer — PyTorch Lightning 1.8.5.post0 documentation
Supports passing different accelerator types ("cpu", "gpu", "tpu", "ipu", "auto") as well as custom accelerator instances. # CPU accelerator trainer =...

disable tensorflow warnings and messages - Stack Overflow
I am using tensorflow==2.10.0 and have been trying to remove these messages, which pop up every time I run my Python script.

trax-ml/community - Gitter
I was trying to use trax.fastmath for some vector computation and I got the warning - No GPU/TPU found, falling back to CPU....

Troubleshooting TensorFlow - TPU - Google Cloud
When training a neural network on a CPU, GPU, or TPU, the memory use comes from two places: The memory use is proportional...

Trainer - Hugging Face
The API supports distributed training on multiple GPUs/TPUs, mixed precision ... If the callback is not found, returns None (and no error is...
