Error on calling nvidia-smi: Command 'ps ...' returned non-zero exit status 1
See original GitHub issue

I got the error message above when I run gpustat, but nvidia-smi works on my machine. Here are some details:
OS: Ubuntu 14.04.5 LTS
Python version: Anaconda 3.6
Error on calling nvidia-smi. Use --debug flag for details
Traceback (most recent call last):
  File "/usr/local/bin/gpustat", line 417, in print_gpustat
    gpu_stats = GPUStatCollection.new_query()
  File "/usr/local/bin/gpustat", line 245, in new_query
    return GPUStatCollection(gpu_list)
  File "/usr/local/bin/gpustat", line 218, in __init__
    self.update_process_information()
  File "/usr/local/bin/gpustat", line 316, in update_process_information
    processes = self.running_processes()
  File "/usr/local/bin/gpustat", line 275, in running_processes
    ','.join(map(str, pid_map.keys()))
  File "/usr/local/bin/gpustat", line 46, in execute_process
    stdout = check_output(command_shell, shell=True).strip()
  File "/home/xiyun/apps/anaconda3/lib/python3.6/subprocess.py", line 336, in check_output
    **kwargs).stdout
  File "/home/xiyun/apps/anaconda3/lib/python3.6/subprocess.py", line 418, in run
    output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command 'ps -o pid,user:16,comm -p1 -p 14471' returned non-zero exit status 1.
How can I fix this?
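For context, the traceback shows that check_output raises CalledProcessError because ps exits with status 1, which happens when one or more of the queried PIDs no longer exists by the time ps runs. A minimal sketch of a more tolerant wrapper (the function name query_process_info is hypothetical, not gpustat's actual API) that captures whatever ps does print instead of raising:

```python
import os
import subprocess

def query_process_info(pids):
    """Run `ps` for the given PIDs, tolerating a non-zero exit status.

    `ps` exits 1 when some of the requested PIDs do not exist, which
    would make subprocess.check_output raise CalledProcessError even
    though the remaining PIDs were reported successfully.
    """
    cmd = ["ps", "-o", "pid,user:16,comm"]
    for pid in pids:
        cmd += ["-p", str(pid)]
    # No check=True: keep whatever ps printed instead of raising.
    result = subprocess.run(cmd, stdout=subprocess.PIPE,
                            stderr=subprocess.DEVNULL)
    return result.stdout.decode().strip()

# Example: the current process always exists; 999999 likely does not,
# so plain check_output on the same command would raise here.
print(query_process_info([os.getpid(), 999999]))
```

Passing the command as a list also avoids shell=True, which the failing gpustat version used.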
Issue Analytics
- State:
- Created 6 years ago
- Reactions: 1
- Comments: 20 (10 by maintainers)

Sorry about that, ~V~. Please run gpustat --debug for details.
Released as v0.4.1.