
GCP tutorial suggests using T4 GPU to save costs, but fails when using T4 GPU

See original GitHub issue

Update: GCP tutorial suggests using T4 GPU to save costs, but fails when using T4 GPU (error below)


Hi, I am following the tutorial Training an Obstacle Tower agent using Dopamine and the Google Cloud Platform.

I am getting the following error. I believe the problem is (EE) NVIDIA(GPU-0): UseDisplayDevice "None" is not supported with GRID, but I'm not sure of the root cause.

I was trying to use the T4 GPU to save money; I will try again with the default GPU.


After typing

sudo /usr/bin/X :0 &   # start an X server on display :0 in the background
export DISPLAY=:0      # point subsequent GL/GLX clients at that display

I get this error:

X.Org X Server 1.19.2
Release Date: 2017-03-02
X Protocol Version 11, Revision 0
Build Operating System: Linux 4.9.0-8-amd64 x86_64 Debian
Current Operating System: Linux tensorflow-1-vm 4.9.0-8-amd64 #1 SMP Debian 4.9.130-2 (2018-10-27) x86_64
Kernel command line: BOOT_IMAGE=/boot/vmlinuz-4.9.0-8-amd64 root=UUID=995b3d50-0ab0-4faa-8296-ab743ab0fde7 ro net.ifnames=0 biosdevname=0 console=ttyS0,38400n8 elevator=noop scsi_mod.use_blk_mq=Y
Build Date: 03 November 2018  03:09:11AM
xorg-server 2:1.19.2-1+deb9u5 (https://www.debian.org/support) 
Current version of pixman: 0.34.0
	Before reporting problems, check http://wiki.x.org
	to make sure that you have the latest version.
Markers: (--) probed, (**) from config file, (==) default setting,
	(++) from command line, (!!) notice, (II) informational,
	(WW) warning, (EE) error, (NI) not implemented, (??) unknown.
(==) Log file: "/var/log/Xorg.0.log", Time: Thu Feb 14 01:06:15 2019
(==) Using config file: "/etc/X11/xorg.conf"
(==) Using system config directory "/usr/share/X11/xorg.conf.d"
(EE) 
Fatal server error:
(EE) no screens found(EE) 

/var/log/Xorg.0.log

[   385.871] (II) Module "ramdac" already built-in
[   385.877] (**) NVIDIA(0): Depth 24, (--) framebuffer bpp 32
[   385.877] (==) NVIDIA(0): RGB weight 888
[   385.877] (==) NVIDIA(0): Default visual is TrueColor
[   385.877] (==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
[   385.877] (**) NVIDIA(0): Option "UseDisplayDevice" "None"
[   385.877] (**) NVIDIA(0): Enabling 2D acceleration
[   385.877] (**) NVIDIA(0): Option "UseDisplayDevice" set to "none"; enabling NoScanout
[   385.877] (**) NVIDIA(0):     mode
[   385.877] (II) Loading sub module "glxserver_nvidia"
[   385.877] (II) LoadModule: "glxserver_nvidia"
[   385.877] (II) Loading /usr/lib/xorg/modules/extensions/libglxserver_nvidia.so
[   385.882] (II) Module glxserver_nvidia: vendor="NVIDIA Corporation"
[   385.882]    compiled for 4.0.2, module version = 1.0.0
[   385.882]    Module class: X.Org Server Extension
[   385.882] (II) NVIDIA GLX Module  410.72  Wed Oct 17 20:11:21 CDT 2018
[   386.482] (EE) NVIDIA(GPU-0): UseDisplayDevice "None" is not supported with GRID
[   386.482] (EE) NVIDIA(GPU-0):     displayless
[   386.482] (EE) NVIDIA(GPU-0): Failed to select a display subsystem.
[   386.563] (EE) NVIDIA(0): Failing initialization of X screen 0
[   386.563] (II) UnloadModule: "nvidia"
[   386.563] (II) UnloadSubModule: "glxserver_nvidia"
[   386.563] (II) Unloading glxserver_nvidia
[   386.563] (II) UnloadSubModule: "wfb"
[   386.563] (II) UnloadSubModule: "fb"
[   386.563] (EE) Screen(s) found, but none have a usable configuration.
[   386.563] (EE)
Fatal server error:
[   386.563] (EE) no screens found(EE)
[   386.563] (EE)
Please consult the The X.Org Foundation support
         at http://wiki.x.org
 for help.
[   386.563] (EE) Please also check the log file at "/var/log/Xorg.0.log" for additional information.
[   386.563] (EE)
[   386.564] (EE) Server terminated with error (1). Closing log file.

Issue Analytics

  • State: open
  • Created: 5 years ago
  • Comments: 10 (2 by maintainers)

Top GitHub Comments

12 reactions
zhenghongzhi commented, Feb 28, 2019

I found a solution that works for me:

Delete or comment out (with "#") the ServerLayout and Screen sections in the /etc/X11/xorg.conf file.
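
For reference, the edit might look roughly like the sketch below. This assumes an xorg.conf generated by nvidia-xconfig with the default identifiers (Layout0, Screen0, Device0); the exact contents of your file will differ, so comment out whatever ServerLayout and Screen sections are actually present.

# Section "ServerLayout"
#     Identifier     "Layout0"
#     Screen      0  "Screen0" 0 0
# EndSection
#
# Section "Screen"
#     Identifier     "Screen0"
#     Device         "Device0"
#     Option         "UseDisplayDevice" "None"
#     SubSection     "Display"
#         Depth       24
#     EndSubSection
# EndSection

With those sections commented out, the X server autodetects a screen instead of applying the UseDisplayDevice "None" option that the GRID driver rejects.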

6 reactions
juge2 commented, Jul 16, 2019

For me, only removing Option "UseDisplayDevice" "none" from the "Screen" section also does the trick.
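
In other words, instead of commenting out the whole section, you can keep the Screen section and delete just that one option. A sketch of what that might look like (again assuming the identifiers nvidia-xconfig typically generates, not the exact contents of the tutorial's file):

Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    Monitor        "Monitor0"
    DefaultDepth    24
    # Option       "UseDisplayDevice" "None"    <- remove or comment out this line
    SubSection     "Display"
        Depth       24
    EndSubSection
EndSection

After editing, stop any running X server (e.g. sudo pkill Xorg) and start it again with sudo /usr/bin/X :0 & followed by export DISPLAY=:0 to check that the screen now initializes.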

Read more comments on GitHub >

Top Results From Across the Web

Optimizing GPU performance - Compute Engine
You can use the following options to improve the performance of GPUs on virtual machine (VM) instances : On VMs that use NVIDIA...
Read more >
NGC Catalog User Guide
This user guide details how to navigate the NGC Catalog and ... Supported GPUs include H100, V100, A100, T4, Jetson, and the RTX...
Read more >
GCP
If you have a tight budget you might want to go with a cheaper setup. In this case, we suggest a n1-highmem-4 instance...
Read more >
How to Run a Stable Diffusion Server on Google Cloud ...
Select NVIDIA Tesla T4 — this is the cheapest GPU and it does the job (it has ... cost for a VM with...
Read more >
500 Hours of Free, 4K 60 FPS Cloud Gaming | by John Ragone
The first thing you'll want to do with your new account is increase some quotas. The machine we will provision requires an NVIDIA...
Read more >
