There are three separate issues at work here.

The OpenGL error you initially reported was very likely due to the fact that modern GDM releases use Wayland by default, which means that VirtualGL can't connect to the 3D X server while the 3D X server is sitting at the login prompt. (With GDM/Wayland, the 3D X server either isn't running at the login prompt, or it is running on top of Wayland and thus can't be assigned the proper permissions for use with VirtualGL.) VGL 2.6.2 introduced changes to vglserver_config that work around this problem by forcing GDM to use X11 rather than Wayland. Unfortunately, that disables local Wayland logins, but:

Most systems that are configured for use as VirtualGL servers aren't used for local logins. Best practice is to not allow local logins on the 3D X server, because logging in and out locally resets the 3D X server, causing any applications currently running with VirtualGL to abort. If local logins are desired on a VGL server, best practice is to use a dedicated low-end GPU for the local console and a dedicated high-end GPU, configured for headless operation, for VGL.

[nvidia-smi output was attached here; only fragments of the table header survived.]
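The workaround that vglserver_config applies amounts to disabling Wayland in GDM's daemon configuration. As a sketch (the file lives at /etc/gdm/custom.conf on most distributions, /etc/gdm3/custom.conf on Debian/Ubuntu; verify against yours):

```ini
# /etc/gdm/custom.conf  (Debian/Ubuntu: /etc/gdm3/custom.conf)
[daemon]
# Force GDM to start an Xorg session rather than Wayland, so the
# 3D X server exists (and can be given the proper permissions for
# VirtualGL) even while sitting at the login prompt.
WaylandEnable=false
```

After changing this, restart GDM (or reboot) for the setting to take effect.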
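For the dual-GPU layout described above, the headless high-end GPU gets its own X server configuration. A minimal sketch, assuming an NVIDIA GPU (the BusID is a placeholder; obtain the real value from `lspci` or `nvidia-xconfig --query-gpu-info`):

```
# Hypothetical xorg.conf for the headless 3D X server used by VGL.
Section "Device"
    Identifier "VGLDevice"
    Driver     "nvidia"
    # Example bus ID only -- substitute the high-end GPU's actual address.
    BusID      "PCI:129:0:0"
    # Render without a physical monitor attached.
    Option     "UseDisplayDevice" "none"
EndSection

Section "Screen"
    Identifier "VGLScreen"
    Device     "VGLDevice"
    # Required on headless NVIDIA GPUs that report no connected display.
    Option     "AllowEmptyInitialConfiguration" "true"
EndSection
```

The low-end GPU serving the local console is left to the distribution's default configuration, so local logins never touch (or reset) the 3D X server that VGL uses.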