Using X11 forwarding on a compute node (or job)

It seems Slurm does not support X11 forwarding via the --x11 option to salloc or srun on this cluster. I also tried to ssh into a job's node with the -X option to enable X11 forwarding, but that fails with an "X11 forwarding request failed on channel 0" error.
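For reference, these are roughly the commands I tried (the node name is a placeholder):

```shell
# Attempt 1: request X11 forwarding through Slurm
# (rejected; --x11 appears unsupported in this build)
salloc --x11
srun --x11 --pty bash

# Attempt 2: ssh into the allocated node with X11 forwarding
# (fails with "X11 forwarding request failed on channel 0")
ssh -X node001   # "node001" is a placeholder node name
echo $DISPLAY    # empty on the node, so GUI apps cannot open
```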

Does anyone here run GUI apps through an X server, or is there a guide for this?

We do not enable X11 forwarding to compute nodes. To use interactive GUI apps on compute nodes, we provide the OnDemand service instead.

What GUI app are you trying to use? You may be able to use the GUI Terminal app to load and launch it. If it requires a special setup, then we could add a custom app for it.

Thank you for the response!

I am using the CARLA simulator, which is Unreal Engine-based and installed locally in my home directory. I have been using it in offscreen rendering mode, where it runs a simulation without visualizing it, but sometimes I need to visually check the simulation results.

How can I run a GUI app in the GUI Terminal app you mentioned? I tried launching xterm on the login server, running salloc, and launching the app, but it was not successful. Once I get onto a node, the DISPLAY environment variable is not set.

From the OnDemand dashboard, go to Interactive Apps > Terminal, fill out the form, and submit the job. Once the job is ready, click Launch Terminal. When the windowing environment pops up with a terminal app, try running the software's launch command from within that terminal, using on-screen mode in this case.
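For example, assuming a typical packaged CARLA install under your home directory (the path here is hypothetical; adjust it to your actual install location), the launch from within the OnDemand terminal would look something like:

```shell
# Hypothetical install path; adjust to where CARLA actually lives
cd ~/carla

# The packaged release ships a CarlaUE4.sh launcher script;
# with no offscreen flags it renders on-screen by default
./CarlaUE4.sh
```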

Thank you for the guide. I was able to run the GUI Terminal app on OnDemand.

However, I still couldn't launch the simulator, since the singularity command, which I use to set up the simulator's execution environment, is unavailable for some reason. When I access the node directly via ssh rather than through OnDemand, the singularity command works.

I tried dumping the container file (a SIF file), extracting (almost) all of the simulator's dependent libraries, and launching it after setting LD_LIBRARY_PATH to link against the extracted libraries, but it wasn't successful. Also, I could not load Singularity as a module with `module load singularity`.
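Roughly, what I tried looked like this (paths and file names are examples from my setup):

```shell
# Unpack the SIF image into a sandbox directory to expose its contents
# (run on a node where the singularity command is available)
singularity build --sandbox carla_rootfs/ carla.sif

# Copy (almost) all the shared libraries the simulator depends on
mkdir -p ~/carla_libs
cp carla_rootfs/usr/lib/x86_64-linux-gnu/*.so* ~/carla_libs/

# Point the dynamic linker at the extracted libraries and launch
export LD_LIBRARY_PATH=~/carla_libs:$LD_LIBRARY_PATH
./CarlaUE4.sh    # still fails in my case
```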

Is there anything I can try?

The OnDemand app itself runs inside a Singularity container, which is why the singularity command is not available there.

I see in the CARLA docs that the latest version of the simulator requires a GPU for rendering, as well as Vulkan drivers. We don't have Vulkan drivers installed on our GPUs, and the GPUs are also not configured for display rendering.

We have a VM service available that you could use as a remote desktop and connect to your HPC directories, but GPUs are not available for that service.

If you are able to use an older version of CARLA configured with OpenGL, it may work via Mesa software rendering, but it is perhaps not worth the effort.
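If you do try that route, a sketch would be the following (untested here; the flag's availability depends on your CARLA version, so check its docs):

```shell
# Ask Mesa to use its software (llvmpipe) renderer instead of
# a hardware GPU driver
export LIBGL_ALWAYS_SOFTWARE=1

# Older CARLA releases on Linux accepted an -opengl flag to render
# with OpenGL instead of Vulkan; newer versions dropped it
./CarlaUE4.sh -opengl
```

Expect this to be very slow, since every frame is rasterized on the CPU.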

I see. Thank you for all your help!