EMAN2 on a remote computer

EMAN uses OpenGL for all of its graphical display windows. This is separate from the X-windows protocol which supports basic remote windowing operations. In many configurations, logging in via ssh and doing remote display will not work properly, due to broken remote OpenGL capabilities. Even if remote EMAN does work perfectly (including OpenGL) at your site, it may not be the best strategy to use for several reasons. There are two sections below. The first is a good strategy for working on clusters without a good GUI option, and the second is on how to make remote OpenGL actually work well. This also applies to users remotely accessing a workstation, not strictly clusters.

Remote GUI Display with OpenGL (TurboVNC/VirtualGL) - slow net connection, travel, etc.

Note: There are also several other software solutions to this problem, some free, some not. Some, such as NX and x2go, have reported issues with OpenGL, such as displaying mirrored images or shifting annotations (like box locations in e2boxer) by 1/2 box size. These are bugs in the underlying remote display systems, and there is no solution for them other than to use a system that isn't buggy. IMHO the solution we describe here is by FAR the best (as of 2021). It provides extremely good interactivity, and can be adjusted to prioritize either speed or display quality as needed.

This strategy can be used to share your work desktop with your laptop when traveling, or, if you don't like the idea above, to display the GUI from a cluster head-node. It requires a little one-time setup, which may require the assistance of a sysadmin, but once set up it works extremely well.

VNC is the standard Linux remote desktop sharing solution (different from remote X sessions). TurboVNC has the advantage of compressing data, so even over a LAN, you generally will get noticeably better performance than by using X-windows over SSH. However, regular VNC doesn't support OpenGL. Luckily the US supercomputing centers put together a solution for this, so people could display OpenGL content rendered on their large computers from across the country.

To do this, you will need:

On the remote computer:

  • VirtualGL
  • TurboVNC (the server component, which provides vncserver)

On the local computer:

  • TurboVNC (the viewer/client component, which provides vncviewer)

Once you have installed these (you will need to follow their instructions to do this, and may need a sysadmin):

On the local machine (the first run of vncserver will prompt you to set a VNC password; this only needs to be done once):

ssh remote.machine.edu
vncserver -geometry 1920x1080

(Make sure you don't run the vncserver command as root, as you may lock yourself out of X sessions.) Log out of the remote machine, then run one of the following on the local machine:

vncviewer -tunnel username@remote.machine.edu:1
vncviewer -via username@remote.machine.edu

This will open a remote desktop. Open a terminal window in the remote desktop. If you wish to be able to use OpenGL programs from the terminal, you then need to run:

vglrun bash
conda activate

Of course you can replace 'bash' with whatever shell you like. From within that new shell you will be able to run OpenGL programs like e2display.py and have them work properly. You should also find that the display is much more interactive than a GUI over a normal remote SSH session.

If the vncviewer command above doesn't work, or you didn't install the client locally, this is an alternative:

ssh username@remote.machine.edu -L5901:localhost:5901

Then use your local VNC client to connect to: vnc://localhost:5901

If you use the TurboVNC client, note that there is a small configuration button at the upper right of the remote display window, which allows you to enable inter-frame compression and adjust the compression level. The default settings are likely fine for any reasonably fast network connection, but if you're on a slow link, lowering the quality can give better interactivity, and if you want to take screenshots, you should probably set the quality close to maximum.

Running EMAN2 GUI tools with a cluster - the rsync strategy

Our recommended strategy for running the EMAN GUI tools on a cluster in this situation is "Don't". Why?

  1. Remote display of graphics-intensive windows using X-windows via SSH is slow and can be resource-intensive on the cluster head-node
  2. Due to the way disks are shared on clusters, if you have a job running and use the GUI in the same project, there are some seemingly innocuous things which can cause jobs to crash and potentially mess other things up. These issues don't generally occur when running locally on a desktop machine.
  3. If something untoward were to happen, you may not have a readily available backup of your project

However, if you are trying to see what's going on at work over a slow network connection from home, or somesuch, the remote OpenGL approach in the section above is better. The approach here is suggested when you have a workstation on a high speed network and are accessing a Linux cluster.

There is a standard Unix tool called 'rsync' which permits you to duplicate an entire tree of folders and files either locally or on a remote machine. The beautiful thing about rsync is that it only copies files which have changed, so it is widely used for making efficient backups, etc.

Our suggested strategy is:

  1. perform all GUI preprocessing work on a local workstation
  2. use the GUI on the workstation to construct the command you need to run on the cluster (the 'Command' tab in the GUI will show this)
  3. rsync the whole project to the cluster (see below)
  4. run your job via the normal queuing system on the cluster
  5. any time you want to check the results with the GUI, rsync the project from the cluster back to your local machine. This can be done safely even while the job is running.
  6. If you need to modify files in the project and send the changes back to the cluster, either make the changes directly on the cluster, or make sure you:
    • wait for the cluster job to finish
    • rsync from the cluster back to the local machine
    • modify the local files as necessary
    • rsync the local files back to the cluster

rsync is pretty easy to use once you get the hang of the options. There are plenty of good resources on the web to learn about it. Consider the following situation: your project is in /home/stevel/myproject on your local machine, and your cluster account is stevel@cluster.bcm.edu with storage space in /raid/stevel.

To rsync the project to the cluster, run this on the local machine:

cd /home/stevel
rsync -avr myproject stevel@cluster.bcm.edu:/raid/stevel

This will create a directory called /raid/stevel/myproject on cluster.bcm.edu. This may take some time the first time you run it, since it is copying everything in the project. There are variants to the command to only copy the necessary folders to run jobs (particles, sets, info), but you must really know what you're doing to try this without causing problems.

To rsync results back to the local machine, run this on the local machine:

cd /home/stevel
rsync -avr stevel@cluster.bcm.edu:/raid/stevel/myproject /home/stevel

EMAN2/Remote (last edited 2023-03-16 15:34:56 by SteveLudtke)