Everyone has heard of Docker and the container revolution it has sparked. Containerizing server applications with Docker has changed how software is deployed in the enterprise, bringing speed and consistency regardless of the underlying distribution or architecture. Few people, however, are looking at how Docker can improve the desktop experience as much as it has server deployments. Let’s take a look at some Docker desktop containers for popular applications and why we might use them.
Containerize All the Things
Why in the world would you want to run your desktop GUI applications in a Docker container (and can it even be done)? The answer is simple: abstraction. On the desktop, you might think of abstraction more as a “sandbox.” By encapsulating an application in a Docker container, I isolate it from the rest of my desktop environment. I can also provide dependencies that conflict with the latest and greatest patches available for my distribution, including outdated dependencies that could have functional or security implications if I installed them for the entire operating system.
Additionally, there are occasions where I may want or need to run an application that, for a variety of reasons, only runs on one specific distribution, or on an earlier (or later) version of my preferred distribution. The days of keeping full VMs or spare systems around for those cases are gone. Containers give me a way to run an application at a specific version and keep it completely isolated from everything else on my system. Finally, that isolation lets me upgrade the application without affecting the underlying desktop host, or remove it entirely without leaving package dependencies, broken links or stray configuration files anywhere on the system. A quick ‘docker rmi imagename’ and *POOF*, it is like it never existed, and I am not at risk of breaking other applications or packages that depended on anything it provided.
Simple Example – Lynx Web Browser
Let’s take a look at the “lynx” console web browser. All we need to do is pick a distribution for the base image, install the application with the minimum necessary components, and make sure the application itself runs when the container starts. Create a Dockerfile that looks something like this:
# Lynx Container
#
# Example Start Command:
# docker run --rm -it --name MyLynx example/mylynx github.com/example
#
FROM debian
MAINTAINER Your Name
RUN apt-get update && apt-get install -y lynx --no-install-recommends
RUN rm -rf /var/lib/apt/lists/*
ENTRYPOINT [ "lynx" ]
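With that Dockerfile saved in an otherwise empty directory, building and running the image is a two-step process. This is a minimal sketch that assumes a working Docker daemon; the `example/mylynx` tag and the URL argument are just illustrations, so substitute your own:

```shell
# Build the image from the directory containing the Dockerfile
docker build -t example/mylynx .

# Run lynx in a throwaway container; --rm deletes the container on exit,
# and any argument after the image name is passed straight to lynx
docker run --rm -it --name MyLynx example/mylynx github.com/example

# Remove the image entirely when you no longer want it
docker rmi example/mylynx
```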
We then end up with a locally built base image from which we can instantiate containers that run “lynx” whenever we call for it. The advantage here is that anything we do inside the container is ephemeral: once the container exits, it is all gone. No cookies, no traces, no history. We created a container for a specific task and then left nothing to clutter the filesystem afterwards.
GUI Applications as Well
Can we run desktop GUI applications through a Docker container? Glad you asked! We absolutely can; we just have to make sure that when we instantiate the container (assuming our current desktop is running a full-blown desktop environment like KDE or GNOME), we pass some additional information to the startup command. Let’s take a look at what a Dockerfile would look like for running the Mesa Utilities “GLXGears” test application:
# Mesa Utilities GLX Gears
#
# Example Start Command:
# docker run --rm -v /tmp/.X11-unix:/tmp/.X11-unix -e DISPLAY=unix$DISPLAY --device /dev/dri example/glxgears
#
FROM debian
MAINTAINER Your Name
RUN apt-get update && apt-get install -y mesa-utils --no-install-recommends
RUN rm -rf /var/lib/apt/lists/*
ENV LIBGL_DEBUG verbose
ENTRYPOINT [ "glxgears" ]
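One wrinkle worth noting: before the X11 socket pass-through works, your X server has to accept connections from the container. A common (if coarse-grained) approach is xhost. A sketch, assuming a local X session and a working Docker daemon (the `example/glxgears` tag is just an illustration):

```shell
# Allow local, non-network connections to the X server
# (coarse-grained; revoke it again when you are done)
xhost +local:

# Build the image from the directory containing the Dockerfile
docker build -t example/glxgears .

# Run it, passing through the X11 socket, the DISPLAY variable,
# and the direct rendering device for 3D acceleration
docker run --rm \
  -v /tmp/.X11-unix:/tmp/.X11-unix \
  -e DISPLAY=unix$DISPLAY \
  --device /dev/dri \
  example/glxgears

# Revoke the X server access again afterwards
xhost -local:
```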
There! That wasn’t so hard and, again, we leave nothing on the system once the container has exited. We pass through just enough information that the container can access the devices and files it needs to display its application on our screen. We even set an environment variable to produce debugging output in the event there is an issue (which there may be if you are running a non-3D graphics card). The point is, now that you know how, you can create an image for any application you want (Chrome, Skype, gedit, Audacity, etc.). Then, when an upgrade comes out, you upgrade the base image without having to upgrade your entire desktop environment (which you may not want, or be able, to do). This makes trying out the latest and greatest (or sticking with something old and trusted) possible without breaking everything else you know and love.
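That upgrade path is worth spelling out. Because the application lives in an image, upgrading it is just a rebuild that pulls a fresh base, and cleanup is removing the layers the rebuild orphaned. A sketch, again assuming a working Docker daemon and the hypothetical `example/glxgears` tag from above:

```shell
# Rebuild, forcing a fresh pull of the debian base image so we pick
# up its latest packages; the host system is untouched
docker build --pull -t example/glxgears .

# Remove the dangling (now-untagged) image layers left behind
docker rmi $(docker images -q -f dangling=true)
```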
Sometimes it amazes me how a little creativity in Linux lets you apply a tool to a problem it was never designed for. Containers (and Docker specifically) have allowed me to get rid of dozens of VMs spanning five or six distributions and versions, each kept around for one application, utility or tool or another. Taking advantage of the power of Docker on my desktop has been liberating! Leave some comments on the more creative ways that you have used Docker on your desktop.