MODELLING AND SIMULATION, WEB ENGINEERING, USER INTERFACES
August 4th, 2012

Thinkpad W520 Multi-Monitor nVidia Optimus with Bumblebee on Ubuntu 12.04

Last night I decided to upgrade from Ubuntu 11.10 to 12.04 on my Thinkpad W520. The main reason for this was that my current setup was making suboptimal use of the hardware, and due to recent advances which I found documented in several blog posts, it seemed I could improve this situation. The goal of this post is to document what I hoped to achieve and how I arrived there, so that in the future I’ll be able to remember what the heck I did to set this all up.

Project Goals

I purchased the Thinkpad W520 back in November, because my fanless Mini Inspiron netbook kept overheating when I left it to run performance benchmarks related to my research. Ubuntu 11.10 worked pretty well on the W520 out of the box, but there were two major outstanding compatibility issues.

First, the W520 comes with nVidia Optimus graphics. In this setup, the laptop has a discrete nVidia card and an integrated Intel graphics chip, and the operating system is able to enable and disable the nVidia card in software in order to save power. nVidia has explicitly stated that they will not support Optimus on Linux, which six months ago meant that Linux users had only two options: enable only the discrete nVidia graphics, or only the integrated Intel graphics, in the BIOS.

When the Intel graphics were enabled in the BIOS, the open source Intel integrated graphics drivers worked like a dream – 3D acceleration, flawless suspend/resume support, and everything was just a superb, rock-solid experience. The battery life was also excellent. For someone like me who mostly uses the laptop to write software and does not care about 3D acceleration, this would have been an ideal choice, except for one major flaw: the external display ports on the W520 (VGA and DisplayPort) are hardwired to the nVidia card, so using an external monitor is impossible when only the Intel graphics are enabled in the BIOS. I use an external monitor at home, so this meant Intel graphics were a nonstarter for me.

As it was not possible to use Optimus or Intel graphics, this left me with only one choice, which was to use the nVidia graphics. This process went something like this:

  1. Tried the nouveau driver. This worked pretty well, but would hang X on suspend/resume. Solid suspend/resume support is a must-have, so I eliminated this option.
  2. Tried to install the nVidia binary driver in Ubuntu using the nice graphical interface (jockey-gtk). Ultimately, this did not work, so I uninstalled the binary driver using jockey-gtk.
  3. Tried to install the nVidia binary driver by running the Linux installer shell script from nVidia. This felt evil, because you have no idea what the script is doing to your system, but everything installed correctly, and after a reboot, the laptop finally had working graphics.

The binary nVidia drivers were pretty solid: 3D acceleration, multi-monitor support, suspend/resume, and VDPAU video acceleration all worked great. My laptop had an uptime of several months under this configuration. Unfortunately, however, battery life was pretty poor, clocking in at about 3 hours.

Furthermore, and more seriously, the laptop firmware has a bug where Linux would hang at boot when both VT-x (Intel hardware virtualization technology) and nVidia graphics were enabled in the BIOS. This was pretty annoying, as I tend to run Windows in a VM in VirtualBox on Linux for testing compatibility with different versions of Internet Explorer. I believe this bug is now being tracked by Linux kernel developers, who are working around the issue by disabling X2APIC on boot, but Lenovo has refused to fix this bug or even acknowledge its existence. Not cool, Lenovo.

This meant it was not possible to have working multi-head support together with reasonable battery life and VT-x support. Not optimal.

Bumblebee

Bumblebee is a project to bring support for nVidia Optimus to Linux. It basically renders on the nVidia card through a second X server, then passes the framebuffer to the Intel card, which puts it on the screen. Apparently, this is pretty much how Optimus works on Windows as well.

The advantage to using Bumblebee is that, theoretically, you would be able to have the excellent battery life of the Intel graphics, but also have 3D acceleration and multi-monitor support from the nVidia graphics.

I tried Bumblebee six months ago, but was unable to get it to work. The project had also been forked around that time, and it wasn’t clear which fork to follow.

However, the following blog posts led me to believe that the situation had changed, and a multi-monitor setup could be achieved using Ubuntu 12.04 and Bumblebee:

I decided to see if I could get this to work myself, and ultimately I was successful. My current setup is now as follows:

  • Ubuntu 12.04 x64
  • Optimus Graphics and VT-x enabled in BIOS
  • External monitor, which can be enabled or disabled on-demand, and works consistently after suspend/resume
  • Bumblebee set to use automatic power switching, so the nVidia card is disabled when not in use.
  • Xmonad and Unity2D desktop environment

The remainder of the blog post documents the process I went through in order to obtain this optimal setup.

Multi-Monitor Support with Optimus and Bumblebee on Ubuntu 12.04

I primarily followed the process described on Sagar Karandikar’s blog, up to, but not including, his changes to /etc/bumblebee/bumblebee.conf.

Sagar says to set the following parameters in /etc/bumblebee/bumblebee.conf:
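The gist is something like this (a sketch reconstructed from the parameter names discussed in this post; the section layout follows the stock Ubuntu 12.04 bumblebee.conf and may differ slightly in other versions):

```
# /etc/bumblebee/bumblebee.conf — Sagar's always-on variant
[bumblebeed]
# Keep the Bumblebee X server on :8 running even when no optirun client uses it
KeepUnusedXServer=true
# Use the proprietary nVidia driver
Driver=nvidia

[driver-nvidia]
# Disable power management entirely, so the card stays powered
PMMethod=none
```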

And then enable the external monitor as follows:
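Roughly as follows (a sketch; the output names LVDS1 and VIRTUAL and the 1920x1080 mode are assumptions from my setup — check `xrandr -q` for yours, and you may first need to create the mode with cvt and xrandr --newmode/--addmode):

```shell
optirun true                  # powers up the nVidia card and the X server on :8
# position the Intel virtual display to the right of the laptop panel
xrandr --output VIRTUAL --mode 1920x1080 --right-of LVDS1
screenclone -d :8 -x 1        # clone display :8 onto the virtual output
```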

As far as I understand it, these parameters in /etc/bumblebee/bumblebee.conf tell Bumblebee to use the proprietary nVidia driver (Driver=nvidia), keep the nVidia card turned on (PMMethod=none disables Bumblebee’s power management), and perpetually run an X server (KeepUnusedXServer=true). Clearly this setup has negative implications for battery life, as the nVidia card is kept on and active.

Running optirun true should then turn on the nVidia card and output to the external monitor. xrandr tells the X server where to put the virtual display, and screenclone clones the X server running on display :8 (the X server run by Bumblebee on the nVidia card) to the Intel virtual display.

I found that this technique was really finicky. optirun true would enable the external display right after rebooting, but would often not enable the display in other situations, such as after a suspend/resume cycle. It wasn’t clear how to bring the nVidia card back into a good state where it could output to an external monitor.

At this point, I read a comment by Gordin on Sagar’s blog, and his blog post. In this post, he describes using bbswitch for power management in bumblebee.conf, and running the second display using optirun screenclone -d :8 -x 1. This has the advantage of: a) enabling power management on the nVidia card, so it is turned off when not in use, and b) seemingly increased reliability, as the nVidia card will be enabled when screenclone is run, and disabled when the screenclone process is terminated. Based on these instructions, I came up with the following adapted solution.

Set the following settings in /etc/bumblebee/bumblebee.conf:
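These are the relevant lines (again a sketch, with section names as in the stock 12.04 file):

```
# /etc/bumblebee/bumblebee.conf — adapted, with power management enabled
[bumblebeed]
# Tear down the :8 X server when the last optirun client exits
KeepUnusedXServer=false
Driver=nvidia

[driver-nvidia]
# Let bbswitch power the card off whenever it is not in use
PMMethod=bbswitch
```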

The following shell script will enable the external monitor. ^C will disable the external monitor:
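A sketch of the script (the output names and resolution are from my machine; adjust to taste):

```shell
#!/bin/sh
# Enable the external monitor via Bumblebee + screenclone.
xrandr --output VIRTUAL --mode 1920x1080 --right-of LVDS1
# optirun powers the nVidia card up; screenclone blocks until ^C,
# at which point Bumblebee powers the card back down.
optirun screenclone -d :8 -x 1
# Clean up the virtual output after screenclone exits.
xrandr --output VIRTUAL --off
```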

This setup now works great, although screenclone does behave a bit strangely sometimes. For example, when switching workspaces, screenclone may require you to click in the workspace on the second desktop before it updates its graphics there. There were a few other minor quirks I found, but ultimately it seems like a solid and reliable solution.

Desktop Environment: Xmonad and Unity2D

I like Unity mostly because it provides a good global menu, but I find most other parts of it, including window management and the launcher, to be clunky or not very useful. Furthermore, certain Ubuntu compiz plugins that would improve the window management, such as the Put plugin, seem to be completely broken on Ubuntu out of the box.

I therefore set up my desktop environment to use the Unity2D panel and the Xmonad window manager. I primarily followed this guide to set this up: http://www.elonflegenheimer.com/2012/06/22/xmonad-in-ubuntu-12.04-with-unity-2d.html

The only change I made was to /usr/bin/gnome-session-xmonad. I’m not sure why, but xmonad was not getting started with the desktop session. I therefore started it in the background in the /usr/bin/gnome-session-xmonad script, along with xcompmgr, a program which provides compositing when running non-compositing window managers like Xmonad. xcompmgr allows things like notification windows to appear translucent.
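The script ended up looking roughly like this (a sketch; the exact session name comes from the linked guide and may differ on your system):

```shell
#!/bin/sh
# xmonad was not being launched by the session, so start it here,
# along with xcompmgr for lightweight compositing.
xcompmgr &
xmonad &
exec gnome-session --session=xmonad "$@"
```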

For a launcher, I’m now trying out synapse, which can be set to run when the gnome session is started.

  • Ricky Goldsmith

    Did vdpau acceleration ever work for you, for mplayer/vlc ?

    I have tried all means (Discrete mode, Optimus mode). None of them work.

    Following your blog, I can get a virtual display. I also made my virtual display my primary display, to try and see if I can get vdpau decode from my video players. Still no luck. :-(

    Is there a way I could simply run mplayer/vlc with vdpau and export that display to :8 and let screenclone clone the output to display :0.0 ?

  • Piotr Kołaczkowski

    This doesn’t work with resolutions higher than 1920×1600. The virtual XCRTC screen or the Intel chip seem to not support them.

  • AmirY

    Do you have any updates on getting it to work on 13.04 (I’m on Kubuntu)?
