17 FEB, 2023

Linux hardware accelerated video: absolute shambles

You'd think that after so many years, the bugs and issues of software would have been worked out so that hardware 'Just Works'™.

What's the situation?

I have a Dell XPS 15 9560 laptop. It's amazing - when it works. Unfortunately, there are a few things that don't work with Ubuntu 22.04.

Small annoyances

The webcam (or perhaps nosecam?) doesn't work. The access light illuminates, but Cheese reports an arbitrary error message. I don't know why; it works fine for other people.

The fingerprint reader doesn't quite work. It is detected, I can reliably enroll a fingerprint, all is well - almost. I cannot use it to log in: it fails authentication every time, despite having been set up mere seconds prior. Other users report it working fine; I don't know why mine doesn't.

The real problem

You may have heard of a little website called YouTube. It hosts videos for online playback in a web browser. This is best done using the dedicated video decoding hardware included with every modern Graphics Processing Unit (GPU). However, this process is not seamless (it really should be!).

The XPS 15 9560 has two GPUs: the Intel HD 630 Graphics as integrated into the Intel Core i7-7700HQ Central Processing Unit (CPU), and the NVIDIA GTX 1050 Mobile.

They are both equally capable of performing video decoding and playback functions, with the integrated Intel GPU being a bit more efficient (albeit less capable). I initially followed the usual setup steps that experience had taught me:

  1. Install proprietary NVIDIA drivers
  2. Follow an up-to-date online guide for enabling hardware-accelerated video decode in Google Chrome
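For step 1, the Ubuntu commands look roughly like this (a sketch; the exact driver package depends on your GPU, and the version number below is only an example):

```shell
# List the drivers Ubuntu recommends for the detected hardware
ubuntu-drivers devices

# Install the recommended proprietary NVIDIA driver automatically...
sudo ubuntu-drivers autoinstall

# ...or pin a specific version explicitly (version number is an example)
sudo apt install nvidia-driver-525
```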

Although the guide was a year old and partially outdated, I managed to follow it well enough that things worked, and the dedicated NVIDIA GPU was used to decode and play videos. The device got quite warm doing this, but Wikipedia notes that this model suffers from thermal issues, so it wasn't entirely unexpected or dangerously hot. Until, one day, it stopped working.

The task of decoding video was now relegated to a software solution, executed by the CPU, which could only just handle 480p video playback.

I tried many things to re-enable it; my findings and eventual solution follow...

The situation

Firstly, there is an excellent Arch Linux article (also somewhat applicable to Ubuntu) that covers several key points, partially replicated here.

How does Chrome talk to drivers talk to hardware?

There are three major GPU vendors, each with their own drivers, and for some you must choose between several implementations.

There are two major Application Programming Interfaces (APIs) for software to manage video processing: VA-API and VDPAU.
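You can inspect which API your current driver stack actually supports with the vainfo and vdpauinfo diagnostic tools (both available as Ubuntu packages of the same name):

```shell
# Install the diagnostic tools
sudo apt install vainfo vdpauinfo

# List the VA-API driver in use and the codec profiles it supports
vainfo

# The same, for VDPAU
vdpauinfo
```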

This leads to a complicated matrix of supported options and configuration flags, of which only some work properly. Let's put together some tables of what works.

What do the hardware drivers support?

| API    | Intel | AMD | NVIDIA |
| ------ | ----- | --- | ------ |
| VA-API | ✓     | ✓   | ⚠️      |
| VDPAU  | ✗     | ✓   | ⚠️⚠️     |

⚠️ Using the open-source nouveau driver only, and only older GPUs
⚠️⚠️ Using NVIDIA's driver only

Intel

| Driver                | VA-API | VDPAU | Notes |
| --------------------- | ------ | ----- | ----- |
| i965-va-driver-shaders | ✓     | ✗     | This is for older generations of integrated GPUs |
| intel-media-va-driver | ✓      | ✗     | This is the newer driver; alternatively use -non-free for encode support |

AMD

| Driver             | VA-API | VDPAU | Notes |
| ------------------ | ------ | ----- | ----- |
| mesa-va-drivers    | ✓      | ✗     | For Radeon HD 2000 series and newer |
| mesa-vdpau-drivers | ✗      | ✓     | For Radeon R300 series and newer |
| amf-amdgpu-pro     |        |       | For Radeon Fiji series and newer - doesn't seem to exist for Ubuntu |

NVIDIA

| Driver  | VA-API | VDPAU | Notes |
| ------- | ------ | ----- | ----- |
| nouveau | ✓      | ✓     | Requires the mesa-va-drivers and mesa-vdpau-drivers packages to work, apparently |
| NVIDIA  | ✗      | ✓     | For NVIDIA 8 series and newer until the GTX 750 |

The winning combination

| API                           | Intel driver                   | NVIDIA driver |
| ----------------------------- | ------------------------------ | ------------- |
| VA-API (with mesa-va-drivers) | intel-media-va-driver-non-free | nouveau       |
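On Ubuntu, installing that combination looks roughly like this (a sketch based on the package names in the tables above):

```shell
# VA-API driver for the integrated Intel GPU, with encode support
sudo apt install intel-media-va-driver-non-free

# Mesa VA-API driver, used by nouveau for the NVIDIA GPU
sudo apt install mesa-va-drivers

# Confirm VA-API now reports the intel-media (iHD) driver
vainfo
```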

Google Chrome

| Software      | VA-API | VDPAU |
| ------------- | ------ | ----- |
| Google Chrome | ✓      | ✗     |

That's not all, folks! Google Chrome also requires some configuration to work properly!

  1. Copy the desktop launcher to your local directory

    cp /usr/share/applications/google-chrome.desktop ~/.local/share/applications/google-chrome.desktop
  2. In ~/.local/share/applications/google-chrome.desktop, modify the Exec= line to be:

    Exec=/usr/bin/google-chrome-stable --enable-features=VaapiVideoDecoder --disable-features=UseChromeOSDirectVideoDecoder %U
  3. In chrome://flags, set the following flags to Enabled:

| Flag | Essentiality | Notes |
| ---- | ------------ | ----- |
| #ignore-gpu-blocklist | Must | It should be clear that this whole exercise is a mess, and that's why it's not the default |
| #enable-gpu-rasterization | May | May improve performance by using the GPU for rasterization |
| #enable-zero-copy | May | May improve performance due to the memory architecture of an integrated Intel GPU |
| #canvas-oop-rasterization | May | May improve performance by using the GPU for rasterizing canvas objects |
| #enable-drdc | Might | May improve performance by scheduling compositing on a different GPU thread than WebGL, video, and rasterization |
| #ui-enable-shared-image-cache-for-gpu | Might | May improve performance due to the memory architecture of an integrated Intel GPU |
| #enable-vp9-kSVC-decode-acceleration | Might | I don't quite know what it does; may improve compatibility with YouTube videos? |
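Once all that is done, it's worth verifying that decoding really happens in hardware: chrome://gpu should report "Video Decode: Hardware accelerated", and on the Intel GPU you can watch the video engine directly. (This check is my addition, not part of any guide; intel_gpu_top comes from the intel-gpu-tools package.)

```shell
# Install Intel's GPU monitoring tool
sudo apt install intel-gpu-tools

# While a video plays, the "Video" engine row should show activity
# if hardware decoding is actually in use
sudo intel_gpu_top
```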