Digital colour (and HDR)
So, you want to learn about digital colour, huh? Read on, brave traveller...
The existing standard
How does colour currently work? Well, theoretically there is a standard called sRGB. This represents a colour as three 8-bit values: one for red, one for green, and one for blue. For example, here is a nice colour:
This colour should have 0 red, 174 green, and 255 blue. Or does it? How do I know that this colour is the same on my screen as it is on yours?
The photons of light emitted by your screen are probably not the same as from my screen.
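As far as the computer is concerned, though, that colour is nothing more than three numbers. Here's a minimal sketch (in Python) of how it's written down, including the hex form you'll have seen on the web:

```python
# The example colour above, as three 8-bit channel values:
colour = (0, 174, 255)   # (red, green, blue), each in the range 0-255

# The same colour written as a hex string, as used on the web:
print("#{:02X}{:02X}{:02X}".format(*colour))   # prints "#00AEFF"
```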
Colour calibration
Turns out, not all screens are created equal. Thus, calibration is required to ensure adherence to specifications such as sRGB. However, this is not a simple process: every individual screen that is manufactured is slightly different and requires its own unique calibration. If you're buying a 'professional' monitor, you'll probably see such statistics as "99% sRGB" and "ΔE<2" and maybe even "95% DCI-P3."
Let's dive into these claims a bit...
99% sRGB
Screens (henceforth 'monitors') weren't always as capable as they are nowadays. Back when sRGB was new, monitors couldn't actually produce all the colours that sRGB defined. These days, sRGB is the bare minimum, and in fact most monitors exceed it (quite significantly)!
These are known as 'colour spaces' (except for DeltaE - that's a measurement!)
Adobe RGB
The next step up from sRGB is Adobe RGB, created by... Adobe. It's a fairly modest step up (mostly extending the greens and cyans), and it's largely overlooked nowadays (except for pictures / photography).
DCI-P3
This is becoming the modern standard for Wide Colour Gamut (WCG) monitors. It allows representation of a much wider range of colours than sRGB (and reaches reds that Adobe RGB can't). However, it's still quite hard to cover fully - only the best Mini-LED or OLED monitors can reach 99% coverage. Most professional monitors can do ~95%.
Rec.2020
This is the more professional colour space, and it's also the one used by the HDR standards - read on for more about them. It's one of the biggest / widest colour spaces around, noticeably bigger than DCI-P3.
DeltaE (ΔE)
How different are two colours? DeltaE is a measure of that, based on human perception of colour. If two colours are supposed to be the same, and are displayed identically, you'll get a DeltaE value of 0. A value of 3-5 would mean that two colours would look obviously different. A value of around 2 means that two colours would look different if carefully scrutinised. A value of 1 is pretty close to the limit of human perceptibility of colour - requiring a professional colourist in a studio environment to differentiate two colours.
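As a rough sketch, here's the original (CIE76) definition of DeltaE: a straight-line distance between two colours in the perceptual CIELAB space. The ΔE2000 formula used in modern monitor specs is more elaborate, but the idea is the same. The example values below are made up purely for illustration.

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 colour difference: Euclidean distance in CIELAB (L*, a*, b*) space."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

# Two colours that are 'supposed' to be identical:
intended = (50.0, 20.0, -10.0)   # the colour the standard asks for
measured = (50.5, 21.0, -10.5)   # what the monitor actually produced

print(round(delta_e_76(intended, measured), 2))   # ~1.22: near the limit of perception
```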
Side note: how to use a calibrated monitor
So you've got yourself a ΔE<2 professional colour-calibrated monitor with 99% sRGB and 95% DCI-P3 coverage. What next? How do you use it properly?
What is calibration, really?
When you first get your monitor out and plug it in, before you do anything, it is calibrated. But what is it calibrated to? Itself, of course! The monitor has its own native capabilities and colour space which, as previously mentioned, aren't necessarily the same as any industry-standard colour space.
If you want to use it for a specific colour space, you need to select that mode from the menu system of the monitor itself, such as changing the colour / display mode to sRGB or DCI-P3. This may or may not change the brightness of the monitor: a brightness is specified in the standard, but not all manufacturers follow it.
The next step is to use the included USB memory stick (or USB cable) and install the appropriate ICC profile for your monitor.
ICC profile
Even after buying your calibrated monitor and switching to a specific colour space display mode, it might not be perfectly accurate. There are still unit-to-unit differences - that's where the specific profile comes in. You may notice that your monitor is only 99% (or 95%) capable of displaying the colour space, and that (along with the unit-specific differences) is what the ICC profile corrects for. Only once you've got your monitor in the correct colour space mode, with the profile installed (and used by your OS), using a colour-managed app, can you be sure that what you see is accurate to the standard, within a DeltaE of 2 (or whatever is advertised).
Once you've got to that point, you're done. Your monitor probably isn't nice to look at anymore, but at least it's accurate! You'll likely find yourself switching between the 'Standard' and 'sRGB' / 'DCI-P3' modes depending on whether you need accuracy, or enjoyment, as appropriate.
10-bit colour
You might have caught that I tossed another term in earlier: Wide Colour Gamut (WCG). It turns out that 8 bits for each colour component isn't really enough. Sure, that still allows for 16.7 million different colour values, but those values have to cover luminance as well! What's the big deal? Take a dark scene with subtly changing colours: only the lowest few values of each channel are in play, which limits you to a much smaller range of shades.
As our system of defining colours also encodes luminance, there isn't quite enough resolution to outpace the human eye's ability to differentiate colours - the steps between adjacent values can still be visible. Solution: add on 2 more bits!
With 10 bits for each colour component (sometimes called 30-bit colour), we have a range from 0 to 1023 for each of red, green, and blue. Paired with a monitor capable of displaying all these values, it's about enough (or at least a decent step towards being able) to show all the colours that our eyes can see.
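Here's a rough back-of-the-envelope sketch (in Python, ignoring gamma and other real-world complications) of why those extra bits matter for dark scenes: the darker the scene, the fewer code values are left to describe it.

```python
# How many code values per channel fall within the darkest 10% of the range?
for bits in (8, 10):
    levels = 2 ** bits                  # 256 levels for 8-bit, 1024 for 10-bit
    dark_levels = int(levels * 0.10)    # values available in a very dark scene
    print(f"{bits}-bit: {levels} levels total, only ~{dark_levels} in the darkest 10%")
```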
HDR
Woah, what's luminance? In layman's terms, it's how 'bright' something appears. Just as a lamp is brighter than your phone torch, which in turn is brighter than candlelight. Luminance is just a fancy way of talking about how brightly something 'shines', so as not to confuse it with how gaudily bright a colour may look.
As well as improving the colour representation by using DCI-P3 instead of sRGB, adding a few extra bits also allows us to improve the representation of luminance! This is collectively known as 'High Dynamic Range,' after the metric it is supposed to improve.
Dynamic range
Dynamic range is the difference between the darkest shade and the brightest shade that is displayed / represented. All standards use a minimum of 0 nits - you can't have negative light!
sRGB specifies a maximum luminance of 80 nits, but outside of cinema, nobody actually uses it like that. In practice, sRGB is a 'Standard Dynamic Range' (SDR) colour standard, and SDR displays typically run at around 250 nits.
HDR standards
This is where things get a bit more confusing, because there are separate distinctions for digital standards, and physical standards, and they all relate to HDR in consumer-confusing ways.
HDR10
This is an open, basic HDR digital standard, mostly used by computers. It specifies a technical limit of 10,000 nits of luminance, but most content is mastered to 1,000 or 4,000 nits.
It is a technical standard that ensures the display pipeline - graphics card, operating system, drivers, and display - is compatible end to end and understands the signal, without making any promises about the resulting experience.
Dolby Vision
This is a proprietary standard designed by Dolby and used by specific media content and streaming services for movies and TV series. Dolby Vision also has a technical limit of 10,000 nits, with most content mastered between 1,000 and 4,000 nits. This is what you'll use / see with game consoles and TVs.
This specifies strict requirements for the devices that are certified, to ensure that they behave in a predictable way that a filmmaker can rely on when editing the film for HDR display.
DisplayHDR
For the consumer monitor space, VESA created the DisplayHDR range of specifications. They guarantee a maximum luminance level, as well as a number of other quality metrics. All DisplayHDR monitors support HDR10 as their method of communicating HDR signals. Other standards may be optionally supported.
- DisplayHDR 400 - the lowest of the low, 'technically' HDR (but not really)
- DisplayHDR 500 - not really any better, probably skip it
- DisplayHDR 600 - the first level where you can start to experience actual HDR
- DisplayHDR 1000 - a good experience, where you'll thoroughly enjoy HDR content (this is what most HDR is targeted for)
- DisplayHDR 1400 - the best of the best, you need a special monitor for this one!
If there is a 'True Black' designation, the tier works the same way, but it's aimed at OLED monitors with much deeper blacks. Depending on your preferences, this might be a better experience than an equivalent (or higher) regular DisplayHDR level.
SDR -> HDR?
What about displaying SDR content on an HDR display? This is one of the (many) challenges that HDR faces, especially because this isn't a simple technical issue - it's a matter of opinion. What can we do?
Windows
Microsoft's approach (along with various / most other companies and products) is to 'stretch' the 0-255 SDR range to 0-1023, to fit amongst HDR content, and to match the monitor's capabilities. SDR content wasn't created with this extreme level of luminance in mind though, so it can often look a bit wrong. Windows has a special settings slider to allow for user adjustment of this 'stretching' process.
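As a loose sketch of what that 'stretching' amounts to (heavily simplified - the real Windows pipeline is more involved, and the parameter name here is made up): decode the SDR value, then scale it so that SDR 'white' lands at a user-chosen brightness rather than the 80 nits sRGB assumes.

```python
def sdr_to_hdr_nits(srgb_8bit, sdr_white_nits=200.0):
    """Hypothetical sketch: map an 8-bit sRGB value onto an HDR luminance."""
    v = srgb_8bit / 255.0
    # Standard sRGB decode: linear toe below the threshold, power curve above
    linear = ((v + 0.055) / 1.055) ** 2.4 if v > 0.04045 else v / 12.92
    # The Windows slider effectively controls something like sdr_white_nits
    return linear * sdr_white_nits

print(sdr_to_hdr_nits(255))   # 200.0 nits - much brighter than the 80 nits sRGB assumes
print(sdr_to_hdr_nits(128))   # ~43 nits
```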
macOS
Apple's approach is to limit the display instead: SDR content is displayed as-is, HDR content is displayed in HDR. This can make SDR content look really lifeless and unappealing next to HDR content, or just next to a monitor that displays SDR content to its full capacity. The upside is that the content is displayed exactly as it was created (and intended?) to be displayed.
What's the best approach?
Well, this is where it's a matter of personal preference and opinion. Problem: this is really technical, and consumers don't know about it. Other problem: consumers are idiots and go for whatever is more vibrant / brighter, every time. Getting them to care about quality is nearly impossible, hence most things have just gone with the Windows way.
HDR -> SDR?
What about the other way around? How do we scale HDR content back down to an SDR monitor? Whichever way you try to do it, you'll be throwing away information...
Clip
The simple approach is to 'clip' the values. Anything that's greater than 255 is limited to 255, and everything else is passed on. This is generally understood to be really bad, and it looks really bad, too.
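In code, clipping is about as simple as it sounds - a minimal sketch, assuming 10-bit HDR values being squeezed into 8-bit SDR:

```python
def clip_to_sdr(code_10bit):
    """Anything brighter than SDR white is thrown away; the rest passes through."""
    return min(code_10bit, 255)

print(clip_to_sdr(100))   # 100 - dark tones survive untouched
print(clip_to_sdr(900))   # 255 - bright highlights are flattened to plain white
```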
Tonemap
This 'scales' the values down appropriately, such that a value of 1023 becomes 255, a value of 512 becomes 128, and so on. The problem with this is that our eyes are not linear, and displays are not linear, and they're non-linear in different and unique ways. Trying to scale things down and have them look 'proper' (or even just 'good') is a really hard thing to do. The preference is to do a semi-automatic scale-down, and then apply a few tweaks overseen by a human - but this needs to be done for all content, individually!
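Here's the naive linear version of that scaling; real tonemappers use carefully shaped curves rather than a straight division.

```python
def tonemap_linear(code_10bit):
    """Naive linear scale: 1023 -> 255, 512 -> ~128, and so on."""
    return round(code_10bit * 255 / 1023)

print(tonemap_linear(1023))   # 255
print(tonemap_linear(512))    # 128
# The catch: neither our eyes nor displays respond linearly, so a straight
# division like this shifts the perceived balance of the whole image.
```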
What's the best approach?
Make two versions of everything: one in HDR, and one in SDR. Turns out that's not always practical, so accurately scaling down HDR to SDR whilst maintaining the intended creative and aesthetic balance of colours / tones is probably the best compromise. Technical implementations of this are... lacking.
HDR -> lesser HDR?
This faces the same issues as trying to fit HDR into SDR, although the argument is slightly different.
Clip
This is again the simple approach, which preserves accurate tones and balance... up to the clipping point (the maximum capabilities of the monitor). Unfortunately, whilst darker tones are fine, anything brighter than the monitor's capacity is totally lost.
Tonemap
This brings all the bright 'highlights' back down to within what the monitor can display, so information isn't totally lost. The flip side is that the darker parts are now much darker, and their balance (and the balance between the dark and bright parts) is affected, which could ruin the viewing experience, and certainly deviates from the intended viewing experience of the content.
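A crude sketch of the simplest form of this (the numbers are made up for illustration - say a 4,000-nit master shown on a 600-nit monitor): scale everything down so the brightest highlight just fits, and accept that everything else gets darker with it.

```python
def tonemap_to_display(nits, display_peak=600.0, content_peak=4000.0):
    """Hypothetical global scale-down: squeeze the whole mastered range into
    what this particular display can actually reach."""
    return nits * display_peak / content_peak

print(tonemap_to_display(4000))   # 600.0 - the brightest highlight just fits
print(tonemap_to_display(100))    # 15.0  - but a modest midtone is now much darker
```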
What's the best approach?
Just as before, there is no 'right' way of doing it - it's a compromise either way, and not a good one. Ideally, the content could be adapted in some way to various monitor capabilities (when it is created), but this just adds more work to an already tricky process. Once again, this should probably be a user-selected option, but you can't really trust them to know what they're doing - and there is no real guidance to give, either. It's a lose-lose situation.