Not just the web: PC colour is totally and utterly broken. The web is just a small part of this.
There's an entire field of colour management that Microsoft, Linux, and Google are carefully ignoring. They'll occasionally stumble upon ICC colour spaces, HDR, or 10-bit, but they make sure to break everything even worse and leave it like that forever.
Sigh... I have gone on the same rant annually since about 2010, starting on Slashdot. Most recently on YCombinator News in 2021. It's a whole new year, time to repeat my rant and pray to the IT gods that someone at Microsoft or Google stumbles upon this:
The current year is 2022. The future. We have these amazing display technologies such as OLED, HDR, and quantum dots. In this sci-fi fantasy world I cannot do any of the following:
- Send a photo in any format better than an SDR sRGB JPEG in the general case, such as an email attachment, document, or chat message. These are 30 year old standards, by the way.
- Send a photo in any format and expect colour reproduction to be even vaguely correct. Wide-gamut is especially pointless. It is near certain that the colours will be stretched to the monitor gamut in an unmanaged way. People will look either like clowns or zombies depending on the remote display device, operating system, software, and settings.
- Send a photo in 10-bit and expect an improvement in image quality when displayed.
- Expect any industry-wide take-up of any new image encoding format. It is a certainty that each vendor will do their "own thing", refuse to even acknowledge the existence of their competitors, and guarantee that whatever they come up with will be relegated to the dustbin of history. Remind me... can ANY software written by Microsoft or Google save a HEIF/HEIC file? No... because it's an "Apple" format. Even most Microsoft software can't read or write their own JPEG-XR format, let alone Google or Apple. AVIF came out of the Alliance for Open Media, with Netflix as its most visible early backer, but I'm yet to see it taken up by any mainstream system. Etc...
New in 2022:
- Display HDR on Linux.
- Display HDR even vaguely correctly on Windows. I mean, good try, but simply clipping highlights without even attempting to implement tone mapping is just plain wrong.
- Use a colour calibrator device on Windows 11, which totally broke this functionality.
- Use a colour calibrator device for calibrating an HDR display. I have a device that can measure up to 2000 nits. Can I use it to calibrate my HDR OLED laptop display? No. That's not an option.
- Turn on HDR in Windows 11 and not have it cause a downside such as restricting the display gamut to sRGB.
- Use HDR for GUI controls in any desktop application.
- Use colour management for GUI controls in any desktop application that isn't an Electron app -- the only platform that does this vaguely correctly under Windows by default.
Things have even regressed over the last few years! Windows 10 Photo viewer could do colour management -- badly. It would show an unmanaged picture at first, and then overwrite it with the colour managed picture a bit later, so you get colour flickering as you scroll through your photos. Okay, fine, at least it was trying. Windows 11 does not try. It just assumes 8-bit sRGB for everything.
Similarly, the 14-year-old WPF framework supported wide gamut, 16-bit linear compositing, and HDR. The "latest & greatest" WinUI 3 framework... 8-bit SDR sRGB only.
This is because Windows and Linux desktops are fundamentally not color managed. It's everyone for themselves: color management needs to be implemented in every single application. Microsoft created scRGB, which could have brought system-wide color management to Windows, but never went through with it.
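To make "implemented in every single application" concrete, here's roughly what each app has to do for itself today: read the ICC profile embedded in the image and convert it into the display's profile. A minimal Python/Pillow (littleCMS) sketch -- the "display.icc" path is a placeholder for whatever profile the OS exposes, and on a properly colour-managed system none of this would be the application's job:

    from io import BytesIO
    from PIL import Image, ImageCms

    def to_display(path, display_profile_path="display.icc"):
        im = Image.open(path).convert("RGB")
        # Honour the profile embedded in the file; fall back to assuming sRGB.
        icc_bytes = im.info.get("icc_profile")
        src = (ImageCms.ImageCmsProfile(BytesIO(icc_bytes))
               if icc_bytes else ImageCms.createProfile("sRGB"))
        dst = ImageCms.getOpenProfile(display_profile_path)
        # Default rendering intent is perceptual.
        xform = ImageCms.buildTransform(src, dst, "RGB", "RGB")
        return ImageCms.applyTransform(im, xform)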
> Wide-gamut is especially pointless. It is near certain that the colours will be stretched to the monitor gamut in an unmanaged way.
This is also the manufacturer's fault. Some wide gamut displays don't even have sRGB emulation, and pretty much every wide gamut display defaults to their native gamut even in 8-bit mode, which is virtually never the right thing to do. sRGB emulation naturally reduces contrast, which is generally already very poor in all but the highest end PC monitors. To add insult to injury, the 10-bit / HDR (these are technically independent, but generally coupled in monitors) mode is complete shit in virtually every PC monitor advertised with HDR support. So you spend money on a display device that is designed essentially in exact opposition to its capabilities and your needs.
(Naturally, reviewers tend to ignore all of these problems apart from HDR actually being pointless with the current state of PC monitor tech; many praise the "vibrant colors" this gives you. Of course, everyone looks like they got sunburn, but who cares. Vibrant! Vivid! Saturated! The reddest reds money can buy! The greyest blacks! The most washed out shadows! Amazing! 8/10! Recommend! Buy now through my affiliate link!)
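To put numbers on that "stretched to the monitor gamut" failure: feed an sRGB colour straight to a panel running its native gamut -- Adobe RGB here, purely as a stand-in for "some wide gamut" -- and the chromaticity that actually lights up is redder and more saturated than the one the photo encodes. A rough sketch using the standard published matrices (transfer functions ignored, values treated as linear):

    import numpy as np

    # Published linear RGB -> XYZ (D65) matrices for the two spaces.
    SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                            [0.2126, 0.7152, 0.0722],
                            [0.0193, 0.1192, 0.9505]])
    ADOBE_TO_XYZ = np.array([[0.5767, 0.1856, 0.1882],
                             [0.2973, 0.6274, 0.0753],
                             [0.0270, 0.0707, 0.9911]])

    def chromaticity(xyz):
        return xyz[:2] / xyz.sum()             # (x, y) coordinates

    skin = np.array([0.80, 0.52, 0.40])        # a warm skin tone, linear sRGB
    print(chromaticity(SRGB_TO_XYZ @ skin))    # what the image means
    print(chromaticity(ADOBE_TO_XYZ @ skin))   # what an unmanaged panel shows
    # The second point is pushed toward red/orange -- the "sunburn" look.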
> wide gamut display defaults to their native gamut even in 8-bit mode, which is virtually never the right thing to do.
Then why have a wide gamut display!?!
The whole point is that you have a greater capability. It should be on all the time, not just when doing "professional image editing" or whatever. There SHOULD be no downside!
Similarly with HDR -- it is literally a superset of SDR, so then why are there endless support forum complaints about it causing issues when enabled!? It SHOULD just work! Instead, early versions in Windows 10 would shift the desktop by half a pixel and cause blurring. Or darken the desktop. Or more recently force everything to sRGB, including colour-managed applications like Adobe Photoshop or Lightroom.
The correct thing to do is for each display to always be running in native gamut mode. The whole concept of in-display colour space emulation is absurd[1]. Instead, the display should feed back its native gamut to the operating system, which should then take care of tone mapping via either software or the GPU hardware. This almost happens now. Displays have EDID metadata that includes the "coordinates" of their colour primaries. Windows even picks this up! Aaaaand then ignores it, and even strips out the information in newer SDKs like WinUI 3, because... I don't even know.
[1] Ideally, GPUs should be doing tone mapping under OS control, but to avoid banding this would need 12-bit or even higher output to the display. That would take too much bandwidth, so instead displays do tone mapping using LUTs with as many as 14 bits. Except that these LUTs are 1D and control over them is totally broken...
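And the EDID really does carry enough to do this properly: the (x, y) chromaticities of the panel's primaries and white point are all you need to build its RGB->XYZ matrix, and from that a correct sRGB-to-native mapping. A sketch of the standard derivation, with made-up, roughly P3-like primaries standing in for a real EDID read-out:

    import numpy as np

    def rgb_to_xyz(r, g, b, white):
        # Chromaticity (x, y) -> XYZ with Y normalised to 1.
        xyY = lambda c: np.array([c[0] / c[1], 1.0, (1 - c[0] - c[1]) / c[1]])
        prim = np.column_stack([xyY(r), xyY(g), xyY(b)])
        # Scale each primary so that R=G=B=1 reproduces the white point.
        return prim * np.linalg.solve(prim, xyY(white))

    # Illustrative primaries, NOT read from a real EDID.
    M_panel = rgb_to_xyz((0.680, 0.320), (0.265, 0.690),
                         (0.150, 0.060), (0.3127, 0.3290))
    M_srgb  = rgb_to_xyz((0.640, 0.330), (0.300, 0.600),
                         (0.150, 0.060), (0.3127, 0.3290))

    # Linear sRGB -> linear panel-native; this is the matrix the OS/GPU
    # should be applying instead of passing sRGB values through untouched.
    srgb_to_native = np.linalg.inv(M_panel) @ M_srgb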
Because wide gamut (and better than 100 cd/m²) displays have been around for more than a decade now -
(though IIRC none - aside from black & white medical monitors - had better than 8 bits per color in hardware until "HDR in screens" showed up - IIRC the PlayStation 3 and some games also had a 10-bit-per-color mode that caused a lot of compatibility problems for hardly any benefit?)
- but (non-Apple) OS support has been abysmal until recently,
and you probably need to pay a technician to use a probe to calibrate your screen anyway, so only some work environments would bother to set them up correctly?
----
[1] Dolby PQ only needs 12 bits for up to 10k cd/m²?
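For reference, that figure comes from the shape of the SMPTE ST 2084 curve itself. A quick sketch of the PQ encode/decode and of the luminance jump between two adjacent 12-bit codes:

    # SMPTE ST 2084 (Dolby PQ) constants.
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32

    def pq_encode(nits):                       # absolute luminance -> [0, 1]
        y = (nits / 10000.0) ** m1
        return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

    def pq_decode(signal):                     # [0, 1] -> absolute luminance
        v = signal ** (1 / m2)
        return 10000.0 * (max(v - c1, 0.0) / (c2 - c3 * v)) ** (1 / m1)

    # Step between adjacent 12-bit codes around 100 cd/m²:
    code = round(pq_encode(100) * 4095)
    print(pq_decode((code + 1) / 4095) - pq_decode(code / 4095))
    # ~0.24 cd/m², i.e. roughly a 0.2 % step -- small enough that banding
    # should stay below the visible threshold.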
There's "30 bit" (10 bpc) displays, which have been around for a fairly long time. These are not HDR, but usually native AdobeRGB with high bit depth. The way that works / used to work is that when an application uses a 30 bit surface, it still outputs an 8 bit image which travels through the Windows GUI pipeline and it's the graphics driver which replaces it with the real 10 bit image on scanout.
I don't think any of the current TN/IPS/etc. PC monitors with HDR have 10-bit panels. HDR is mostly achieved through sheer imagination and, more rarely, through rough local dimming, not by actually having a panel capable of anything close to HDR contrast ratios.
Also, all of this is about liquid crystals (and the electronics controlling them), but cathode ray tubes, plasmas, and light emitting diodes have quite different characteristics...
I would be surprised if nobody had yet made (professional?) non-CRT PC monitors capable of discriminating more than 256 values?! (Not even Apple?!? Or at least some super-expensive, but still commercial (= non-experimental) displays?)
Also, I guess a similar benefit might be achieved by using more than 3 primaries: who was it again that used a 4th "yellow" subpixel in their (IIRC) diode displays?
(Though it's still not clear to me why more displays aren't using the standard (at least in Charge Coupled Devices) 2x2 Bayer filter with double green, rather than a 3x1 layout? Too much reliance on Windows' ClearType hack working properly? But why in TVs too??)
> I would be surprised if nobody had yet made (professional?) non-CRT PC monitors capable of discriminating more than 256 values?! (Not even Apple?!? Or at least some super-expensive, but still commercial (= non-experimental) displays?)
They exist, but they're limited to the high end. E.g. Apple's Pro Display XDR has a 10-bit panel and FALD (full-array local dimming).
Reference-class monitors are generally of the "dual film" type, which essentially means that the panel is two LCDs on top of each other, one being used to control only brightness of a given pixel, and the other for brightness and color.
> (Though it's still not clear to me why more displays aren't using the standard (at least in Charge Coupled Devices) 2x2 Bayer filter with double green, rather than a 3x1 layout? Too much reliance on Windows' ClearType hack working properly? But why in TVs too??)
Non-standard pixel layouts are common in OLEDs, e.g. RGBW, weird pyramids and uneven subpixel sizes (I'm assuming due to differing phosphor efficiencies). These all lead to poor text and UI clarity, as one would expect. (It's worth pointing out that OLEDs, being LEDs at heart, have inherently poor linearity, which is why most of their brightness range is covered by digital modulation.)
An RGB stripe requires the fewest subpixels per pixel, which also means less brightness lost to the LCD structures around them. Going to Bayer would mean 33 % more subpixels for the same resolution, except it's dimmer (increasing subpixel pitch by 50 % horizontally does not make up for halving it vertically), more expensive to make, and dimmer again because now you have two green dots per pixel, so each needs to be half as bright, throwing away more of the backlight; and the drivers now need to cope with inhomogeneous pixels -- all without an obvious upside.
The reason color camera sensors tend to use Bayer filters is - I think - because green contributes most to perceived brightness, so doubling the sensor area for green means halving green's contribution to luma noise. This problem does not exist in displays.
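A crude back-of-the-envelope for that argument, assuming Rec. 709 luma weights and independent per-channel shot noise (both assumptions, not anything sensor vendors publish):

    # Rec. 709 luma weights: green carries ~72 % of perceived brightness.
    wr, wg, wb = 0.2126, 0.7152, 0.0722

    def luma_noise_var(green_area_factor):
        # Relative luma noise variance, with per-channel variance ~ 1/area.
        return wr**2 + wg**2 / green_area_factor + wb**2

    print(luma_noise_var(1.0))   # ~0.56  (equal areas per channel)
    print(luma_noise_var(2.0))   # ~0.31  (double green area, as in Bayer)
    # Doubling green's area cuts total luma noise variance by roughly 45 %.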
AVIF probably has potential, as it's part of the AV1 ecosystem - once we get widespread AV1 hardware acceleration, hopefully AVIF will become the default picture format too?