This is because Windows and Linux desktops are fundamentally not color managed. It's every application for itself: color management has to be implemented in each one individually. Microsoft created scRGB, which could have brought system-wide color management to Windows, but never followed through with it.
> Wide-gamut is especially pointless. It is near certain that the colours will be stretched to the monitor gamut in an unmanaged way.
This is also the manufacturers' fault. Some wide gamut displays don't even have sRGB emulation, and pretty much every wide gamut display defaults to their native gamut even in 8-bit mode, which is virtually never the right thing to do. sRGB emulation also tends to reduce contrast, which is already very poor in all but the highest-end PC monitors. To add insult to injury, the 10-bit / HDR mode (the two are technically independent, but generally coupled in monitors) is complete shit in virtually every PC monitor advertised as HDR-capable. So you spend money on a display that is designed in near-exact opposition to its own capabilities and your needs.
(Naturally, reviewers tend to ignore all of these problems apart from HDR actually being pointless with the current state of PC monitor tech; many praise the "vibrant colors" this gives you. Of course, everyone looks like they got sunburn, but who cares. Vibrant! Vivid! Saturated! The reddest reds money can buy! The greyest blacks! The most washed out shadows! Amazing! 8/10! Recommend! Buy now through my affiliate link!)
> wide gamut display defaults to their native gamut even in 8-bit mode, which is virtually never the right thing to do.
Then why have a wide gamut display!?!
The whole point is that you have a greater capability. It should be on all the time, not just when doing "professional image editing" or whatever. There SHOULD be no downside!
Similarly with HDR -- it is literally a superset of SDR, so why are there endless support forum complaints about it causing issues when enabled!? It SHOULD just work! Instead, early versions in Windows 10 would shift the desktop by half a pixel and cause blurring. Or darken the desktop. Or, more recently, force everything to sRGB, including colour-managed applications like Adobe Photoshop or Lightroom.
The correct thing to do is for each display to always run in its native gamut mode. The whole concept of in-display colour space emulation is absurd[1]. Instead, the display should report its native gamut back to the operating system, which should then take care of tone mapping via either software or the GPU hardware. This almost happens now: displays carry EDID metadata that includes the chromaticity coordinates of their colour primaries. Windows even picks this up! Aaaaand then ignores it, and even strips the information out in newer SDKs like WinUI 3, because... I don't even know.
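For reference, the EDID base block really does carry those primaries: each one is a 10-bit CIE xy coordinate packed across bytes 25-34 of the 128-byte block (layout per the VESA EDID 1.4 structure). A minimal decoding sketch; the input below is synthetic test data, not a real monitor's EDID:

```python
def decode_edid_chromaticity(edid: bytes) -> dict:
    """Decode the 10-bit CIE xy primaries from a 128-byte EDID base block.

    Byte 25 holds the two low bits of each red/green coordinate, byte 26
    the same for blue/white; bytes 27-34 hold the high eight bits.
    Each coordinate decodes as value / 1024.
    """
    lo_rg, lo_bw = edid[25], edid[26]
    hi = edid[27:35]

    def coord(hi_byte: int, lo_bits: int) -> float:
        return ((hi_byte << 2) | lo_bits) / 1024.0

    return {
        "red":   (coord(hi[0], (lo_rg >> 6) & 3), coord(hi[1], (lo_rg >> 4) & 3)),
        "green": (coord(hi[2], (lo_rg >> 2) & 3), coord(hi[3], lo_rg & 3)),
        "blue":  (coord(hi[4], (lo_bw >> 6) & 3), coord(hi[5], (lo_bw >> 4) & 3)),
        "white": (coord(hi[6], (lo_bw >> 2) & 3), coord(hi[7], lo_bw & 3)),
    }

# Synthetic example: encode a red x of 655/1024 (~0.640, the sRGB red x).
edid = bytearray(128)
edid[27] = 655 >> 2          # high 8 bits of red x
edid[25] = (655 & 3) << 6    # low 2 bits of red x
primaries = decode_edid_chromaticity(bytes(edid))
```

So the information an OS would need for proper system-side mapping is sitting right there in the first 128 bytes.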
[1] Ideally, GPUs should be doing tonemapping under OS control, but to avoid banding this would need 12-bit or even higher output to the display. This would take too much bandwidth, so instead displays do tone mapping using LUTs with as many as 14-bits. Except that these LUTs are 1D and control over them is totally broken...
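To make the 1D-LUT limitation concrete: gamut mapping from sRGB into a display's native primaries is, in linear light, a single 3x3 matrix built from exactly the chromaticity coordinates the EDID advertises, and a matrix mixes channels in a way no trio of per-channel 1D LUTs can express. A rough sketch, assuming idealized linear values and a P3-like native gamut (pure Python, no gamma handling):

```python
def xy_to_xyz(x, y):
    # CIE xy chromaticity -> XYZ tristimulus, with Y normalized to 1.
    return (x / y, 1.0, (1 - x - y) / y)

def mat_vec(m, v):
    return tuple(sum(m[i][j] * v[j] for j in range(3)) for i in range(3))

def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def inverse3(m):
    # Adjugate-over-determinant inverse of a 3x3 matrix.
    a, b, c = m[0]; d, e, f = m[1]; g, h, i = m[2]
    det = a * (e*i - f*h) - b * (d*i - f*g) + c * (d*h - e*g)
    adj = [
        [e*i - f*h, c*h - b*i, b*f - c*e],
        [f*g - d*i, a*i - c*g, c*d - a*f],
        [d*h - e*g, b*g - a*h, a*e - b*d],
    ]
    return [[x / det for x in row] for row in adj]

def rgb_to_xyz(primaries, white):
    # Columns are the XYZ of the primaries, scaled so RGB (1,1,1) hits white.
    cols = [xy_to_xyz(*p) for p in primaries]
    m = [[cols[j][i] for j in range(3)] for i in range(3)]
    s = mat_vec(inverse3(m), xy_to_xyz(*white))
    return [[m[i][j] * s[j] for j in range(3)] for i in range(3)]

SRGB = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
P3   = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]
D65  = (0.3127, 0.3290)

# Linear sRGB -> native (P3-like) gamut, as one 3x3 matrix.
srgb_to_native = mat_mul(inverse3(rgb_to_xyz(P3, D65)), rgb_to_xyz(SRGB, D65))
```

Feeding sRGB pure red through this gives a native-gamut triple with nonzero green and blue components; a 1D LUT per channel can only remap each channel onto itself, which is exactly why in-panel LUT "emulation" is the wrong layer for this job.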
Because wide gamut (and better than 100 cd/m²) displays have been around for more than a decade now -
(though IIRC none - aside from black & white medical monitors - had better than 8 bits per color in hardware until "HDR in screens" showed up - IIRC the PlayStation 3 and some games also had a 10-bit-per-color mode that caused a lot of compatibility problems for hardly any benefit?)
- but (non-Apple) OS support has been abysmal until recently,
and you probably need to pay a technician to calibrate your screen with a probe anyway, so only some work environments would bother to set them up correctly?
----
[1] Dolby PQ only needs 12 bits for up to 10k cd/m²?
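For what it's worth, yes: the PQ curve (SMPTE ST 2084) was designed so that 12 bits cover 0 to 10,000 cd/m² with steps below the visibility threshold. A sketch from the published constants; the step-size check at the end is my own back-of-the-envelope, not a figure from the spec:

```python
# SMPTE ST 2084 (Dolby PQ) constants.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_oetf(nits: float) -> float:
    """Absolute luminance in cd/m² (up to 10,000) -> PQ signal in [0, 1]."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

def pq_eotf(signal: float) -> float:
    """PQ signal in [0, 1] -> absolute luminance in cd/m²."""
    e = signal ** (1 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

# Relative luminance step between adjacent 12-bit codes around 100 cd/m²:
code = round(pq_oetf(100.0) * 4095)
step = pq_eotf((code + 1) / 4095) / pq_eotf(code / 4095) - 1
# Comes out around a quarter of a percent; at 10 bits the step is ~4x
# larger, which is where visible banding starts creeping in.
```

The full code range tops out at exactly 10,000 cd/m² (pq_eotf(1.0) == 10000), which is the "12 bits for up to 10k" claim.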
There are "30 bit" (10 bpc) displays, which have been around for a fairly long time. These are not HDR, but usually native AdobeRGB panels with high bit depth. The way it works (or used to work) is that when an application uses a 30-bit surface, it still outputs an 8-bit image that travels through the Windows GUI pipeline, and the graphics driver replaces it with the real 10-bit image on scanout.
I don't think any of the current TN/IPS/etc. PC monitors with HDR have 10-bit panels. HDR is generally achieved through sheer imagination, and less commonly through rough local dimming, not by actually having a panel capable of anything close to HDR contrast ratios.
Also, all of this is about liquid crystals (and the electronics controlling them), but cathode ray tubes, plasmas, and light emitting diodes have quite different characteristics...
I would be surprised if nobody had yet made (professional?) non-CRT PC monitors capable of discriminating more than 256 values?! (Not even Apple?!? Or at least some super-expensive, but still commercial (= non-experimental) displays?)
Also, I guess a similar benefit might be achieved by using more than 3 primaries: who was it again that used a 4th "yellow" subpixel in their (IIRC) LED displays?
(Though it's still not clear to me why more displays aren't using the 2x2 Bayer layout with double green that is standard in camera sensors (at least in CCDs), rather than a 3x1 one? Too much reliance on Windows' ClearType hack working properly? But then why in TVs too??)
> I would be surprised if nobody had yet made (professional?) non-CRT PC monitors capable of discriminating more than 256 values?! (Not even Apple?!? Or at least some super-expensive, but still commercial (= non-experimental) displays?)
They exist, but it's limited to the high-end. E.g. Apple's XDR display has a 10-bit panel and FALD.
Reference-class monitors are generally of the "dual film" type, which essentially means that the panel is two LCDs on top of each other, one being used to control only brightness of a given pixel, and the other for brightness and color.
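As a back-of-the-envelope on why stacking works: each layer attenuates light independently, so the two panels' contrast ratios roughly multiply. A trivial sketch with assumed (not measured) figures:

```python
# "Dual film" stack: contrast ratios of the two LCD layers multiply,
# since each attenuates the backlight independently.
# 3000:1 is an assumed, typical figure for a good IPS cell, not a spec.
color_cell = 3000      # panel carrying color and brightness
luminance_cell = 3000  # panel modulating brightness only
combined = color_cell * luminance_cell
print(f"combined contrast ~ {combined:,}:1")
```

Which is how these monitors reach contrast figures ordinary single-cell LCDs cannot approach.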
> (Though it's still not clear to me why more displays aren't using the 2x2 Bayer layout with double green that is standard in camera sensors (at least in CCDs), rather than a 3x1 one? Too much reliance on Windows' ClearType hack working properly? But then why in TVs too??)
Non-standard pixel layouts are common in OLEDs, e.g. RGBW, weird pyramids and uneven subpixel sizes (I'm assuming due to differing phosphor efficiencies). These all lead to poor text and UI clarity, as one would expect. (It's worth pointing out that OLEDs, being LEDs at heart, have inherently poor linearity, which is why most of their brightness range is covered by digital modulation.)
An RGB stripe requires the fewest subpixels, which also means less brightness lost to LCD structures. Going to Bayer would mean 33 % more subpixels for the same resolution, a dimmer image (increasing subpixel pitch by 50 % horizontally does not make up for halving it vertically), higher manufacturing cost, and yet more dimming because the two green dots per pixel each need to be only half as bright, throwing away more of the backlight; the drivers would also need to handle inhomogeneous pixels -- all without an obvious upside.
The reason color camera sensors tend to use Bayer filters is - I think - because green contributes most to perceived brightness, so doubling the sensor area for green means halving green's contribution to luma noise. This problem does not exist in displays.
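The arithmetic behind both points above, namely the 33 % subpixel figure and green's outsized contribution to brightness, in a trivial sketch (the luma weights are the standard BT.709 coefficients; the noise argument itself is, as stated, a guess):

```python
# Rec. 709 luma weights: green dominates perceived brightness, which is
# why a Bayer sensor spends two of its four sites per cell on green.
R_W, G_W, B_W = 0.2126, 0.7152, 0.0722

# Subpixels per pixel: RGB stripe vs a hypothetical RGGB display layout.
stripe_subpixels = 3
bayer_subpixels = 4
extra = bayer_subpixels / stripe_subpixels - 1  # one third more subpixels
```

Doubling green's collecting area helps a sensor precisely because ~71 % of luma rides on the green channel; a display emitting light has no equivalent noise to average away.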