The quality of computer monitors is appalling - after buying and returning several monitors because of quality issues, I briefly considered doing this system-wide to fix colour uniformity problems. Eventually I chose to keep my old monitor instead. It has uniformity problems as well but it doesn't cost me $1000 extra to keep.
But it's a damn shame that you can't throw enough money at manufacturers to get a monitor without glaring QA problems. No matter how much you spend, they sell you shit.
You talking about mainstream "gaming" monitors? I'm just a programmer and gave up on those long ago. I've been getting entry-level monitors targeted at graphic designers instead, namely Asus ProArt; I think Dell has similar stuff?
They don't have 240 Hz and sub 0.01 ms response times though, so if you're buying your hardware based on bigger numbers in specs they won't do.
They're probably not that great for actual designers either, but they're good enough for me.
I’m using one of these and I’m very happy with it. Reasonable price, 75Hz, supports USB-PD + has its own USB ports so I can one-cable it with my work laptop.
Most importantly they come factory calibrated. I consider reasonable colour reproduction important even though I only use it for programming. I stare at this thing for 8 hours a day, it needs to look good.
In fairness I also have a 165Hz LG UltraGear gaming monitor, and the image quality is almost as good. My only complaint is the black levels and grey uniformity suck, but for someone who wants performance and quality it’s a decent option.
The first graphic design monitor I bought from BenQ had a busted image processor, so you couldn't turn off the sharpening - only set it to -5% (blurry) or +5%. Eventually I complained enough that they sent me a different monitor that didn't have the problem.
I've had bad experiences with some of Dell's pro-grade monitors too. It feels like modern displays are so complex, firmware- and hardware-wise, that it's just very hard to find one that isn't defective in some way. This replacement BenQ works for basic use, but its FreeSync is broken and it's already developed burn-in around the edges after about 1.5 years.
My XDR has been pretty flawless, and on the less exorbitantly priced end, I've had good experiences with a few business oriented IPS models (most recently the Samsung UR55).
My main gripe is gaming monitors seem to be consistently the worst panels they can get their hands on.
It seems like they realize gamers will put up with a lot of garbage in exchange for raw "power" and take full advantage of that fact. I'm 99% sure that's why brands like Wasabi Mango (who used to take B-grade panels and sell them on the cheap) disappeared... the manufacturers just started shipping those panels as gaming models.
Gamers hit up Blur Busters to see how the motion is, and it either looks okay or you get a headache. The OEMs optimize for that (hence the focus on grey-to-grey response and adaptive sync) and then just jack up the saturation slider to compensate for everything else.
And hey, those Wasabis and Catleaps got you a 1440p IPS panel that did 90% of what you wanted for 50% of the price, at a time when 1440p and IPS were still kind of rare to own. Most people who got one were upgrading from a typical TN, so even a crappy IPS looked good in comparison. I was playing EVE Online at the time and caused at least 10 people in my corp to buy them when they saw how much screen real estate you got at the higher res.
You're misunderstanding the comment: the point is Wasabi Mango and co were good because they were charging significantly less for the lower grade panels.
Now manufacturers are possibly prioritizing the highest grade panels for non-gaming use and using extremely expensive gaming monitors as a dumping ground for everything else.
For example, the 28" UR55 has few complaints about backlight bleed and in my experience with having bought several is a reliable choice. Meanwhile the oddly similar 28" Odyssey G8 is known as a "buy and return until you get one that's ok" type of monitor, as are many other gaming monitors these days.
Gamers seem conditioned to just accept inferior panel quality as long as the other specs deliver, while business and casual customers would probably just buy another monitor if they saw weird issues. They might not know the term backlight bleed, but they'll still see it just fine.
If you fill your entire screen with nothing but a single solid color (try #FF6400), does it show up correctly? That is, without any gradient or areas of the screen where the color appears darker or lighter (especially around the edges or in the corners)?
I've yet to find a modern monitor that doesn't have a problem with that basic test, which is pretty disappointing considering accurately representing a single color should be easy and I've had several CRTs that could do it.
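If you want to try the test yourself, here's a minimal sketch: it writes a solid-colour PPM image (using #FF6400 from above; the 1920x1080 resolution is just an example, match it to your panel) that you can open full-screen in any image viewer and eyeball for uniformity.

```python
def write_solid_ppm(path, width, height, rgb):
    """Write a binary PPM (P6) image filled with a single colour."""
    with open(path, "wb") as f:
        # PPM header: magic number, dimensions, max channel value
        f.write(b"P6\n%d %d\n255\n" % (width, height))
        # One RGB triplet repeated for every pixel
        f.write(bytes(rgb) * (width * height))

# #FF6400 as an (R, G, B) triplet
write_solid_ppm("uniformity.ppm", 1920, 1080, (0xFF, 0x64, 0x00))
```

Open the result full-screen with any viewer (most will read PPM directly, or convert it first) and look at the corners and edges against the center.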
I use a pair of Dell monitors with IPS screens (U2720Q and UP2414Q).
I always use as background a solid grey (#808080) and there is no noticeable non-uniformity.
I have now tried your color (#FF6400) on the U2720Q. Because this color is much brighter, if you look carefully you can see small areas at the corners, especially the 2 lower corners, with lower brightness. The 2 lateral edges also have slightly lower brightness, but the difference from the center is less visible than at the 2 lower corners.
However the areas affected are small (maybe a width of about 1/30 or 1/40 of the screen width) and you really have to look with the intention to find non-uniformities. When looking casually at the screen there is no obvious non-uniformity.
For emissive displays like CRT or OLED it is easier to achieve uniform brightness over the screen.
On my main monitor, a test like that fails spectacularly.
With a solid purple, if my eyes are directly perpendicular to the very center of the screen, it looks fine. As soon as I move up or down, either the top or the bottom of the screen becomes very noticeably blue.
But in daily use, I never notice it. If I lean way back in my chair, then yeah, I'll need to adjust my screen to be able to see it.
But this is a 144 Hz 1440p monitor I got for $400 brand new in 2015. Pixel response times are great, and it does exceptionally well on all the Blur Busters tests. It is an amazing monitor for gaming...
...except in dark scenes. It's a TN panel, which inherently lacks contrast and brightness, so to make it look good I had to tweak the contrast, gamma, and brightness settings, and that results in some clipping: #020202 and #010101 look like they get rounded down to #000000, and #050505 and #040404 look like they get rounded down to #030303.
If I draw a pure black-to-white gradient, then there's noticeable banding. Like colors are only being represented in 7 bits per channel, and the darkest colors lose even more.
But again, in daily usage, especially in games (as long as it's not a dark scene) and videos, it's not even noticeable.
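For anyone who wants to check their own panel for the banding and near-black clipping described above, a quick sketch: generate a horizontal black-to-white gradient as a PPM and view it full-screen. On a clean 8-bit pipeline each 1-step increment should be barely distinguishable; visible bands suggest the panel (or its LUT) is dropping bits, and the darkest steps collapsing into black suggests clipping. The dimensions here are arbitrary.

```python
def write_gradient_ppm(path, width=1024, height=256):
    """Write a binary PPM (P6) with a horizontal black-to-white grey ramp."""
    with open(path, "wb") as f:
        f.write(b"P6\n%d %d\n255\n" % (width, height))
        row = bytearray()
        for x in range(width):
            # Map column position linearly onto 0..255 grey
            v = x * 255 // (width - 1)
            row.extend((v, v, v))
        # Every row of the image is identical
        f.write(bytes(row) * height)

write_gradient_ppm("gradient.ppm")
```

View it at 100% scale with any colour management or "vivid" picture modes switched off, otherwise you may be measuring the software pipeline rather than the panel.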
But I just recently purchased a Sony A95K QD-OLED television, and holy cow the uniformity is just breathtaking. You start noticing the deficiencies in your own vision.
There's a similar panel available as a computer monitor, but unfortunately only curved and 1440p.
I've probably owned something like 15 monitors in the last 5 years. The XDR may not live up to the $25k reference monitor dreams, but no mere mortal would be able to drive one of those anyway.
Most people will tell you that their monitor is flawless, then you do simple tests like that one and the monitor shows that it has severe issues and they respond "uh I never noticed, well I don't care". Which is precisely why manufacturers can get away with the shit that they sell.
I've owned pretty much every "notable" monitor in the formats I care about in the last few years, I'm sure I'm pickier than you.
The fact is you can pay for a monitor that truly is flawless; it just costs more than people are envisioning. For example, my late-revision 5K UltraFine is nearly as flawless as the XDR. I didn't list it because people who don't know better latch onto the Wi-Fi teething issues the first revisions had, but the panel has about as little backlight bleed as the technology allows (and that limit is not as poor as people make out).
Honestly I've seen the opposite though: people who don't realize that any sufficiently large patch of screen, photographed with the exposure cranked, will show _some_ sort of pattern, and who confuse _that_ with "terrible backlight bleed".
But that's the panel equivalent of people who only watch Star Wars space sequences with brightness cranked to 11 in a pitch black room to judge HDR bloom...