
> most HDR content is too damn dark anyway.

It's not, though -- the problem is entirely with the players. Not just software players but also TVs.

There are lots of TV shows now that are available in 1080p SDR and in 4K HDR. When you play them both on any player or TV, they should have the same brightness in "regular" scenes like two people talking in a room. They're meant to. HDR is only meant to make a handful of elements brighter -- glowing lightsabers, glints of sunlight, explosions. HDR is never meant to make any of the content darker.

Unfortunately, far too many video players and televisions (and projectors) map HDR terribly. The TVs aren't actually meaningfully brighter, but the manufacturers want to advertise HDR support, so they map the wider HDR brightness range onto their limited brightness: lightsabers and the like get the full brightness, but everything else is way too dark. This is the disaster that makes everyone think HDR content is too dark.

The correct thing to do would be to display HDR content at full brightness so it matches SDR for normal scenes, and then let things like lightsabers and glints of sunlight just blow out to white -- in other words, to admit the hardware has brightness limitations and should never have been advertised as HDR in the first place.
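To make that concrete, here's a rough sketch in Python of the two mappings -- squeezing the whole mastered range into the panel versus matching SDR and clipping highlights. The nit values (a 300-nit panel, content mastered to 1000 nits) are made-up assumptions for illustration, not any manufacturer's actual tone-mapping algorithm:

  # Rough sketch, not any vendor's actual tone-mapping algorithm.
  # Assumed numbers: an "HDR" panel that tops out at 300 nits showing
  # content mastered to 1000 nits.
  PANEL_NITS = 300.0
  MASTER_NITS = 1000.0

  def squeeze_everything(scene_nits: float) -> float:
      # What too many TVs do: scale the whole mastered range into the
      # panel's range, so ordinary scenes land far below their intended level.
      return scene_nits * PANEL_NITS / MASTER_NITS

  def clip_highlights(scene_nits: float) -> float:
      # What I'm arguing for: show everything up to the panel's limit at
      # its intended brightness and let the highlights blow out to white.
      return min(scene_nits, PANEL_NITS)

  for label, nits in [("two people talking", 80.0), ("lightsaber glow", 900.0)]:
      print(label, squeeze_everything(nits), clip_highlights(nits))
  # two people talking: 24.0 vs 80.0 nits; lightsaber glow: 270.0 vs 300.0 nits

The dialogue scene is the entire problem: it was graded at 80 nits, and the squeeze mapping shows it at 24.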

So the problem isn't with HDR content. The problem is with HDR players, TVs, and projectors. It's mostly on the hardware side, with manufacturers that want to advertise HDR support when that's a lie, because they're not actually producing hardware that is bright enough for HDR.



Also, just to piggyback on this: every television I have ever purchased has been calibrated like HOT, SICK ASS from the factory.

Step ONE every time I buy a TV is to spend half an hour or so using tons of calibration images to get it dialed in to a sensible setting, and it's shocking how many of the settings need to be changed. Almost every TV comes calibrated to be used as a display unit in the store, which means contrast is rammed through the roof, brightness is way up, and they usually have about 15 different kinds of smoothing or frame interpolation going, "denoisers," color filtering for some insane reason... it's fucking ridiculous.

This is honestly why most of the time I buy cheap-ass off-brand TVs: there's just less bullshit to turn off when the TV has "fewer" features.


It's so true. Honestly I will never understand it.

Even just talking about SDR for the moment, films and TV shows have a lot of work put in so that their color, brightness, sharpness and contrast are just right.

And then modern TV settings apply all sorts of filters that just destroy it.

It's bizarre. I do the same thing you do in terms of calibration and it takes forever. All just to get it to act like a dumb monitor for my Apple TV. I just want it to display the signal and nothing more.

So many people are watching TV and films with weird color, weird contrast, and weird swimmy frame interpolation. And that's before you get to HDR, where it's weirdly dark on top of it all.

It genuinely makes me want there to be either a government regulatory body or else some kind of private-organization seal that guarantees you get a normal-color, normal-contrast, normal-brightness, normal-motion image as filmmakers and TV producers intend. You can have all the other options too, but either make them non-default, or provide a single-click "director's choice high-fidelity (TM)" option to undo all the garbage they add. Or something like that.

Because it's completely out of control.


Even every phone's pre-installed camera app has proto-face-recognition derived smoothing at a very fundamental level.

It's not even possible to turn off in some - you have to install a "dumb" camera app.


> Step ONE every time I buy a TV is to spend a half an hour or so using tons of calibration images to get it dialed in to a sensible setting, and it's shocking how much a lot of them need changed.

Eh. Usually you can flip it to movie + warm(2) + dynamic contrast off + enhancements off (think noise reduction) and you’re 90% there. You can get a smidge better picture with calibration images, but most midrange+ TVs come decently calibrated out of the factory these days. Not really worth the effort unless you go all the way and use a colorimeter.


The problem is finding all of those settings. They're all under different submenus (there's certainly no global "enhancements off" toggle), and even figuring out the "no filter" setting is often non-obvious.

E.g. on my Epson projector, there's a "sharpness" setting that goes from 1 to 10, and IIRC the only way to turn off sharpening is to set it to 3, because values 1 and 2 wind up applying a blur filter. It's not documented. You have to figure out the "3" value by trial and error with a calibration image.

Similarly you need to figure out whether each of your connected devices is outputting RGB 0-255 or RGB 16-235, and set the toggles for your TV to handle the correct input range.
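As a rough illustration (plain Python, standard 8-bit video numbers), the full-vs-limited range conversions look like this; apply the wrong one and blacks get crushed or grayed out across the board:

  # Rough sketch of 8-bit full-range (0-255) vs limited/video-range (16-235).
  # If the source and the TV disagree about which one is in use, one of these
  # conversions gets applied when it shouldn't (or skipped when it should),
  # and blacks are either crushed or turned gray.

  def full_to_limited(v: int) -> int:
      return round(16 + v * (235 - 16) / 255)

  def limited_to_full(v: int) -> int:
      # Clamp, since limited-range signals can carry sub-black/super-white codes.
      return min(255, max(0, round((v - 16) * 255 / (235 - 16))))

  print(full_to_limited(0), full_to_limited(255))    # 16 235
  print(limited_to_full(16), limited_to_full(235))   # 0 255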

And so forth. The calibration images aren't for figuring out perfect color accuracy with a colorimeter. They're for basic things: does a smooth gradient from black to white display cleanly, or are the blacks or whites being cut off? Is the gamma totally off when compared to a checkerboard? Are saturation gradients displaying normally, or does the image look over-saturated? Does a series of alternating black and white lines display as simple lines, or is there a crunchy halo around them (from sharpening) or blurring?
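If you want to generate that kind of basic pattern yourself, here's a quick sketch using numpy and Pillow (resolution and levels are arbitrary choices): a grayscale ramp plus alternating one-pixel black/white columns is enough to spot clipping, banding, and sharpening halos:

  import numpy as np
  from PIL import Image

  W, H = 1920, 1080  # arbitrary; match your display's resolution
  img = np.zeros((H, W), dtype=np.uint8)

  # Top half: smooth horizontal ramp from black to white. Banding, or flat
  # spots at either end, means brightness/contrast/gamma are off.
  img[: H // 2, :] = np.linspace(0, 255, W, dtype=np.uint8)

  # Bottom half: alternating one-pixel black/white columns. Halos or shimmer
  # here mean a sharpening or "enhancement" filter is still active.
  img[H // 2:, ::2] = 255

  Image.fromarray(img, mode="L").save("test_pattern.png")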

So it's not "eh" at all. In my experience it takes 15 to 30 minutes, as you hunt around the submenus to try to find all the relevant settings, and try to figure out what each setting even means since the documentation is useless. Like, on a power-saving display, should the main color setting be "Natural" or "Cinema" or "Bright Cinema"? You're going to have to experiment.


Where do you get the reference images, how do you display them on the tv, and to what do you compare them? I guess I should just search how to calibrate a tv.


These are my go-tos: https://www.eizo.be/monitor-test/ They don't cover everything and will be useless for HDR, but they're a good starting point IMO for SDR.


Doesn't your TV have a filmmaker mode which is pre-calibrated?


> HDR is only meant to make a handful of elements brighter -- glowing lightsabers, glints of sunlight, explosions. HDR is never meant to make any of the content darker.

I have to do some math to check whether this matters or not, but HDR can make dark scenes much better too. SDR video doesn't have nearly enough bit depth to represent anything dark well (it's not even a full 8 bits), whereas 10-bit HDR does. So I would expect to see more dark scenes in properly displayed HDR.
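To put rough numbers on it (a simplified sketch: plain gamma 2.4 with a 100-nit peak for SDR, the published ST 2084 PQ constants for HDR, full-range code values), you can count how many code values each format spends below 1 nit:

  def sdr_nits(code, bits=8, peak=100.0):
      # Simplified SDR: gamma-2.4 display with a 100-nit peak.
      return peak * (code / (2 ** bits - 1)) ** 2.4

  def pq_nits(code, bits=10):
      # ST 2084 (PQ) EOTF: code value -> absolute luminance in nits.
      m1, m2 = 2610 / 16384, 2523 / 4096 * 128
      c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
      n = (code / (2 ** bits - 1)) ** (1 / m2)
      return 10000 * (max(n - c1, 0.0) / (c2 - c3 * n)) ** (1 / m1)

  sdr_dark = sum(1 for c in range(256) if sdr_nits(c) < 1.0)
  pq_dark = sum(1 for c in range(1024) if pq_nits(c) < 1.0)
  print(sdr_dark, pq_dark)  # roughly 38 vs ~150 code values below 1 nit

Under those assumptions, roughly 38 of the 256 SDR codes sit below 1 nit versus roughly 150 of the 1024 PQ codes, so the PQ curve devotes far more of its precision to the darks.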

Also, in dynamic HDR you should be able to make a dark scene by encoding it brighter and using the metadata to darken it, though I don't remember if it actually works in practice or if anything does it.


I'm not really convinced by this argument.

Isn't the actual issue that SDR content tends to be mapped so that 0 is the darkest colour the TV can display and 1.0 is the brightest colour the TV can display?

When an HDR signal comes along, then of course the brightest element it can display isn't any brighter than 1.0 was in the SDR image. This has the effect of the scene being overall darker in order to give more headroom for brighter elements to be brighter.

But what is the other option? Have SDR content be dimmer than what the HDR monitor would otherwise be capable of displaying? I doubt you will get as many sales as your competition with that approach because the TV that renders SDR as bright as it can will always look better.

(I do understand that there are some caveats, since maximum brightness depends on how much of the screen is lit at once, but I don't think that's actually relevant to this argument overall.)


The real problem is that a 400-600 nit display is allowed to be called “HDR-certified.”

For “real” HDR you want at least 800 nits, ideally 1000. Otherwise you have to do what was described upthread: pin peak brightness at whatever your maximum nits is and make everything else very dark to keep the relative contrast intact.


I'm not an expert on the matter, but Apple went with 1.0 for full SDR brightness, so you have to query the maximum display brightness and explicitly use RGB values > 1.0 for HDR. This particular approach might not be a good idea for TVs, but this kind of mapping seems easy to reason about and just makes a lot of sense to me. And since it does limit brightness for non-HDR-aware apps and content, people figured out they can cover the screen with a multiplying overlay to boost the brightness up to the max.
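For what it's worth, a conceptual sketch of that scheme looks something like the following -- this is not Apple's actual API; the names, the 100-nit SDR white, and the 4x headroom figure are all assumptions for illustration:

  SDR_WHITE_NITS = 100.0  # assumption: what the display shows for a value of 1.0
  HEADROOM = 4.0          # assumption: display reports 4x headroom over SDR white

  def to_display_nits(component: float) -> float:
      # 1.0 = full SDR white; HDR-aware content passes values above 1.0,
      # clamped at whatever headroom the display currently offers.
      return min(component, HEADROOM) * SDR_WHITE_NITS

  print(to_display_nits(1.0))  # 100.0 -- same brightness as SDR content
  print(to_display_nits(6.0))  # 400.0 -- highlight clipped at the headroom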


If all the ways to play HDR content make it too dark, it's too dark.


But it's not all. Just a lot of the ways. Too many of them.

That's the point I'm making. You can buy expensive HDR displays that actually display correctly where it's not dark.

They're just expensive, and so most people aren't buying those ones.

So I will repeat, the problem is not with HDR content. It's with displays advertising themselves as HDR when they're simply not.


Maybe people should stop publishing HDR content if it worsens the viewing experience for almost everyone then?


I actually think that probably is the correct approach. It seems like studios are releasing their 4K Blu-rays in HDR by default now, and it is absolutely worsening the viewing experience for a large majority of consumers.

Somehow the transition from SDR to HDR has been messed up in a major way that has no equivalent I can think of. E.g. there was never any "step backwards" in going from 480p to 1080p or 4K, or from stereo to 5.1, or from h.264 to h.265.

But the rollout of HDR has been bungled catastrophically. The content is technically fine but the hardware side has been a disaster.

If I were the movie studios, I would seriously think about releasing 4K only in SDR until television manufacturers start showing HDR at full brightness again, and release software updates to fix it in existing models.


Maybe "almost everyone" should stop buying it?

Spatial audio is like this too.


Oh yeah, spatial audio (at least what iOS calls "spatial audio") is horrible. I disable it everywhere I can, but it has to be disabled in Control Center per app. Some apps stop playing audio when you open Control Center, making spatial audio seemingly impossible to disable there. There's not a single instance where I prefer having the audio sound like it's coming from the device.

However, like HDR, it can be difficult to avoid. My laptop screen supports HDR because the laptop is otherwise exactly what I want and it doesn't come in a non-HDR variant. My earbuds support spatial audio because I elected to get AirPods Pro for their other qualities and they don't come in a variant without it.

"Vote with your wallet" isn't really an option with things like this.


I thought that Spatial Audio is different from “audio coming from the device” — the former is better physical separation of sound, the latter adjusts left/right balance when your head turns. And the latter can be disabled globally for AirPods.


I guess the option I'm thinking of is "Spatialize Audio", not "Spatial Audio". My bad.

Anyway I've listened to music mixed for Apple Music with "Spatial Audio" too and it sounds like crap compared to normal stereo mixes IMO so I guess my opinion stands.

Where do you configure "Spatialize Audio" globally? I can't find anything in the Bluetooth settings (where they've hidden every other AirPods config).


It's one of the weights that you put into your purchasing decision. The phone that has all of my desired characteristics is kind of hard to find, and getting harder.

But, the net effect is still in the right direction, maybe.


So, are there any displays that do a decent job? How do you find them without investing a ridiculous amount of time?



