
I looked at 4K TVs recently, and if you look through the advertising they're all 4K at 30hz (ie, totally worthless for viewing anything on) due to the limitations of HDMI 1.4.


Do you consider theatre movies at 24hz "worthless"? Or is there some other difference?


One thing that comes to mind is that unless you can pull them down to 24Hz, it will really suck to watch a 24Hz movie on a 30Hz TV.


Movies rely on persistence of vision (or motion blur) to pull off the 24 FPS. I don't want motion blur in anything I use a computer for aside from games - and only then if implemented properly. I haven't seen a good implementation of that yet.


30hz is just fine for a lot of big Emacs windows and terminals!


I work all day on a 30Hz 4k tv. It works for viewing lots of text buffers.


30 Hz is bad for most PC content, but it should be fine for movies, which are mostly shot and distributed at 24 frames per second.


Mostly, yes, but the trend (set by the Hobbit movies) is heading towards higher frame rates (48fps and beyond). I imagine connector standards and graphics cards are racing to deliver that kind of performance at these resolutions.


This is not true. The Hobbit demonstration showed why 24fps is superior for Hollywood storytelling and movies will likely stay that way, possibly for decades.


Some reviewers seem to have that opinion, but many, many people use the high-frame-rate simulators (motion smoothing) on their home TVs. I've seen the first two Hobbit movies in 48 fps and wish everything was there or higher. Hell, there are projects like SVP [1] to get video smoothed on your computer. People LIKE higher frame rates.

[1] http://www.svp-team.com/


Many people have the high-frame-rate turned on by default and don't know it. That's not the same as preferring the look.


The Hobbit demonstrated that people have different opinions and react differently to change. I saw The Hobbit in both formats, and I thought that the 48 FPS presentation was vastly superior to the 24 FPS one.


One movie does not make a trend.


I mainly play games rather than watching filmed content, which it seems makes my use case a little bit more sensitive than other people's.


Right, but watching a 24hz source on a 30hz screen is pretty bad - the technical term is "judder" if you want to learn more.
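To see why, count how many refresh intervals each film frame occupies: 30/24 = 5/4, so three frames in a row get one refresh each and every fourth frame gets two. A toy Python sketch of that arithmetic (illustrative only, not what any actual TV runs):

  # hold pattern for 24fps content on a 30hz display:
  # frame i ends at refresh number floor(i * 30 / 24)
  def cadence(content_fps, display_hz, frames=8):
      counts = []
      for i in range(1, frames + 1):
          end = i * display_hz // content_fps
          start = (i - 1) * display_hz // content_fps
          counts.append(end - start)  # refreshes spent on frame i
      return counts

  print(cadence(24, 30))  # [1, 1, 1, 2, 1, 1, 1, 2]

That uneven 1-1-1-2 hold pattern is exactly what shows up as judder in slow pans.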


That depends on the screen technology. 24p looks great on plasmas. I'm pretty sure modern LCDs refresh at a multiple of 24, and can natively refresh at 24 FPS when they detect 24p content. No idea about this specific TV though.


I don't think you're very familiar with any of this technology.

  24p looks great on plasmas.
We're talking about the drawbacks of a 30hz refresh rate here. There are no 30hz plasma displays. They run at more like 600hz.

  I'm pretty sure modern LCDs refresh at a multiple of 24, and can natively refresh at 24 FPS when they detect 24p content
Again, that's pretty far removed from anything we're talking about.

Yes, consumer LCDs often run at 120hz these days and yes, one reason for this is so that they can display 24p content (120 is a multiple of 24) without judder.

If Dell had managed to make the world's first 5120x2880 LCD monitor that runs at 120hz or higher you can be sure they'd be trumpeting that fact.


I think the confusion is that I'm assuming that a 30Hz display might also be able to refresh at 24Hz. I am familiar with consumer televisions, but not with this particular TV. But I see no inherent reason why a 30Hz display cannot also refresh at 24Hz, especially when the 30Hz limitation seems to be due to the display connector's bandwidth.


I apologize for my tone! You made a totally reasonable assumption, but LCDs don't work that way.

A 120hz LCD always refreshes at 120hz, essentially[1]. But it can easily display 60hz content - it just displays each frame twice in a row. Same with 30hz (display each frame four times in a row) or even 24hz (display each frame five times in a row) content.

But suppose you were, say, trying to display 119hz content on a 120hz monitor. You'd have to display each of the first 118 frames for one LCD refresh interval, and then display the 119th frame for two intervals, which causes judder. Or you could repeat frames in a fixed, uneven cadence (this is "pulldown"), or blend neighboring frames together to simulate a true 120hz source (interpolation).
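The divisibility rule as a rough sketch (under the same simplifying assumption as above, that the panel always scans at its native rate):

  # even division = every frame held the same number of refreshes
  def refreshes_per_frame(panel_hz, content_fps):
      if panel_hz % content_fps == 0:
          return panel_hz // content_fps  # uniform hold: no judder
      return None  # uneven: judder, a pulldown cadence, or interpolation

  print(refreshes_per_frame(120, 24))   # 5
  print(refreshes_per_frame(120, 60))   # 2
  print(refreshes_per_frame(120, 119))  # None -> one frame doubles up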

  But I see no inherent reason why a 30Hz display cannot also
  refresh at 24Hz, especially when the 30Hz limitation seems to 
  be due to the display connector's bandwidth.
If Dell's 5K monitor has a 120hz refresh rate, it could indeed display 24p content smoothly because, as you say, the display connector's bandwidth would be sufficient.

But Dell's 5K monitor almost certainly is a 60hz panel (else they would have trumpeted the fact that it was 120hz!) and 60hz isn't evenly divisible by 24, so you get judder or have to do pulldown.
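For the curious, the classic pulldown case is 24p on a 60hz panel: 60/24 = 2.5, so frames are alternately held for 3 and 2 refresh intervals ("3:2 pulldown"), which averages out to 2.5 but is still slightly uneven. A quick sanity check:

  # 3:2 pulldown: 24 frames held 3,2,3,2,... refreshes fill exactly
  # the 60 refresh intervals of one second on a 60hz panel
  holds = [3 if i % 2 == 0 else 2 for i in range(24)]
  assert sum(holds) == 60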

_____

[1] nVidia's G-Sync is one possible solution: http://www.anandtech.com/show/7582/nvidia-gsync-review


For most 2014 TVs, it's not a limitation on the TV because they also support HDMI 2.0, but a limitation on the source.

What I don't understand is why these TVs don't have a DP input so we can use them at 60fps with modern PCs (and GPUs already on the market).


  What I don't understand is why these TVs don't have a DP input so we can use them at 60fps with modern PCs (and GPUs already on the market).

Probably because every TV maker / rebrander and their mother already has both an HDMI license and the tooling set up. There's basically zero cost to keep using HDMI. Moving to DP needs licensing and retooling.


As far as I know using DP does not require a license except for using the DP logo.


  4K at 60hz (ie, totally worthless for viewing anything on)

What's wrong with that? Sounds lovely!


looks like it was a typo, parent was edited


I'd prefer 120 hz, at least for games and UI animations.


Sony's 2014 screens all support 3840x2160/60p, HDMI 2.0. Bit laggy for mouse work though.



