
> Hi-Dpi displays are the norm these days for third party

Are they? Nearly every monitor in my company's mainly Mac-based office is a 1080p Dell. All the monitors I personally own are 1x.

I would even go so far as to say that the majority of people who want to buy a monitor for doing text-based things (i.e. business) will buy a bog-standard monitor.



It's taken a while partly because of cost and perhaps a bit more because of the horrible ways Windows and Linux deal with HiDPI. You wouldn't want heterogeneous DPIs or non-integer scales on those platforms. On Linux, heterogeneous DPI still seems very experimental and ugly. On Windows, some apps are buggy and others are ugly when dealing with heterogeneous DPI. Non-integer scaling on Windows does actually work, but it makes some apps size things horrifically. Needless to say, Microsoft's multiple approaches to DPI scaling have made a mess, and Linux never really had a unified way of dealing with it.

If you're on macOS, with the current price of HiDPI IPS displays, the time is right to grab just about anything. If you're on Windows or Linux, it's still a great time as long as you keep all monitors at the same DPI and probably at an integer scale.


I'd kill for a 4K 100 Hz MacBook display in the current 15-inch form factor.


Heterogeneous DPI works fine on GNOME3/Wayland these days. I had a 4K laptop running at 150% (it was a tiny screen) and it was perfect.
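
For what it's worth, on some setups the per-monitor fractional scaling still has to be switched on first via mutter's experimental flag (at least it did on mine; this may differ by distro and GNOME version):

    gsettings set org.gnome.mutter experimental-features "['scale-monitor-framebuffer']"

After that, each monitor gets its own scale setting under Settings > Displays.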


The only HiDPI monitor I'm willing to use is $2000. I consider 60 Hz to be unacceptable for work.


>I consider 60 Hz to be unacceptable for work.

Well, the vast majority of people consider 60 Hz totally acceptable for work, and work fine with it.

For the huge majority of people, a refresh rate above 60 Hz isn't even a concern.

Better resolution, on the other hand, is a marked improvement (either as more screen real estate or as finer, retina-style detail).


You're absolutely right, but I'd wager it's mostly because they haven't been exposed to 120 Hz yet. The moment Apple introduces a 120 Hz screen on its iPhones, people are going to want it everywhere. Much like HiDPI displays.


They have it on the iPad IIRC.

But isn't that more for quick-moving stuff, like games or (on the iPhone) quicker visual response to pen input and such?

For regular computing (reading webpages, editing stuff, programming, watching movies) I don't see much of a difference.

Heck, movies are 24p and we're fine with it.


I absolutely love scrolling on 120 Hz displays. It feels so much more natural when the letters aren't blurry as they move under your fingers. Indeed, the iPad Pros have the feature, but they aren't nearly as popular as iPhones. I tried it on the Razer Phone, and I can't wait to have it on mine.


100-200 Hz displays are the next logical step for laptops and mobile phones.

Check out iPad Pro devices with 120 Hz display. It makes a big difference in readability of scrolling text (try to read a web page while it's scrolling), responsiveness and smoothness of motion.


And even if you got that monitor, would a MacBook Pro be able to drive it at its refresh rate?


These days even mobile phones can drive external monitors at 4K 60 Hz. I think it's reasonable to expect next-gen MacBook Pros to be able to drive two 4K monitors at 120 Hz+.
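
Back-of-the-envelope, as a rough sketch assuming 8-bit colour and ignoring blanking intervals and link-encoding overhead:

    # Raw pixel bandwidth for one 3840x2160 monitor at 120 Hz, 24 bits per pixel.
    width, height, refresh_hz, bits_per_pixel = 3840, 2160, 120, 24
    gbit_per_second = width * height * refresh_hz * bits_per_pixel / 1e9
    print(round(gbit_per_second, 1))  # ~23.9 Gbit/s per monitor

If I have the numbers right, DisplayPort 1.4 carries roughly 25.9 Gbit/s of usable bandwidth per link, so two such monitors would need either two links or Display Stream Compression, but it doesn't seem far-fetched.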


Looks like the 2018 model still only supports 4K at up to 60 Hz [1].

[1] https://www.apple.com/lae/macbook-pro/specs/


And the majority of people (and businesses) won't care.

The ones that do already have "Hi-DPI terminal windows and really sharp text" as an almost-mandatory requirement.

Think about it: sub-pixel rendering is only about text (AFAIK it's just text, not images with subpixel positioning? although that's an incredibly interesting idea for a hack!). Maybe subpixel vectors/lines?

Either you care about high-quality lines or you don't... What Apple's basically doing here is moving the cost to the hardware instead of the software, and the win is a simplified, faster, less power-hungry device.

I'm not personally agreeing with Apple's decision; I'm responding to your disbelief that "hi-dpi is the norm". It's the norm for anyone who primarily uses an iPhone. It's the norm for anyone using modern MacBooks (don't make me source my dates on retina displays). It's the norm for "executive" users (i.e. software developers, graphics users, finance, book editors, etc.).

If you're happy w/ a bog-standard 1080p output device, then obviously you're not "in search of" the best quality text you can get.

However, thinking it through, I would _really_ like a "subpixel mode" then, either for the OS as a whole (kind of like FSAA?) or especially for certain windows. Maybe there's a way to do that as a hack?
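
To make the idea concrete (purely illustrative, not how macOS actually implemented it): rasterise the glyph coverage at 3x horizontal resolution, run a small filter across neighbouring subpixels to hide the colour fringes, and then treat each consecutive R/G/B triple as one output pixel. A toy numpy sketch, with a made-up helper name:

    import numpy as np

    def subpixel_downsample(coverage_3x):
        """Collapse a grayscale coverage map rendered at 3x horizontal
        resolution into an RGB image, one colour channel per LCD subpixel.
        coverage_3x: 2-D float array in [0, 1], width divisible by 3."""
        h, w3 = coverage_3x.shape
        assert w3 % 3 == 0
        # 5-tap FIR filter over neighbouring subpixels to tame colour fringing
        # (FreeType's LCD filtering uses 5-tap filters in the same spirit).
        kernel = np.array([1, 2, 3, 2, 1], dtype=float) / 9.0
        filtered = np.apply_along_axis(
            lambda row: np.convolve(row, kernel, mode="same"), 1, coverage_3x)
        # Each consecutive (R, G, B) subpixel triple becomes one output pixel;
        # black text on white, so coverage 1.0 maps to channel value 0.0.
        rgb = filtered.reshape(h, w3 // 3, 3)
        return 1.0 - np.clip(rgb, 0.0, 1.0)

Doing that OS-wide as an FSAA-style mode is the hard part: the compositor would have to know each panel's subpixel order and orientation, which is exactly the kind of special-casing Apple seems to be getting rid of.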


>If you're happy w/ a bog-standard 1080p output device, then obviously you're not "in search of" the best quality text you can get.

True, but not being in search of the best quality doesn't mean that any quality will do.

I've been happy with text rendering on my bog-standard monitor so far. I'm not sure I will still be happy after this degradation.

But it seems my current Mac Mini is the last of its kind anyway, so I'm going to have to move on from Apple.


>I'm not personally agreeing with Apple's decision, I'm responding to your disbelief that "hi-dpi is the norm". It's the norm for anyone who primarily uses an iPhone. It's the norm for anyone using modern MacBooks

That's Apple's market, not people holding on to 10-year-old computers. Not even those that still buy the MBA.


> Are they? Nearly every monitor in my company's mainly mac based office is a 1080p dell. All the monitors I personally own are 1x.

They are not. But I guess the parent poster's point is that Apple requires its customers to spend on the latest hardware instead of supporting old tech.

Basically, it's saying: if you can't afford this, don't buy it. That's fair, given that Apple products are already marked up in price.


A 1x monitor is only acceptable for gaming, in my opinion. I can't believe people are working on $200 screens. I mean, you can also code on an 80-char terminal in theory, but why would you?


A 16:9 monitor is only acceptable for watching movies, in my opinion. I can't believe people are working on TV screens. I mean, you can also code on a Commodore 64 in theory, but why would you?


Why? I find it useful to have two pages or terminals side by side. 16:9 matches the human FOV pretty well, which is one reason it's so popular.


More lines of text.

Historically, 16:10 monitors have had the same width as 16:9 ones (e.g. 1920×1200 vs 1920×1080, i.e. 120 extra rows of pixels), so there's no difference as far as having things side by side goes.


I know right, can you believe some people pay less than $3,000 for a basic workstation?


I think you're being sarcastic, but I honestly can't. I recently bought a new workstation with a $3k budget, and I regret going with a laptop because the budget wasn't enough to get both an adequate laptop that will still perform well in 3-5 years and nice monitors.


I was being sarcastic, but also not taking into account currency conversion rates (and possibly the availability of stuff in your country).

Regardless of that, if all you're doing is General Business™ then surely all you need to do is grab an all-in-one box, shove as much third-party RAM in it as will fit, and then nab a couple of monitors and an unbranded mechanical keyboard off Amazon.

I did a lot of work on a £700 (~$900) OptiPlex with 16 GB of RAM, and it was more than capable of running Ubuntu on 3 monitors while being small enough to fit into a handbag.


It's not that a lot of people have a choice.


It's not like the people who don't have a choice are the ones spending $2k and $3k on Macs.



