Hacker News

Actually, the "sharpness" control on any type of monitor applies a deliberate distortion that creates the impression of a sharper image; the image isn't actually sharpened at all. It works much like unsharp masking in photography. Particularly on an LCD monitor with a digital video connection, a sharpness setting of 0 is absolutely the way to go.
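The unsharp-mask trick is simple to sketch: exaggerate the difference between a signal and a blurred copy of itself, which produces over/undershoot "halos" at edges that the eye reads as sharpness. A minimal 1-D version in pure Python (the box blur and the amount/radius parameters are illustrative choices, not how any particular monitor implements it):

```python
def box_blur(signal, radius=1):
    """Blur by averaging each sample with its neighbours."""
    n = len(signal)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def unsharp_mask(signal, amount=1.0, radius=1):
    """sharpened = original + amount * (original - blurred)"""
    blurred = box_blur(signal, radius)
    return [s + amount * (s - b) for s, b in zip(signal, blurred)]

# A hard edge from dark (0.0) to bright (1.0):
edge = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]
print(unsharp_mask(edge))
```

Running this, the samples next to the edge undershoot below 0.0 and overshoot above 1.0. No real detail is added, only a halo around the transition, which is why it's a distortion rather than genuine sharpening.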


I just played with the sharpness control on an HP Compaq consumer 24" display, and the lower sharpness settings seem to actively blur the image. Why would you possibly ever want that?


I believe that if you peer at the individual pixels with a strong magnifying glass, you will see there is no blurring going on.

If you're using a recent version of Windows properly set up for an LCD display, it uses sub-pixel rendering, a resolution-enhancement technique (Microsoft calls it ClearType) that increases the effective horizontal resolution of text. Although it can make text look slightly soft, most people agree that it's also easier to read than text that hasn't been enhanced this way.
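The idea behind sub-pixel rendering: each LCD pixel is three separately addressable colour stripes (typically R, G, B, left to right), so a glyph can be sampled at triple horizontal resolution and one sample assigned to each stripe. A toy sketch of just the packing step (real ClearType also applies a filter to suppress colour fringing, which this omits; the sample row is made up):

```python
def pack_subpixels(coverage):
    """Pack a row of 3x-horizontal-resolution coverage samples (0.0-1.0)
    into (r, g, b) pixels, one sample per colour stripe."""
    assert len(coverage) % 3 == 0
    return [tuple(coverage[i:i + 3]) for i in range(0, len(coverage), 3)]

# A glyph edge sampled at triple horizontal resolution:
row = [0.0, 0.0, 0.0, 0.0, 0.5, 1.0, 1.0, 1.0, 1.0]
print(pack_subpixels(row))
```

The nine samples collapse into three pixels, but the edge position survives at sub-pixel precision inside the middle pixel. A fake "sharpness" filter applied downstream by the monitor operates on whole pixels and can't see that structure, which is how it ends up fighting the rendering.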

If your LCD is anything like mine, the fake "sharpness" it adds can undo the benefits of the sub-pixel rendering. Maybe that's what you're seeing.


Many displays do indeed blur the image at the lower end of the sharpness scale, as mseebach notes, and their 'no alteration' point is then normally about 10% up from the bottom. This is easy to observe on edges that aren't text, even when the text itself is subpixel-rendered.



