Sony FW900: the last, best CRT ever made (that was affordable).
23 inch 16:10 CRT, 1920x1200 - weighs nearly 100 pounds and draws 150 watts if I remember correctly.
Originally like $2000, down to $300 at the end (refurbished).
Had variable phosphor pitch, denser at the corners, and an internal CPU that adjusted the corners to correct for the (Earth's?) magnetic field.
Only went to LCD when mine finally died, nobody could repair it, and getting another was out of the question because shipping prices had gone through the roof.
There is a huge fan thread on them in [H]ardForum with lots of photos.
Absolutely my favorite resolution, and it's what's currently set up on my HP ZR24w, as well as deliberately defocused a couple of notches to fuzz the text up a bit.
Compared to the $2000 and $10,000 CRTs, it's a great monitor, with an IPS panel, a standard sRGB color gamut, and a price that's cheap for an IPS panel, about $380 right now.
Yeah, I'm happy to be programming now, though 6 years ago was a great time for 'if you can haul it, you can have it' deals on crts.
> Absolutely my favorite resolution, and it's what's currently set up on my HP ZR24w, as well as deliberately defocused a couple of notches to fuzz the text up a bit.
Actually, the "sharpness" control on any type of monitor applies a deliberate distortion to the image that produces a sensory effect that gives an impression of a sharper image. The image isn't actually sharpened at all. It works much like unsharp masking in photography. Particularly on an LCD monitor with a digital video connection, a sharpness setting of 0 is absolutely the way to go.
I just played with the sharpness control on an HP Compaq consumer 24" display, and the lower sharpness settings seem to actively blur the image. Why would you possibly ever want that?
I believe that if you peer at the individual pixels with a strong magnifying glass, you will see there is no blurring going on.
If you're using a recent version of Windows properly set up for an LCD display, it does use sub-pixel rendering, a resolution-enhancement technique (Microsoft calls it ClearType) that increases the resolution of text. Although it can make text look slightly soft, most people agree that it's also easier to read than text that hasn't been enhanced this way.
If your LCD is anything like mine, the fake "sharpness" it adds can undo the benefits of the sub-pixel rendering. Maybe that is what you are seeing.
Many displays do indeed blur at the lower end of the sharpness scale, as mseebach notes, and their 'no alteration' point is then normally about 10% from the bottom. It can be easily observed on edges that aren't text, even if text is subpixel rendered.
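Since sub-pixel rendering keeps coming up: the core trick is simple enough to sketch. This is a toy version assuming an RGB-striped panel and a hypothetical glyph coverage map already rasterized at 3x horizontal resolution; real ClearType also filters across neighboring samples to tame color fringing, which is omitted here.

    #include <stdint.h>

    typedef struct { uint8_t r, g, b; } rgb8;

    /* cov: (w*3) x h coverage samples, 0 = no ink, 255 = full ink.
     * out: w x h pixels, black text on a white background.
     * Each output pixel's R, G and B channels are driven by three
     * adjacent coverage samples, so the horizontal resolution of
     * text effectively triples on an RGB-striped LCD. */
    void subpixel_render(const uint8_t *cov, rgb8 *out, int w, int h) {
        for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++) {
                const uint8_t *s = &cov[(y * w + x) * 3];
                out[y * w + x].r = 255 - s[0]; /* leftmost stripe: red   */
                out[y * w + x].g = 255 - s[1]; /* middle stripe: green   */
                out[y * w + x].b = 255 - s[2]; /* rightmost stripe: blue */
            }
    }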
Yes, exactly. For example, if you use 1680x1050 on an MBP's 1920x1200 screen, the text takes on a really nice look, similar to using a soft pencil.
I was totally stoked to find the HP can simulate that really nicely by turning the sharpness adjustment down, without needing to back down the displayed resolution.
Definitely interesting to wonder exactly how you defocus an LCD though.
Crazy hot was the bonus of the FW900 - three cats could sleep on top of that beast during the winter - I just put some cardboard on top so their fur wouldn't get in and they'd sleep up there and roast.
Sony had some great CRT monitors. I had two IBM P275s (Sony G520); I ran them at 2048x1536@75Hz and they worked great for years. The colors/black levels were amazing.
Eventually they started getting washed out (no black); I was able to fix it using a 10-megohm (1/4 watt) resistor.
After a year or so the issue came up again and I decided to get two Lenovo L220xs to replace them. While the colors are quite good on the Lenovos (calibrated, of course), I miss the resolution and original black levels of the P275s.
I had the same monitor and the same color washout problem 7 years ago. I also did the resistor bridge (using a network of resistors; I didn't have a 10-megohm one, but a bunch of smaller ones). That afternoon project really tested my mettle. Up 'til then I had the hands of a surgeon when it came to soldering, but all the scary death warnings and the thought of losing an $800 monitor scared the bejeesus out of me.
I had one of those FW900's. Unfortunately the other half got fed up with me lugging immense bits of kit around so I reluctantly gave up. Now I've got a standard 23" 1080p TFT :(
It's really sad how it seems we're stuck with the 1080p craze for monitors; we've made negative progress in this area, and it's becoming really hard to find higher resolutions. Which is weird, because computer marketing is normally all about bigger numbers, to absurd levels, but the HDTV crap has apparently trained everybody that 1080p is the ultimate in video. Luckily mobile seems to have dodged the HDTV bullet, and there they're competing on DPI.
I think the 1080p craze is driven by consumer demand. For the average consumer (not a gamer, not a power user), the ultimate visual experience is an HD movie. There is no need for better quality than HD (1080p), because you won't find a source better than that, so the most sensible thing for a hardware provider is to deliver 1080p at the cheapest point possible. Or to improve on other areas than purely pixel count (colour accuracy, brightness, viewing angle...)
It's hard to keep in mind, but the majority of buyers, and therefore the majority of income, may not be like you.
Computer monitors are useful for much more than just full screen video. Even average users, working say a browser side by side with Word, would appreciate the extra real estate. Notice that Apple's monitors are now 2560x1440.
In the consumer's mind, there is no better quality than HD because it doesn't exist. Look at the way marketing speaks to the average consumer. In marketing, perception is reality.
You may know what 1080p means, but most consumers have no clue what the '1080' or the 'p' means, or how it relates to their viewing experience.
I think a lot of it is placebo, too. I bet people convince themselves they're enjoying things more because they consciously know they're getting "higher quality". It would be interesting to do a double-blind study to see how many consumers could tell the difference between 720p and 1080p.
At the distance from the screen at which people generally watch TV, resolution better than HD makes no perceptible difference; it would be pointless and wasteful. At closer distances, such as those involved in using computers, higher resolutions are perceptible and so not irrelevant. Hence they exist for computers and not TVs. Sorry, no reference right now; may add one later.
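A back-of-the-envelope check of that claim (the distances and screen sizes here are illustrative guesses, not measurements; ~1 arcminute is the figure usually quoted for 20/20 acuity):

    #include <math.h>
    #include <stdio.h>

    /* Arcminutes subtended by one pixel of a 16:9 screen. */
    static double pixel_arcmin(double diag_in, int hres, double dist_m) {
        const double PI = 3.14159265358979323846;
        double width_m = diag_in * 0.0254 * 16.0 / sqrt(16.0 * 16.0 + 9.0 * 9.0);
        double pitch_m = width_m / hres;
        return atan2(pitch_m, dist_m) * (180.0 / PI) * 60.0;
    }

    int main(void) {
        /* ~0.66 arcmin: already at the acuity limit, extra pixels wasted */
        printf("50\" 1080p TV at 3 m:        %.2f arcmin/pixel\n",
               pixel_arcmin(50.0, 1920, 3.0));
        /* ~1.6 arcmin: well above the limit, more pixels would help */
        printf("24\" 1080p monitor at 0.6 m: %.2f arcmin/pixel\n",
               pixel_arcmin(24.0, 1920, 0.6));
        return 0;
    }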
I'm just glad the days of 0.36 dot pitch 15" monitors are over. My first one was part of a "Packard Bell" system. Damn, I'm old. It was so round and so blurry, it was like looking out the porthole of a submarine.
I actually worked at one of those little mom-and-pop computer repair stores that sprang up everywhere in the early '90s. My computer was made of the detritus of Pack-Bell systems that came in not worthy of repair. Their dismal quality worked in my favor: an endless stream of zombie boxes to frankenstein into free stuff for me.
Generally desktop operating systems are not ready for high dpi displays. Mobile was able to move faster in this area, but I expect the next OS generation will facilitate decoupling content size from dpi, particularly for windows.
I'm not sure what you are referring to about monitors. Two of the main players in higher end monitors, Dell and Apple, make their large monitors in higher resolutions than 1080p, such as 2560 x 1440.
I have the Dell U2711 and I love it, but I would be hard-pressed to call either of these monitors costing around $1000 anything but luxury items. Most of the "standard" 21"-24" monitors in the $150-$300 price range are 1080p.
That document shows an astounding rate of progress. It's when you read a document like that that you realise just how impossible it is for a team of mediocre developers to compete with a great team (or, in this case, a single great developer).
Ugh. Dear intarweb, please stop trying to outsmart safari on iOS. It works just fine as-is. We don't need your fancy-pants JavaScript-based paging implementations.
That's Onswipe, a company founded on the idea that mobile websites should consume as much CPU, RAM, and network resources as possible, because your time isn't valuable, your battery is overcharged, and you have unlimited bandwidth.
An anti-pattern I've noticed recently for "mobile aware" sites: on your mobile device you visit a URL for some page on a site, the site recognizes you are using a mobile browser, and then completely unhelpfully redirects you to the main page of the mobile version of the site. Considering that most mobile devices made in the last 3 years or so are fairly capable of rendering a decent experience from the "desktop" version of a web page, this behavior is decidedly annoying and generally worse than just doing nothing.
There's an atrocious WordPress theme for iPad out there too. The iPhone one is okay, though. Apparently some WP installs default to those mobile versions when browsed from iOS, and I'm pretty sure some blog writers are unaware of it, as I can't for the life of me see some of them actually shoving that crap down their readers' throats.
"I wonder what Carmack uses now? Whatever it is, he could probably have several of them hooked up to a machine each running at 1920 x 1080 and still come nowhere near close to drawing 180 watts."
That's a little optimistic. The 27" and 30" TFTs which are becoming increasingly commonplace consume upwards of 100W, at least at or near full brightness, so you'd only need about two to hit 180 watts.
I guess when you're talking about the 27" and 30" displays that's the case, but 24" seems to be where the very low power use can be seen. LG has a display it claims only consumes 28 watts, so an array of 6 of those and you'd still have some watts to spare on the InterView.
That's true for the U2410. The new 24" Ultrasharp (U2412) uses an LED backlight and has a typical power draw of 38W. Contrary to popular belief, LED backlighting doesn't lead to a better picture. It is significantly more efficient than CCFL backlighting though.
On a sidenote, and apart from the gains in efficiency, the new Ultrasharp doesn't seem like much of an upgrade, at least at first glance. What a shame/thank god I don't have to buy a new one.
I've got an Acer monitor at work; it claims to draw only 20 watts in use with the default settings, which turn the backlight down quite a bit. I turned it to full brightness!
This would bring back some expensive memories for a few on HN I'm sure. I remember, as a multimedia trainee in around 1996-97, buying a 21" NEC CRT for $2,200 second hand. I couldn't give it away today, so it sits in the shed along with a few other CRTs.
Seeing Frank Pritchard's CRT "sculpture" in Deus Ex: Human Revolution certainly made me think...
Speaking of which, what is the highest resolution monitor available today that isn't outrageously expensive? Apple's 2560x1440 Thunderbolt/Cinema display is nice. Any WQUXGA (3840x2400) monitors available like Toshiba's $18000 one[0] but that don't come with a "medical imaging" price tag?
Currently no; waiting for custom LFH-60 <-> Dual-Link DVI cables.
A pair of these cables will allow one Radeon 6750 to power a T221 at full 3840x2400@48Hz resolution as two 1920x2400@48Hz displays in an Eyefinity configuration if you hack some EDID values. I chose the Radeon 6750 because it's the cheapest AMD card that has two dual-link DVI ports.
I chose AMD because Eyefinity supports framelock on a single card under Windows 7, whereas it seems like nVidia Surround only supports framelock across more than one card (which is inconvenient).
Framelock is necessary because Windows 7 doesn't have built-in support for desktop spanning across the two 1920x2400 displays the way XP did. (I believe this isn't an issue under Linux either.) Without framelock support you'd get display tearing down the middle of the full 3840x2400 image.
It's a pity the nVidia route isn't an option: the AMD Catalyst drivers aren't as good as nVidia's (mouse cursor corruption in 2011? WTF, AMD...), and since the T221 was originally driven by nVidia cards, you'd think the engineering design is there.
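For anyone attempting the EDID hack mentioned above, the one step you can't skip is fixing the block checksum, or the driver will reject your edited values. A minimal sketch; the patching of the actual timing bytes is left out since the exact offsets depend on your display:

    #include <stdint.h>

    /* Each 128-byte EDID block must sum to 0 mod 256, with byte 127
     * acting as the fix-up value. */
    void edid_fix_checksum(uint8_t block[128]) {
        uint8_t sum = 0;
        for (int i = 0; i < 127; i++)
            sum += block[i];              /* uint8_t wraps, i.e. sum mod 256 */
        block[127] = (uint8_t)(0u - sum); /* make all 128 bytes sum to 0     */
    }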
Why is a 1080p monitor for 1995 "amazing"? It was quite common for 21" monitors to be 1600x1200, and 1920x1080 isn't a giant leap from that. I picked up a cheap 21" CRT capable of 1600x1200 in the late 90s and I'm no John Carmack.
I think it's because that sounds amazing to average consumers, who were lucky to have 1024x768 on a dot pitch better than 0.28, but why is this on geek.com? Shouldn't most of their readers remember having large, heavy monitors in the 90s? Really makes me wonder what Matthew Humphries (author of that story) was using in 1995.
CRTs weren't like LCDs; the image didn't get pushed off the side if you set the resolution too high. You could pretty much push them as far as they could go, until you couldn't read it anymore or until it became all vertical lines. Ah, the good ole days...
I had a SGI 1600SW LCD and a Sony FW900 CRT on my desk in 1998. That was a bitchin' desktop setup back then. The SGI required this special graphics card from Number Nine, which ended up going bankrupt, thus ending driver availability. Even by today's standards, the quality of the SGI display was outstanding.
Still, one of the most expensive displays I've seen belonged to a dorm mate of mine in 1993. He had a 20" CRT (ViewSonic, maybe?) that was connected with four component cables to a Matrox video card. I'm sure it would be laughable today, but damn, in 1993 that thing was unbelievable.
The glasses are 3D LCD shutter glasses. They sync the shutter rate to the video output using the box on top of the monitor. This is an old-school version of what the 3D TVs are doing now. We had a whole lab set up with active 3D at work and it was truly amazing. http://en.wikipedia.org/wiki/Cave_Automatic_Virtual_Environm...
In the 1990s it seemed like every day there was some new Virtual Reality device, or at least talk of one; people were waiting for their own holodeck, but it never arrived and VR faded away.
Interesting that geek.com says the picture is from 1995, while the article you've linked here says the monitor was newly announced in May of 1997. I'm guessing geek took some liberties with the details.
EDIT: Not that it matters...just something I noticed.
Not sure about the machine but according to Romero[1] Quake (as with Doom) was developed on NeXTSTEP 3.3, which they continued to use, later running on Intel hardware, until 1996.
Would that have been running NT back then, perhaps the MIPS version?
The environment does look like Visual C, 1.0 of which came out somewhere in the 94-95 time-frame if I'm remembering it correctly, though it might have been just a tad earlier.
I bought one of these machines around that time for a design company I was interning at. It's an Intergraph; obviously I have no idea which model, since these machines were built to order and it could have anything in there. That multimedia keyboard brings back memories: prolly the worst $100 I ever spent.
This machine would be running WinNT 3 or 4, most definitely not the MIPS build.
Heck, I spent my first four years programming on a 25*40 screen displayed on a crappy old TV set... When I finally got 80 columns, it seemed like magic.
I had an Intergraph machine like that back in the day. Same giant case, except in blue. I think that one was dual Pentium II, 400MHz, running Windows NT 3.5 or so. Came with a fast array too and a similar keyboard.
They were very expensive. A typical high end CRT monitor might be 19-21", with the ultra high end being 24". 1600x1200 @75Hz refresh was what you wanted, but you usually had to use lower color depth to achieve it.
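The color-depth tradeoff was mostly a framebuffer-size problem: the frame had to fit in video RAM. Quick arithmetic, assuming a typical 2 or 4 MB card of the era:

    #include <stdio.h>

    int main(void) {
        const int w = 1600, h = 1200;
        const int depths[] = { 8, 16, 24 };   /* bits per pixel */
        for (int i = 0; i < 3; i++)
            printf("%dx%d @ %2d bpp: %.2f MB\n", w, h, depths[i],
                   (double)w * h * depths[i] / 8.0 / (1024.0 * 1024.0));
        /* 8 bpp: 1.83 MB, 16 bpp: 3.66 MB, 24 bpp: 5.49 MB.
           So at 1600x1200 a 2 MB card was stuck at 256 colors, and even
           a 4 MB card couldn't do 24-bit. */
        return 0;
    }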
It's a frame capture from a Doom 3 promotional trailer. In fact, I was the one who captured it and uploaded it to Wikipedia in 2004; it's nice that this article calls it a "classic shot" (that's what I thought too when I saw the scene in the video the first time). With all credit to the actual photographer, of course!
There's lots to learn from Mr Carmack, and other impressive programmers for that matter.
He had/has an amazing talent for producing truly great work at great speed. Is it just magic that he can do this? Or does he have techniques that help him?
It seems obvious, but great tools help a great craftsman. So can great methods. He combined so many techniques from different disciplines.
What tools can a developer use today that propel them above what others are doing? If you're just using a standard issue macbook at this point, your tools are not better than what others are using.
His techniques for focusing on development tasks are also very useful. It boils down to a bubble-sorted todo list, constantly refined and structured for high throughput.
I also like his approach to C, and learned a lot about being pragmatic and keeping things simple. For example, his approach to file I/O: he would read in a whole file at once, rather than reading chunks at a time. This was at a time when most file readers mixed file I/O in with their parsing code, making them slower and more complicated.
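A sketch of that idiom in C (my own illustration of the pattern, not id's actual code): slurp the file in one read, NUL-terminate it, and hand the parser a plain in-memory buffer.

    #include <stdio.h>
    #include <stdlib.h>

    unsigned char *read_whole_file(const char *path, long *out_len) {
        FILE *f = fopen(path, "rb");
        if (!f) return NULL;
        fseek(f, 0, SEEK_END);
        long len = ftell(f);             /* file size in bytes */
        fseek(f, 0, SEEK_SET);
        unsigned char *buf = malloc((size_t)len + 1);
        if (!buf) { fclose(f); return NULL; }
        if (fread(buf, 1, (size_t)len, f) != (size_t)len) {
            free(buf);
            fclose(f);
            return NULL;
        }
        fclose(f);
        buf[len] = '\0';   /* lets text parsers treat the buffer as a string */
        if (out_len) *out_len = len;
        return buf;
    }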
His business techniques have always been amazing to watch too, like how he used blogging in the 90s to gain a wide audience, as well as distributing demos as shareware. Finally, his move into spacecraft after his successful game development career has been inspirational too.
> What tools can a developer use today that propel them above what others are doing? If you're just using a standard issue macbook at this point, your tools are not better than what others are using.
Depending on what you're doing, hardware is mostly a commodity at this point. You need sufficient hardware speed to maintain flow, but that's about it. Otherwise a big monitor (maybe several of them) and good keyboard / mouse / ergonomics are all you need.
Workflows though, I think that's where the magic is. And a great workflow is perfectly suited to the task at hand. The true master programmer optimizes the workflow without descending into wasteful fiddling.
Resources can always help, though. An SSD vs. an HDD, 64GB of RAM compared to 1GB, 16 fast CPU cores vs. 1 slow core: all will help with development. Things like integration servers can help too, and a testing rack of 30 mobile devices is another useful thing. If you have 100 times the processing power of a standard MacBook Pro, I can assure you a good programmer will be able to use it to increase their productivity. Even something like a backup internet line could save a day a year. Fast backup and restore tools could save another 1-5 days per year.
There's one script that helps a lot with improving productivity...
> His techniques for focusing on development tasks are also very useful. It boils down to a bubble-sorted todo list, constantly refined and structured for high throughput.
Do you have a reference for this? I'd love to know more.
> 23 inch 16:10 CRT, 1920x1200 - weighs nearly 100 pounds and draws 150 watts if I remember correctly.
The colors on them are unbelievable.