Look, this argument just doesn't hold water at all. I work with text, coding, etc. all the time, and the argument for a few extra vertical lines of resolution just doesn't work when a WIDER display lets you have more windows open at a time.
Being able to display more windows side by side is a lot better than being able to pull one window down a little bit more. 16x10 is just an inefficient waste of space.
Yeah. And the funny thing is that since screens went 16x9, computer sales are UP. People want proper widescreen.
That's true. However, on a 16x10 screen, like the MacBook screen, the image of a 2.35:1 film is about as small as it is on a 4x3 TV. Not the case with a proper 16x9 screen.
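The size comparison above can be checked with a little geometry. Here's a quick sketch that computes the letterboxed image dimensions for a 2.35:1 film on screens of equal diagonal; the 15.4" diagonal is just an illustrative size, not a claim about any specific product.

```python
import math

def letterboxed_image(diagonal, screen_aspect, film_aspect=2.35):
    """Return (image_width, image_height) in inches for a film
    letterboxed on a screen of the given diagonal and aspect ratio (w/h)."""
    # screen height and width from the diagonal and aspect ratio
    h = diagonal / math.sqrt(screen_aspect**2 + 1)
    w = h * screen_aspect
    # the film is wider than any of these screens, so it fills the width
    # and gets black bars above and below
    return w, w / film_aspect

# Compare a hypothetical 15.4" diagonal across the three aspect ratios
for name, aspect in [("4x3", 4 / 3), ("16x10", 16 / 10), ("16x9", 16 / 9)]:
    w, h = letterboxed_image(15.4, aspect)
    print(f'{name}: image {w:.2f}" x {h:.2f}"')
```

At equal diagonal, the wider the screen, the wider (and larger) the letterboxed image, since the film always fills the full screen width.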
And guess what? The iMacs are selling better than ever now. Apparently Apple made the right move.
Well, you're in a very, very small minority. The rest of the industry has moved to 16x9, and they did it virtually overnight. Sales are up. Apple did it on the iMac and sales are up. The only two places you really can't find 16x9 now are Apple notebooks and Apple "Cinema" displays. But it's only a matter of time with them.
I'm sure Apple already has working prototypes.
Again, your argument just doesn't work. I've lost NOTHING in "work" (if working on a computer can even be called "work") by going 16x9 and gained everything when it comes to entertainment. You don't know how many times I've had to explain to people why their widescreen movies look so small on 16x10 displays and how those very same people jump at the opportunity to replace their system with a 16x9 display. And yes these are people who "work" on their computers.
Just like me. 16x10 should have never been introduced to begin with.
Which is ironic, because HD DVD used the same copy protection as Blu-ray Disc: HDCP and AACS. BD+ wasn't even around at that time, and it's something that doesn't affect anyone.
Which is exactly the same for iTunes HD downloads. So it's okay for Apple to enforce the same type of HDCP requirements for HD video, but it's not okay for Blu-ray to do it?
You need to take your own advice and not cherry pick replies.
It's funny how you call me a troll and ignore the fact that Apple enforces the same HDCP requirements for their "high definition" content.
http://www.engadget.com/2008/11/17/apple-itunes-multimedia-throwing-hdcp-flags-on-new-macbook-mac/
http://gizmodo.com/5177075/itunes-hd-movies-wont-play-on-older-non+hdcp-monitors
You know what you need for HDCP compliance? A modern video card with a chipset manufacturer's driver (i.e., a driver straight from NVIDIA or AMD), a non-Apple-manufactured display from within the last 4 years, and an HDMI or DVI cable. That's it. Scary stuff, huh?
Well, Dell's RGBLED displays are some of the few that can claim true 100% color reproduction.
With Apple's edge-lit LED LCD displays, they're just giving you a thinner display that uses less power. RGBLED actually has hundreds of LEDs behind the LCD panel that can change color along with the picture being displayed on the LCD. Obviously that enhances the color quality dramatically. But none of Apple's displays use this technology. They're all edge-lit.