4K is a buzzword for a technology way too soon to be introduced to the masses. Remember Apple talking about 2003 being clearly the year of HD? That was about the same thing. Too much hype about something purely being on the presentation layer. Content is key.
Well, I was happily watching the 2004 Olympics in glorious HD. NBC's OTA feeds for their first couple of HD game broadcasts even used a different cut, with lots more action and less nonsense blabbing, back stories, and commercials.
UHD/4K content exists by the boatload. Any photo you take at more than about 2-3MP won't even fit 1:1 on the older monitors; at least the UHD screens let you view 8MP at once. And most photos taken these days are a lot more than 2-3MP.
And anything that uses text could sure as heck make use of more PPI. Look at how nasty text looks on a 1920x1200 24" monitor compared to in a book.
Current displays don't let you see nearly as much image detail as even an 8x10" print or a book does.
After using a high-PPI tablet for a while, my 1920x1200 24" looks like a grainy mess.
4K/UHD is LATE in arriving to computer monitors, not early.
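The megapixel and PPI claims above are easy to check with a little arithmetic. A minimal sketch (the resolutions are the standard ones; the exact print-out is just illustrative):

```python
import math

def megapixels(w, h):
    """Pixel count of a display or image, in millions."""
    return w * h / 1e6

def ppi(w, h, diagonal_inches):
    """Pixels per inch, computed along the diagonal."""
    return math.hypot(w, h) / diagonal_inches

# The two screens discussed above, both at 24".
wuxga = (1920, 1200)
uhd = (3840, 2160)

print(f"1920x1200: {megapixels(*wuxga):.1f} MP at {ppi(*wuxga, 24):.0f} PPI")
print(f"UHD:       {megapixels(*uhd):.1f} MP at {ppi(*uhd, 24):.0f} PPI")
# -> roughly 2.3 MP / 94 PPI vs 8.3 MP / 184 PPI
```

So the older 24" panel can only show about 2.3 of a photo's megapixels at once, and sits near 94 PPI, a long way below the ~300 PPI of print.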
----------
Once again the industry trying to sell something useless. 4K monitors are a joke. Unless you sit 1 foot away, you won't notice any difference between a 1080p and a 4K monitor. You want real quality, with the best colors and the ultimate black levels, wait for OLED to come down in price. Give it 2 to 3 years. The 4K monitors will have the same problems that current LED monitors have. Clouding, motion blur, and of course, poor blacks and colors.
A joke? I guess all books and magazines are jokes then, and so are printed images. You do realize that people print at 300PPI all the time and only get 13x19" prints out of that, and that the eye can tell 540PPI prints from 300PPI ones, so even those 13x19" prints aren't as crisp as they could be.
You don't notice that text in a book or photos in a magazine look way better than on your current monitor even if you have a 1920x1200 24"?
You don't notice that looking at real-world objects in your room, or looking out your window, things look a lot sharper?
Did you ever try turning anti-aliasing off and playing a game on a 1920x1200 24" screen, and notice it looks like hell? It would look perfect if 1920x1200 really were 'too much for a 24" screen' as some like to parade around.
Yeah, OLED is awesome. I've been waiting for that for a decade. But that has nothing to do with UHD. UHD is here now; OLED, it seems, is still another three years out. Both are awesome though, and it's not like one prevents the other. Heck, I await the day we get 8K OLED HDR panels driven at 16 bits per channel.
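The "you won't notice unless you sit 1 foot away" claim is also checkable with back-of-the-envelope math. Assuming the usual 20/20 figure of roughly one arcminute of visual acuity (an assumption; real eyes vary), you can compute the distance inside which individual pixels are still resolvable:

```python
import math

def ppi(w, h, diagonal_inches):
    """Pixels per inch along the diagonal."""
    return math.hypot(w, h) / diagonal_inches

def pixels_resolvable_within(w, h, diagonal_inches, acuity_arcmin=1.0):
    """Distance (inches) inside which an eye with the given acuity can
    still resolve single pixels; closer than this, extra PPI is visible."""
    pixel_pitch = 1.0 / ppi(w, h, diagonal_inches)  # inches per pixel
    return pixel_pitch / math.tan(math.radians(acuity_arcmin / 60.0))

for name, res in [("1080p", (1920, 1080)), ("UHD", (3840, 2160))]:
    d = pixels_resolvable_within(*res, 24)
    print(f'24" {name}: pixels resolvable out to ~{d / 12:.1f} feet')
```

Under that assumption a 24" 1080p panel's pixels are resolvable out to roughly three feet, well past a normal desk viewing distance, while 24" UHD pushes that boundary inside about a foot and a half. In other words, at desk distances you benefit from UHD; the "1 foot" figure has it backwards.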
----------
4K is nothing but a marketing gimmick (maybe in 10 years, when wall-sized displays are sub-$20K), and I'm glad to see, for the marketers' sake, that it's working on so many people...
You don't need a 4K display... for your living room wall, perhaps, but for a screen that small, no... and nothing on the market or in the pipeline is true 4K anyway.
But when you do slam dunk for these, just know you're only paying for current gen tech and until the bionic eyeball comes out...
Maybe you just need some reading glasses. Or are near sighted or something.
Bionic eyeballs? People have noticed that magazines look way more detailed than computer monitors and that looking at current HDTV sure doesn't look like looking out the window with plain old regular human eyeballs.
You don't think a retina iPad looks infinitely better than the older iPads? That's no marketing gimmick. I avoided tablets until the retina iPads arrived, since the others were a grainy mess.
----------
I would buy a Dell monitor over an Apple monitor any day. Of all the monitors I've worked on, NEC and Dell are by far the best. Apple, however, always has the crappiest monitors ever: if it's not the gloss, it's the yellow tinting, or... etc.
I stay far away from Apple monitors and couldn't care less about them making a 4K one.
I hear NEC is coming out with a 24" and one or two larger UHD screens in May. I'm sure they'll have all the nice 14-bit 3D LUTs and so on.
----------
Two of the most important words in this rumor: "anti" and "glare"!!
Now if only Apple would follow the same path, at least as an option, I would indeed include the new "4K thunderbolt/cinema" display on my x-mas wish list.
I'd take Apple glare over the nasty, sparkly, heavy AG of NEC (and many others) any day though. Samsung UltraClearPanel is nice.
----------
While I would prefer to buy an Apple product over a Dell product, I must say that, if the color gamut on both monitors is the same, I will buy the Dell monitor because it has a matte (aka "anti-glare") screen. Apple's display is, by far, the most reflective monitor in computing history due to the glossy glass cover.
However, if the color gamut is the way it is now, i.e., Apple using standard gamut (around 72% NTSC) while Dell is offering wide gamut, I will easily choose Apple. It is horrible to use wide gamut displays for everyday apps (browsers, Office, Quicktime, jpeg viewers, Skype, etc.), because they over-saturate the colors, and all standard gamut emulation modes on wide gamut monitors are awful. I'd rather have a true standard gamut monitor.
Most wide gamut monitors have sRGB emulation modes these days. The really good ones, like the NEC PA series and perhaps some of the new Dells with internal 3D LUTs, can do perfect sRGB emulation. That's actually better than any native sRGB monitor: those are almost all a little short of, or a little off from, sRGB in one way or another, while a wide gamut monitor with an internal 3D LUT can re-map each of its primaries to exactly match the sRGB standard. Those fancy monitors are also super-linear, and the color engine keeps saturation tracking perfect from 0-100%, and so on.
You can quickly pop a NEC PA into sRGB gamut with sRGB tone response for web and into sRGB gamut and gamma 2.2 for tv/movies/games and into wide gamut for viewing/editing photos (where stuff like sunsets, flowers, tropical waters, flashy cars, bright clothing, fall foliage, emeralds, golden hour lit scenes, etc. look so much better and more like in real life than on sRGB displays).
And most mid or even lower tier wide gamut monitors have at least passable sRGB emulation modes these days.
Plus the Mac color-manages the desktop and much of the OS anyway. Windows doesn't, so unless you put up a special wide-gamut re-mapped wallpaper, the wallpaper glows a bit, as do the icons and most UI controls and text. But if you use a color-managed web browser like Firefox, the web looks 100% perfect (plus you can see the odd wide gamut image that gets posted, in full glory).
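The oversaturation you get from unmanaged apps can be sketched numerically. Assuming the commonly published 3x3 linear-RGB-to-XYZ matrices for sRGB and Adobe RGB (1998) (approximate values, not from this thread), here is what happens when an unmanaged app sends sRGB pixel values straight to a wide-gamut panel:

```python
# Approximate, commonly published linear-RGB -> CIE XYZ matrices (D65).
SRGB_TO_XYZ = [
    (0.4124, 0.3576, 0.1805),
    (0.2126, 0.7152, 0.0722),
    (0.0193, 0.1192, 0.9505),
]
ADOBE_TO_XYZ = [
    (0.5767, 0.1856, 0.1882),
    (0.2973, 0.6274, 0.0753),
    (0.0270, 0.0707, 0.9911),
]

def to_xyz(matrix, rgb):
    """Linear RGB -> XYZ via a 3x3 matrix multiply."""
    return tuple(sum(m * c for m, c in zip(row, rgb)) for row in matrix)

def chromaticity(xyz):
    """CIE xy chromaticity (hue/saturation with luminance factored out)."""
    x_, y_, z_ = xyz
    s = x_ + y_ + z_
    return (x_ / s, y_ / s)

green = (0.0, 1.0, 0.0)  # full green, linear light
intended  = chromaticity(to_xyz(SRGB_TO_XYZ, green))    # what the author meant
unmanaged = chromaticity(to_xyz(ADOBE_TO_XYZ, green))   # wide gamut panel, no remap

print(f"intended sRGB green:       xy = ({intended[0]:.3f}, {intended[1]:.3f})")
print(f"unmanaged on Adobe panel:  xy = ({unmanaged[0]:.3f}, {unmanaged[1]:.3f})")
# -> (0.300, 0.600) intended vs (0.210, 0.710) unmanaged
```

The unmanaged value lands on the Adobe RGB green primary instead of the intended sRGB green, noticeably further from white, i.e. oversaturated. An sRGB emulation mode or internal 3D LUT is exactly the remapping that undoes this.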
----------
Just be aware that it's got an anti-glare Hard Coating 3H. I had a Dell Ultrasharp U2212HM screen with that coating and it's definitely a Marmite moment. I hated it. Made the whites look extremely grainy. So I sent it back for a Dell S2240M which was perfect and a great match for the screen on my 21.5" iMac.
I would hate to have a great 4K resolution obscured by such an aggressive hard coating. I'd want to see it up close before buying.
Ugh, is that the coating used on the older NEC PA241/271/301 monitors, the one that makes the whites all grainy and sparkly? They didn't at least go to the newer version on this Dell 2414Q, like on the NEC PA242/272/302 (not that that one is perfect)?
----------
There is so much FUD going on here.
All of Dell's current UltraSharp offerings feature a 'soft' anti-glare hard coating. It's produced by 3M and is the same anti-glare coating currently used by most other monitor manufacturers. Technically it's semi-gloss, but it works very well and the image is nice and clear. Any talk of aggressive, grainy anti-glare coating refers to 3M's previous generation of coating, which was applied to manufacturers' monitors in 2010, 2011, and some of 2012.
Oh good about the coating!
And agree with everything else you said.