Retina is just a marketing term used by Apple when a device reaches a certain level of pixel density. Past a certain resolution everything becomes 'Retina'; it's not a technology, it's a marketing term!
It amazes me that people still don't understand that it's just a marketing term.

----------

Not true.
A "Retina'd" screen has the pixel-density which is very high, but has the screen real-estate which is considered "normal".

In the iMac's case:
The Retina 5K: 5120 x 2880, i.e. 2560 x 1440 doubled in each dimension.
Without "Retina" scaling, a 5120 x 2880 panel would give you 5120 x 2880 of real estate = very small icons.
It actually uses the same screen real estate as the normal 27": 2560 x 1440, but with four times the pixels, so everything is much sharper.

That is what the term "Retina" means.
It's a marketing term....
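To make the arithmetic above concrete, here's a minimal sketch of the point-vs-pixel relationship; the type and property names are illustrative, not Apple API, and the 2x scale factor is the one the Retina 5K iMac uses:

```swift
// Illustrative sketch, not Apple API: logical "points" vs. backing pixels.
struct DisplayMode {
    let pixelWidth: Int
    let pixelHeight: Int
    let scale: Int                                 // backing pixels per logical point, per axis

    var pointWidth: Int { pixelWidth / scale }     // logical resolution the UI is laid out against
    var pointHeight: Int { pixelHeight / scale }
    var pixelMultiple: Int { scale * scale }       // pixel count vs. a non-Retina panel of the same point size
}

let retina5K = DisplayMode(pixelWidth: 5120, pixelHeight: 2880, scale: 2)
print(retina5K.pointWidth, retina5K.pointHeight)   // 2560 1440: same real estate as the non-Retina 27"
print(retina5K.pixelMultiple)                      // 4: four times the pixels
```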
 
History repeats itself. Many of the same arguments came about when the first Retina Display showed up - drains battery too quickly, video cards insufficient to power it, not needed, costs too much, etc.

When you're in tech, the idea is to look into the future and start making preparations for it. If you wait until there is a demand or need, then your company is going to fail pretty damn fast. I'm pretty confident that Apple's engineers have a better idea than the rest of us about the trajectory of tech trends. 5K display in an iMac costs a lot now. But if they don't start popularizing it (or 8K displays), the costs won't ever come down. Nor will the video cards ever be able to handle the resolutions. Nor will the batteries be able to sustain them.
 
Sure, 8K is great for comparison tables, but until the technology exists to allow fluid interface interaction, gameplay, and so on, I'd mostly be happier with 4K and decent frame rates above 100 Hz (same goes for movies). In my experience, frame rate gives a much greater impression of immersion than resolution.
 
When you're in tech, the idea is to look into the future and start making preparations for it. If you wait until there is a demand or need, then your company is going to fail pretty damn fast. I'm pretty confident that Apple's engineers have a better idea than the rest of us about the trajectory of tech trends. 5K display in an iMac costs a lot now. But if they don't start popularizing it (or 8K displays), the costs won't ever come down. Nor will the video cards ever be able to handle the resolutions. Nor will the batteries be able to sustain them.

^^This!


Also, I'd like to point out to everyone here complaining about the (lack of perceived) need for 8K displays: this article is about the Video Electronics Standards Association (VESA) creating a new standard to allow embedded displays of up to 8K resolution. They don't make displays; they just create a standard for display makers to use in their products. The fact that this new standard exists in no way slows down progress in areas where it's needed. The real complaints should go to GPU vendors for not innovating faster, and to display manufacturers for making power-hungry, dim displays.
 
Sure, 8K is great for comparison tables, but until the technology exists to allow fluid interface interaction, gameplay, and so on, I'd mostly be happier with 4K and decent frame rates above 100 Hz (same goes for movies). In my experience, frame rate gives a much greater impression of immersion than resolution.
So other consumers will drive the tech advancement, and you'll lag behind with today's best technology tomorrow.
 
There's a massive difference between TVs, on the one hand, and computers, on the other.

Resolution's a very different situation for the computer.

Of course but what is the benefit here to the end user?

I don't know of anyone who has complained about their Retina displays looking bad. Seems like resources would be better spent in places other than the race to the top of resolution. Reminds me of the megapixel race.
 
ha

New displays haven't been released by Apple for almost 3 and a half years. Think about that. Three and a half YEARS. Before this dry spell they were updated, on average, a little less than once a year. What the hell.
 
I love me some nice high-resolution tech, but at this point it's just meaningless numbers. I can't even see what we could really do with 4K+ on such a small screen, especially when other factors matter more at this level (performance, battery life, etc.).

Higher isn't de facto better.
 
The Walking Dead is shot on 16mm film for that grainy look we all enjoy and appreciate, which works out to a maximum of roughly 2K resolution.

Ha ha. I knew there was a reason I picked that film. I was thinking that it wouldn't improve my enjoyment at all of that show if it was a touch sharper. Turns out it basically can't get any sharper.

I also have two viewing setups: a 720p projector onto a 10-foot-diagonal screen about 10 feet from the couch, or a 48-inch TV sitting 12 feet from the head of my bed. One device physically can't show better clarity, and the other is too small and far away for it to make any difference.

There are limited returns to me for added visual fidelity right now.
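For what it's worth, here's a rough sketch of that viewing-distance arithmetic: pixels per degree of visual angle, compared against the ~60 px/deg often cited as the limit of 20/20 vision. It assumes 16:9 screens and a 1080p panel for the 48" TV (the post doesn't say), so treat the numbers as ballpark only:

```swift
import Foundation

// Rough sketch: how many pixels land in each degree of the viewer's field of view.
// Assumptions (not from the post): 16:9 aspect ratio, 1080p panel for the 48" TV.
func pixelsPerDegree(horizontalPixels: Double,
                     diagonalInches: Double,
                     viewingDistanceFeet: Double) -> Double {
    let screenWidth = diagonalInches * 16 / (16.0 * 16 + 9 * 9).squareRoot()  // width of a 16:9 screen, inches
    let distance = viewingDistanceFeet * 12                                   // viewing distance, inches
    let fieldOfView = 2 * atan((screenWidth / 2) / distance) * 180 / .pi      // degrees the screen subtends
    return horizontalPixels / fieldOfView
}

// 720p projector on a 120" (10-foot) diagonal, viewed from ~10 feet:
print(pixelsPerDegree(horizontalPixels: 1280, diagonalInches: 120, viewingDistanceFeet: 10))  // ~27: the panel is the limit
// 48" TV (assumed 1080p), viewed from ~12 feet:
print(pixelsPerDegree(horizontalPixels: 1920, diagonalInches: 48, viewingDistanceFeet: 12))   // ~116: already past what the eye resolves
```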
 
I want OLED instead. 8K is nice... if your screen is 130 inches diagonal and you're sitting within 3 feet of it.
 
Far out !! That's good resolution.

As always, we are getting ahead of ourselves by a lot, by at least a few years or so.
 
Except 5K is not for movies, it's for text. Source code, for example.

I've coded/developed massive projects on text consoles with huge letters :), switching between whole screens back then (very fast for text). Very easy on the eyes. That was in the late 1980s and early 1990s.

I truly don't get current engineers' need to see everything at once and take on that information overload. No wonder there are so many coding/development errors.
Our brains are not made for this, despite people thinking they can transcend their humanity by drinking Red Bull or similar products :).

My eyesight's so bad (even corrected) that I can barely look at those environments for an hour before getting a splitting headache. Good thing I'm no longer involved at that level, or I'd need an eye transplant (or a brain transplant to work ;).
 
Except 5K is not for movies, it's for text. Source code, for example.

I completely agree with that. Otherwise, I couldn't justify how much I love my iPad's (4th gen) screen for the very same reason. As a student, it became my primary tool in class. Not for larger assignments, of course, but for reading and taking notes [Evernote, anyone?], just because it's so much nicer to read on.

So I totally agree when we're talking screen real estate in a work-type environment (like the ones you mentioned; add photo and video editing to that).

My response was addressed to the points raised regarding content consumption and consumer trends in general :)
 
I don't know about you guys but I personally don't like to look at any kind of display that is not 6K or higher, ideally around 8-12K is nice. The 5K iMac is just the bare minimum.. barely passable. I don't know how anybody can stand to use a display that is only 4K.
 