You're misunderstanding what Retina Display is about. Apple does not use the native resolution as the default; it uses the HiDPI (2x) resolution by default.

The font size stays exactly the same, except everything will be twice as sharp; it will actually look like you got a new pair of eyeglasses.

The native resolution of the panel will be 4096 x 2304, but the default HiDPI/scaled resolution will be 2048 x 1152 (4096/2 x 2304/2).

People with worse eyesight will actually benefit more from Retina displays.
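
To make the arithmetic concrete, here's a quick back-of-the-envelope sketch in Swift (purely illustrative; the 21.5" panel size is an assumption based on the rumors in this thread) of how the 2x HiDPI mode and pixel density relate to the native panel:

```swift
import Foundation

// Rumored 21.5" iMac panel from this thread; the size is an assumption.
let nativeWidth = 4096.0
let nativeHeight = 2304.0
let diagonalInches = 21.5

// At 2x HiDPI, each logical point is drawn with a 2x2 block of pixels,
// so the default scaled resolution is half the native one per dimension.
let scale = 2.0
print("Logical (HiDPI) resolution: \(Int(nativeWidth / scale)) x \(Int(nativeHeight / scale))")

// Pixel density follows from the diagonal pixel count and the panel size.
let diagonalPixels = (nativeWidth * nativeWidth + nativeHeight * nativeHeight).squareRoot()
print(String(format: "Pixel density: %.0f PPI", diagonalPixels / diagonalInches))
// Prints: Logical (HiDPI) resolution: 2048 x 1152
//         Pixel density: 219 PPI
```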


Thank you for that. I must note that I don't have a Retina screen, but I noticed no discernible difference between 1280 and 1680 in sharpness of fonts (graphics do not matter to me). However, it does raise the question: why pay for twice as many pixels as are of use to me?
 
I think Apple upped the resolution so they can actually market it as a 4K iMac without lying. Other 4K monitors are really 3.8K (3840 pixels wide).
 
What makes you say that? They didn't use readily available panels for pretty much any other Retina product.
While I still consider a 21.5" display some kind of oddball, the readily available 24" panels I saw as a fit for a resized entry-level iMac are 4K@3840, not 4K@4096. Thus my assumption no longer holds (I shouldn't post when tired late at night *sigh*).
 
Thank you for that. I must note that I don't have a Retina screen, but I noticed no discernible difference between 1280 and 1680 in sharpness of fonts (graphics do not matter to me). However, it does raise the question: why pay for twice as many pixels as are of use to me?
My eyesight is not the best either, but the difference between 1080p and 1080 HiDPI (on a 24" monitor) or 1440p and 1440 HiDPI (15" display) was pretty obvious to me. Font sharpness improves significantly (more truly black pixels per character, straighter edges) and colors are more intense. I feel it's absolutely worth it!
 
With the remote, I hope they add a headphone jack similar to the PS4 controller and Roku remote. It would be a lot easier than needing to run headphones to a home theater setup when watching/listening to something at night.
 
Thank you for that. I must note that I don't have a Retina screen, but I noticed no discernible difference between 1280 and 1680 in sharpness of fonts (graphics do not matter to me). However, it does raise the question: why pay for twice as many pixels as are of use to me?

Retina Display, in addition to referring to a high-resolution panel, also refers to pixel density. There are four times as many pixels (twice across and twice down) in the same space on a Retina display compared to a non-Retina display.

Here's an image:
[image: the same text rendered on a Retina display (top) and a non-Retina display (bottom)]


Notice how the font is the same size in both pictures, but the top one is much clearer and sharper? That's because there are four times as many pixels in the same space compared to the bottom display.

That's why you can't see much difference when changing resolution on your non-Retina display: you're limited by the pixel density. There is only so much it can draw in the same space; Retina can simply fit more detail.

4K resolution on a 30" display is not the same as 4K resolution on a 24" monitor. The 24" monitor will be far denser and sharper than the 30", which can look just as coarse as your 1280-wide 20" monitor.
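
To put rough numbers on that last point, here is a small illustrative Swift sketch (panel sizes chosen to match the examples above) showing how the same resolution yields very different pixel densities at different screen sizes:

```swift
import Foundation

// Pixels per inch from resolution and diagonal size.
func ppi(width: Double, height: Double, diagonalInches: Double) -> Double {
    ((width * width + height * height).squareRoot()) / diagonalInches
}

// The same 3840 x 2160 (4K UHD) resolution on three panel sizes:
for inches in [24.0, 27.0, 30.0] {
    print(String(format: "4K at %.0f\": %.0f PPI",
                 inches, ppi(width: 3840, height: 2160, diagonalInches: inches)))
}
// 4K at 24": 184 PPI  -- densest, sharpest
// 4K at 27": 163 PPI
// 4K at 30": 147 PPI  -- same pixels spread over more glass
```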
 
:apple: neat dot matrix.

If the 6200 is also going to be in laptops, then we could "in theory" see 4K Mac laptops, yes? Just a guess.

A 4K iMac would look good... it falls in line with the 5K one they already have. :D Not to mention it's a smaller number (and that's always good).

The "talking to your remote" thing kind of creeps me out... if you're going to do that, wouldn't you rather do it to a real human? Robots are the future...
 
What a waste of resolution. 4K doesn't look good on 27", so why would it look good on 21" or 24"? Jack up the screen size to 36". 4K is just too high a resolution. I had a 27" 4K Dell on my PC, and windows and text were just too small.
 
And? Unless you're legally blind, pretty much anyone would benefit from a retina display.

--Eric
Only the legally uninformed would espouse such marketing nonsense. The real benefit of a "Retina" display is to Apple's bottom line.
 
Video editing (particularly 4K video). If you recall, they made a big deal of how 5K allows editing 4K without scaling while still showing the controls and timeline. It's also great for Photoshop, etc.

And of course it easily handles typical productivity apps.

But anyone who buys any Mac expecting good gaming performance is a bit loopy.
Great for Photoshop? The monitor can't even display the entire sRGB gamut.
 
Show me all of Nvidia's mobile GPUs that do support 5K resolution!

There...
Point is, the AMD GPUs don't do it either. In fact, no mobile GPU can handle the resolution; even desktop-class GPUs have difficulties with it. Screens got too dense too fast, and the GPUs were slow to catch up. It will be another generation or two before we can expect the same performance at ultra-high resolutions as what we see at 1080p and 1440p today.
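
Rough arithmetic behind that claim (a hedged sketch; frame rates also depend on the workload, not just pixel count): the raw number of pixels a GPU must fill per frame grows much faster than the resolution names suggest.

```swift
import Foundation

// Pixels per frame for common resolutions, relative to 1080p.
let modes = [("1080p", 1920, 1080), ("1440p", 2560, 1440),
             ("4K", 3840, 2160), ("5K", 5120, 2880)]
let base = Double(1920 * 1080)

for (name, w, h) in modes {
    let pixels = Double(w * h)
    print(String(format: "%@: %.1fM pixels/frame (%.1fx 1080p)",
                 name, pixels / 1_000_000, pixels / base))
}
// 5K is ~14.7M pixels per frame, roughly 7x the fill work of 1080p.
```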
 
Retina Display, in addition to referring to a high-resolution panel, also refers to pixel density. There are four times as many pixels (twice across and twice down) in the same space on a Retina display compared to a non-Retina display.

Here's an image:
[image: the same text rendered on a Retina display (top) and a non-Retina display (bottom)]


Notice how the font is the same size in both pictures, but the top one is much clearer and sharper? That's because there are four times as many pixels in the same space compared to the bottom display.

That's why you can't see much difference when changing resolution on your non-Retina display: you're limited by the pixel density. There is only so much it can draw in the same space; Retina can simply fit more detail.

4K resolution on a 30" display is not the same as 4K resolution on a 24" monitor. The 24" monitor will be far denser and sharper than the 30", which can look just as coarse as your 1280-wide 20" monitor.

Thank you very much for the image and explanation - it does look convincing. I may have to check out new Dell monitors in the near future.
 
My eyesight is not the best either, but the difference between 1080p and 1080 HiDPI (on a 24" monitor) or 1440p and 1440 HiDPI (15" display) was pretty obvious to me. Font sharpness improves significantly (more truly black pixels per character, straighter edges) and colors are more intense. I feel it's absolutely worth it!


It seems I'm being battered from all sides on this issue. :) As I mentioned to MikhailT, I will investigate upgrades.
 
What a waste of resolution. 4K doesn't look good on 27", so why would it look good on 21" or 24"? Jack up the screen size to 36". 4K is just too high a resolution. I had a 27" 4K Dell on my PC, and windows and text were just too small.

I'm sorry, but you are wrong. 4K and 5K computer displays are NOT meant to run at native resolution, but are generally run scaled at 2x.

4K is the new 1080p, just twice as sharp. Therefore it is perfect at 21" and 24". At 27" everything is way too big already.

5K, on the other hand, is the new 2560x1440... which is perfect for 27-30".

Just because Windows doesn't have native HiDPI modes (the 125-200% scaling options don't really work so well and aren't native HiDPI modes!) does not validate your statement.

4K on 36", like what you wish for... would only work at 1x rendering... thus negating ANY improvement in sharpness...
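
A small sketch of the 2x scaling argument above (illustrative only; the "rumored iMac" entry is this thread's 4096 x 2304 panel):

```swift
// Each Retina-class native resolution maps to a familiar logical size at 2x.
let panels = [("4K UHD", 3840, 2160),
              ("Rumored 4K iMac", 4096, 2304),
              ("5K", 5120, 2880)]

for (name, w, h) in panels {
    print("\(name): \(w)x\(h) native -> \(w / 2)x\(h / 2) logical")
}
// 4K UHD: 3840x2160 native -> 1920x1080 logical (the "new 1080p")
// 5K:     5120x2880 native -> 2560x1440 logical (the "new 1440p")
```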
 
So the AMD M395X is only 5% faster than the M295X? Why don't you put in the 980M, which is 10% faster than the M395X?
 
More excited about the new remote. Anyone else think it was a clear sign that a new Apple TV box will be announced in the fall alongside Apple Music?
 
Both the CPU and GPU are known to thermal-throttle themselves before they overheat and shut down, because of Apple's desire to make a thinner desktop. IT'S A DESKTOP, YOU WANKERS. IT DOESN'T NEED TO BE THIN OR CARRIED AROUND. PLEASE. STOP IT.

I find it even stupider how much harder it is now to insert an SD card, how the speakers got worse, and how it no longer takes CDs...
Wait, the whole thing is stupid. Damn, I'm sad now. :(
 
In the same code there were hints at unreleased mobile GPUs from AMD (the R9 M380, M390, M395, and M395X) and the Intel Iris Pro 6200, which is already on the market.

[image: el-capitan-amd-chips.png]


Why Apple is still doing business with AMD is beyond me. nVidia's GPUs are so much more powerful while operating at a lower TDP, and Apple's terrible cooling design in the iMac isn't helping either. Both the CPU and GPU are known to thermal-throttle themselves before they overheat and shut down, because of Apple's desire to make a thinner desktop. IT'S A DESKTOP, YOU WANKERS. IT DOESN'T NEED TO BE THIN OR CARRIED AROUND. PLEASE. STOP IT.

Do you really need to use language like this? Is your use of English so limited that you must resort to swearing?

Come on, moderators, kick this person out.
 
Point is, the AMD GPUs don't do it either. In fact, no mobile GPU can handle the resolution; even desktop-class GPUs have difficulties with it. Screens got too dense too fast, and the GPUs were slow to catch up. It will be another generation or two before we can expect the same performance at ultra-high resolutions as what we see at 1080p and 1440p today.

The M370X in the rMBP does, as well as the M290, M290X, and M295X in the riMac. Now, these are of course two different things: whether a GPU supports 5K and whether it is sufficient for it. But you can't even display 5K with Nvidia mobile GPUs. Only the latest Titan desktop monster supports 5K.

"Support for up to 5120-by-2880 resolution at 60Hz on a single external display (model with AMD Radeon R9 M370X only)"
http://www.apple.com/macbook-pro/specs-retina/

UPDATE: Also, the reason Apple prefers AMD over Nvidia at the moment is that AMD customizes its chips for its customers' needs. That could be where the Apple/AMD deal is heading at the moment. We will see more products with Apple-specific specs; 5K support is just one part of it.
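
For a rough sense of why 5K support is the sticking point, here's a hedged back-of-the-envelope sketch (assuming 24-bit color, 60 Hz, and ~20% blanking overhead, all of which vary by timing standard):

```swift
import Foundation

let width = 5120.0, height = 2880.0
let refreshHz = 60.0
let bitsPerPixel = 24.0        // assumed 8 bits per channel
let blankingOverhead = 1.2     // assumed; actual overhead varies

let gbps = width * height * refreshHz * bitsPerPixel * blankingOverhead / 1e9
print(String(format: "5K @ 60 Hz needs roughly %.1f Gbit/s", gbps))
// ~25.5 Gbit/s -- more than a single DisplayPort 1.2 link
// (~17.28 Gbit/s of usable data) can carry, hence the limited GPU support.
```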
 
Why Apple is still doing business with AMD is beyond me. nVidia's GPUs are so much more powerful while operating at a lower TDP
No, they aren't more efficient. Stop believing everything Nvidia says.
They're pretty much the same in DX12 (~10W difference between the 980 and the 290X/390X), where the GPU is much more utilised and Nvidia can't throttle it as much to keep the power down.
Apple's Metal is similar to Vulkan/DX12, so don't expect to see big power differences between AMD and Nvidia.
 
In the same code there were hints at unreleased mobile GPUs from AMD (the R9 M380, M390, M395, and M395X) and the Intel Iris Pro 6200, which is already on the market.

[image: el-capitan-amd-chips.png]


Why Apple is still doing business with AMD is beyond me. nVidia's GPUs are so much more powerful while operating at a lower TDP, and Apple's terrible cooling design in the iMac isn't helping either. Both the CPU and GPU are known to thermal-throttle themselves before they overheat and shut down, because of Apple's desire to make a thinner desktop. IT'S A DESKTOP, YOU WANKERS. IT DOESN'T NEED TO BE THIN OR CARRIED AROUND. PLEASE. STOP IT.

The cooling design is not at fault - it's the choice of hardware.

The cooling system was designed for an Ivy Bridge CPU and (at the top end) the Nvidia GTX 680MX. The TDP on that GPU is 122W, and it is very good in that setup (I own one). It's cool and quiet even under heavy load, and it doesn't throttle.

The problem seems to be that AMD's GPUs, advertised as having a 120W (ish) TDP, clearly aren't holding to that: the cooling design didn't change in the transition from Nvidia to AMD, yet the iMac went from cool, quiet, and understated to a throttling blast furnace.

Don't blame the cooling; blame the terrible choice of GPU. I have no idea why they picked AMD (probably for OpenCL, or perhaps some limitation Nvidia had with 5K resolution), but in doing so they really cooked the goose, literally.
 
The cooling design is not at fault - it's the choice of hardware.

The cooling system was designed for an Ivy Bridge CPU and (at the top end) the Nvidia GTX 680MX. The TDP on that GPU is 122W, and it is very good in that setup (I own one). It's cool and quiet even under heavy load, and it doesn't throttle.

Ivy Bridge CPUs had a 77W TDP; Haswell is 88W. So don't blame just the GPUs. With Skylake it'll go down to 65W, which will give more room for next-gen GPUs. Skylake is due in Sept/Oct 2015, as is El Capitan.
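
Toy arithmetic for the thermal-budget point (TDP figures are the ones quoted in these posts; the cooler's total capacity is an assumed round number, for illustration only):

```swift
import Foundation

let coolingBudgetWatts = 200.0 // assumed capacity, for illustration only

let combos = [("Ivy Bridge (77W) + GTX 680MX (122W)", 77.0 + 122.0),
              ("Haswell (88W) + ~120W AMD GPU", 88.0 + 120.0),
              ("Skylake (65W) + ~120W GPU", 65.0 + 120.0)]

for (name, watts) in combos {
    print(String(format: "%@: %.0fW total, %+.0fW headroom",
                 name, watts, coolingBudgetWatts - watts))
}
// A 65W Skylake frees up ~23W over Haswell for the GPU under the same cooler.
```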
 