Does this signify a 21" 4K Thunderbolt Display is coming?... (I'll get my coat!)
 
Hasn't been updated since Sept 2011, and they're still selling it for £900. Crazy.
They are waiting until they have better-performing 4K panels in sufficient supply (and possibly the release of Skylake, so they have processors with better support for the resolution).

In the same code there were hints of unreleased mobile GPUs from AMD (the R9 M380, M390, M395 and M395X) and an Intel Iris Pro 6200, which is already on the market.

[Image: el-capitan-amd-chips.png]


Why Apple is still doing business with AMD is beyond me. Nvidia's GPUs are so much more powerful while operating at a lower TDP, and Apple's terrible cooling design in the iMac isn't helping either. Both the CPU and GPU are known to thermal-throttle before they overheat and shut down, because of Apple's desire to make a thinner desktop. IT'S A DESKTOP, YOU WANKERS. IT DOESN'T NEED TO BE THIN OR CARRIED AROUND. PLEASE. STOP IT.

If I had to guess, it's because Nvidia is aiming for serious money from IP. Nvidia is already in a lawsuit with Samsung over graphics IP used in mobile processors, and they've made it clear that anyone else using ARM is next on the list. Should Nvidia win against Samsung, they'll start going after every other company, looking to make billions on IP licensing. Apple will be a huge target, and Nvidia will be looking to force some borderline-ridiculous licensing deal on them.
 
The M370X in the rMBP does, as do the M290, M290X and M295X in the riMac. Now, these are of course two different things: that those GPUs support 5K, and that they are sufficient for it. But you can't even display 5K with Nvidia mobile GPUs. Only the latest Titan desktop monster supports 5K.

It is not 5K as a single stream that Apple needs. It is Multi-Stream Transport (MST) 5K that Apple needs. The current 5K panels can't take a single 5K stream... so the Titan X's single-stream capability is moot until probably next year (or so). What matters is how cleanly and effectively the GPU can display two streams as one virtual screen. AMD has worked on Eyefinity for a long time. Something like that needs to work smoothly in the Mac graphics stack to get 5K to work.

Internally, the 5K iMacs are using a custom cable to transport two DP v1.2 streams to the display controller. Likewise, the M370X in the rMBP, the 5K iMacs, and the Mac Pro can use two cables to drive an external 5K display.
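To put rough numbers on that, here's a back-of-the-envelope sketch in Swift (nominal figures of my own, ignoring blanking and other link overhead) of why a single DP 1.2 stream can't carry 5K at 60Hz:

```swift
import Foundation

// Back-of-the-envelope: why 5K@60Hz won't fit a single DP 1.2 stream.
// Nominal figures; real links also spend bandwidth on blanking intervals.
let width = 5120.0, height = 2880.0, refresh = 60.0, bitsPerPixel = 24.0

// Active pixel data in Gbit/s (before any link overhead).
let pixelDataRate = width * height * refresh * bitsPerPixel / 1e9

// DP 1.2 HBR2, 4 lanes: 4 x 5.4 Gbit/s raw, ~80% usable after 8b/10b coding.
let dp12Payload = 4.0 * 5.4 * 0.8

print(String(format: "5K@60Hz needs ~%.1f Gbit/s", pixelDataRate))         // ~21.2
print(String(format: "One DP 1.2 link carries ~%.2f Gbit/s", dp12Payload)) // 17.28
print("Fits one stream: \(pixelDataRate <= dp12Payload)")                  // false
print("Fits two streams: \(pixelDataRate <= 2 * dp12Payload)")             // true
```

Hence the two internal streams stitched together as one virtual screen.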


UPDATE: Also, the reason Apple prefers AMD over Nvidia at the moment is that AMD customizes their chips for their customers' needs. That could be where the Apple/AMD deal is heading. We will see more products with Apple-specific specs; 5K support is just one part of it.

Not sure what evidence there is that Apple is getting custom chips. Perhaps binned chips that perform better at Apple's chosen clock rates, but changed transistors and augmented functional units on the chips? Dual-cable 5K isn't evidence of that.

Apple does buy the GPU chip itself and create custom graphics boards (and/or incorporate it directly onto the motherboard). That is embedded design work, and yes, AMD has a bigger focus on that than Nvidia. But this isn't quite the same level as the work they do with the console vendors (PlayStation and Xbox) in terms of changing the chip package on the inside.
 
Ivy Bridge CPUs had a 77W TDP. Haswell is 88W. So don't blame just the GPUs. With Skylake it'll go down to 65W. That'll give more room for next-gen GPUs. Skylake is due Sept/Oct 2015, as is El Capitan.

True, it's not just the GPU, but the Late 2012 iMac with the 680MX had considerable headroom for overclocking the GPU while keeping it stable and cool without any modification, so an 11W increase in CPU TDP shouldn't have been a big deal.

It seems to be entirely an issue with the switch to the AMD GPU, which has the same TDP on paper as the 680/780 it replaced but just doesn't seem to deliver that in practice.
 
They are waiting until they have better-performing 4K panels in sufficient supply (and possibly the release of Skylake, so they have processors with better support for the resolution).

Probably more about more affordable 4K panels. I'm not sure Apple is going to leave any non-"Retina" iMac models except a super-low-end "education" model powered by the CPU in a MBA. So yes, Gen 6 (Skylake) CPU graphics to power 4K with no discrete GPU at the normal entry level. But also a low enough price that they don't significantly increase (same, or maybe just a $100 bump) the current 21.5" model price points.

Two overlapping iMac line-ups (2-3 non-Retina 21.5" and 2-3 Retina 21.5" models) goes against the grain of Apple's usual sales model. The MBP did it for a while, but the volume (and market size) on iMacs is lower. The non-Retina options are probably going to shrink. [The 5K entry price dropped $500 in about 6 months. The "Retina" screen wasn't a huge price increase once volume ramped up. If the 21.5" uses the same DPI but just cut to a smaller size... the volume should ramp again.] Whether it shrinks to almost zero non-Retina in 2015 or 2016 is something that only Apple knows (as they know the prices of components), but that is where they are headed. The "education" model is for the very price-sensitive, so it will likely stay gimped with an "even cheaper" screen, but the line-up is going "Retina".
 
The cooling design is not at fault - it's the choice of hardware.

The cooling system was designed for an Ivy Bridge CPU and, at the top end, the Nvidia GTX 680MX. The TDP on that GPU is 122W, and it is very good in that setup (I own one). It's cool and quiet even under heavy load, and it doesn't throttle.

But it is not the same screen. The 680MX isn't being tasked with driving a 5K panel, nor with doing Retina scaling. It is in part the cooling design, because it has been exactly the same for the last 2-3 years. Apple is using the same fans on the 5K models as on the rest of the line-up.

Don't blame the cooling, blame the terrible choice of GPU. I have no idea why they picked AMD (probably for OpenCL, or perhaps some limitation Nvidia had with 5K resolution), but in doing so they really cooked the goose, literally.

Apple cooked its own goose by being cheap and reusing the same design (going for maximum component reuse across models, which is systemic across all of the Mac models). And it is the same "look, no vents" design approach that cooked GPUs in MBP laptops when Apple first iterated to the "even slimmer" MBPs.

The 27" iMac tries to pull hot air down to push it out of the box. That is just not really a good practice in general. It is the opposite direction that hot air wants to go. They can make it work but will have to huff and puff harder.
 
I may have to check out new Dell monitors in the near future.
Take a closer look at the P2415Q and the U2515H. Both have good reviews. I personally opted for the 2415Q eventually.

Keep in mind that you need a machine that properly supports 4K@60Hz, otherwise your experience will suffer (e.g. the current Mac mini only supports 4K@30Hz out of the box). You would also need a DisplayPort (or Thunderbolt) output, or HDMI 2.0, to drive a 4K display at 60Hz at full resolution.
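For the curious, that 30Hz ceiling falls out of simple pixel-clock arithmetic. A rough sketch, assuming the standard CTA 4K timings (4400x2250 total including blanking) and the nominal HDMI TMDS limits:

```swift
import Foundation

// Why HDMI 1.4 Macs top out at 4K@30Hz: the required pixel clock
// exceeds HDMI 1.4's TMDS limit. Active resolution is 3840x2160.
func pixelClockMHz(totalH: Double, totalV: Double, refresh: Double) -> Double {
    return totalH * totalV * refresh / 1e6
}

let need4K60 = pixelClockMHz(totalH: 4400, totalV: 2250, refresh: 60)  // 594 MHz
let need4K30 = pixelClockMHz(totalH: 4400, totalV: 2250, refresh: 30)  // 297 MHz

let hdmi14Limit = 340.0  // MHz, HDMI 1.4
let hdmi20Limit = 600.0  // MHz, HDMI 2.0

print("4K@60 over HDMI 1.4: \(need4K60 <= hdmi14Limit)")  // false
print("4K@30 over HDMI 1.4: \(need4K30 <= hdmi14Limit)")  // true
print("4K@60 over HDMI 2.0: \(need4K60 <= hdmi20Limit)")  // true
```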
 
Take a closer look at the P2415Q and the U2515H. Both have good reviews. I personally opted for the 2415Q eventually.

Keep in mind that you need a machine that properly supports 4K@60Hz, otherwise your experience will suffer (e.g. the current Mac mini only supports 4K@30Hz out of the box). You would also need a DisplayPort (or Thunderbolt) output, or HDMI 2.0, to drive a 4K display at 60Hz at full resolution.

Thank you, Neodym. It's never just the one thing, is it? I shall see what the new Mini and MP are like before deciding whether to spend some serious cash. (Current machine is a 2009 Mini 2.26.)
 
True, it's not just the GPU, but the Late 2012 iMac with the 680MX had considerable headroom for overclocking the GPU while keeping it stable and cool without any modification, so an 11W increase in CPU TDP shouldn't have been a big deal.

But it seems to be a big issue when going from the 100W M290X to the 125W M295X. So when Intel goes from Haswell to Skylake, the thermal issues should be overcome. Unless Apple installs a 150W chip in the next version of the iMac.
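Taking the published TDP figures at face value (the CPU model numbers below are my assumption for the top-end BTO configs discussed here), the combined budget did creep up inside the same cooler:

```swift
// Combined nominal CPU+GPU budgets for the top-end 27" iMacs (published
// TDPs; CPU models are my assumption for the BTO configs in this thread).
let late2012 = ("Late 2012: i7-3770 (77W) + GTX 680MX (122W)", 77 + 122)
let late2014 = ("Late 2014: i7-4790K (88W) + M295X (125W)", 88 + 125)

for (config, watts) in [late2012, late2014] {
    print("\(config) = \(watts)W total")
}
// Same enclosure, same fans, ~14W more to move:
print("Delta: \(late2014.1 - late2012.1)W")
```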
 
I'm 43 and can see every pixel on my 2011 21.5" iMac; it's actually quite jarring when you've just been using a Retina 13" MacBook Pro. The same goes for phones.
Are you suggesting you need a 5K phone?

Be patient, your eyesight will fail unless you're met by an untimely demise.
 
It seems I'm being battered from all sides on this issue. :) As I mentioned to MikhailT, I will investigate upgrades.
Hey Old Codger, don't feel battered; just go to the Apple Store and have them toggle it on and off. If you don't see a big difference, join the club.

I use high-res, color-accurate monitors every day; they just don't have Apple logos on them.
 
Thank you very much for the image and explanation - it does look convincing. I may have to check out new Dell monitors in the near future.

In addition to what others said already, please remember that non-Apple monitors need to be configured properly on Macs (you need at the very least 10.10.3 to enable SST support at 60Hz for monitors that support it).

In addition, you have to make sure it is used in HiDPI mode, meaning half the native resolution. I believe it will render at native resolution by default, and you have to configure it to scale at HiDPI.
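If you want to verify what mode a display is actually running, here's a minimal sketch using the public CoreGraphics API (CGDisplayCopyAllDisplayModes); in a HiDPI mode the pixel dimensions are double the point dimensions apps see:

```swift
import CoreGraphics

// List the main display's modes; a HiDPI ("Retina scaled") mode reports
// pixel dimensions that are double the point dimensions given to apps.
let display = CGMainDisplayID()
if let modes = CGDisplayCopyAllDisplayModes(display, nil) as? [CGDisplayMode] {
    for mode in modes {
        let hiDPI = mode.pixelWidth == 2 * mode.width ? " [HiDPI]" : ""
        print("\(mode.width)x\(mode.height) pt, " +
              "\(mode.pixelWidth)x\(mode.pixelHeight) px, " +
              "\(Int(mode.refreshRate)) Hz\(hiDPI)")
    }
}
```

Run it as a plain Swift script on the machine in question and look for the half-native entry marked HiDPI.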
 
In addition to what others said already, please remember that non-Apple monitors need to be configured properly on Macs (you need at the very least 10.10.3 to enable SST support at 60Hz for monitors that support it).

In addition, you have to make sure it is used in HiDPI mode, meaning half the native resolution. I believe it will render at native resolution by default, and you have to configure it to scale at HiDPI.


Thanks again. Have saved the info so I won't forget it. :)
 
Only the legally uninformed would espouse such marketing nonsense. The real benefit of a "Retina" display is to Apple's bottom line.

As an over-40 person who owns both standard and HiDPI displays, I personally guarantee that Retina displays are completely worthwhile. It's pretty tough to go back once you get used to Retina. If you're blind and can't see what's right in front of you, well, I'm sorry. Most of us are sighted and can see things, though. Instead of engaging in sour grapes, it would probably be worthwhile to see a doctor.

--Eric
 
As an over-40 person who owns both standard and HiDPI displays, I personally guarantee that Retina displays are completely worthwhile. It's pretty tough to go back once you get used to Retina. If you're blind and can't see what's right in front of you, well, I'm sorry. Most of us are sighted and can see things, though. Instead of engaging in sour grapes, it would probably be worthwhile to see a doctor.

--Eric
If it works for you, good; there is no noticeable benefit for me.

I don't need more pixels, I need accurate color across the Adobe RGB gamut, which Apple does not offer.
 
Are you suggesting you need a 5K phone?

Be patient, your eyesight will fail unless you're met by an untimely demise.

Actually I'm saying that a Retina (or, for the sake of argument, an HD-equivalent) display is far preferable to a pre-Retina display (be it PC or mobile). I would have thought that this was obvious from the way I phrased my response, and from the context of the discussion.

However if that's causing you some difficulty and you want to continue to post glib responses, be my guest. Happy Trolling.
 
But it is not the same screen. The 680MX isn't being tasked with driving a 5K panel, nor with doing Retina scaling. It is in part the cooling design, because it has been exactly the same for the last 2-3 years. Apple is using the same fans on the 5K models as on the rest of the line-up.



Apple cooked its own goose by being cheap and reusing the same design (going for maximum component reuse across models, which is systemic across all of the Mac models). And it is the same "look, no vents" design approach that cooked GPUs in MBP laptops when Apple first iterated to the "even slimmer" MBPs.

The 27" iMac tries to pull hot air down to push it out of the box. That is just not really a good practice in general. It is the opposite direction that hot air wants to go. They can make it work but will have to huff and puff harder.

What are you talking about? The iMac draws air in at the bottom and exhausts it through the vent behind the hinge. The air goes bottom to middle and then out. Either way, "up" and "down" in terms of convection are relative terms in a forced-airflow system. The movement of heat is *vastly, vastly* dominated by the forced airflow from the fan rather than by pure convective mass flow (which would dominate in a fanless setup). Either way it doesn't matter, since the air is drawn in at the bottom and exhausted in the middle. The heatsink is common to the GPU and CPU and sits just in front of the exhaust vent in the middle of the machine.

Also it doesn't matter what the 680MX is being asked to drive - you can run the GPU at maximum load for hours at a time without it overheating or throttling the machine, which you cannot do with the AMD card, despite the fact that AMD claims the TDP is the same.

I have done this since I have owned the Late 2012, on all manner of games in both Windows and OS X, driving the 680MX at 100% load for hours. It literally cannot be asked to work any harder, and the 122W TDP card does not overheat or throttle. The AMD GPU in the same setup, also claiming to be approximately 120W, simply cannot do that.

It's not the cooling setup; it's the fact that AMD's numbers are not in line with how the card behaves in reality, since if it were really a 120W TDP card it could run at 100% load for hours (like the 680MX) without throttling the system.
 
But it seems to be a big issue when going from the 100W M290X to the 125W M295X. So when Intel goes from Haswell to Skylake, the thermal issues should be overcome. Unless Apple installs a 150W chip in the next version of the iMac.

I don't believe that the 295X is really 125W. The Nvidia 680MX in the Late 2012 is 122W and has no problem at 100% load for hours on end. Either the 680MX's real draw is a lot lower than 122W, the top AMD card's is much higher, or the extra 11W of CPU TDP from going Ivy Bridge to Haswell tipped the system over the edge.
 
Oh please let this signal the return of Front Row. That is the one feature I miss most from my first run of owning a MacBook (2006-2011). I went to PC and missed it and a bunch of other features, and when I came back to Apple with a Retina MacBook Pro, I was disappointed to see that Front Row was completely gone from the OS. If not Front Row, maybe some other sort of media experience (maybe an Apple TV Mac app?).
 
Actually I'm saying that a Retina (or, for the sake of argument, an HD-equivalent) display is far preferable to a pre-Retina display (be it PC or mobile). I would have thought that this was obvious from the way I phrased my response, and from the context of the discussion.
However if that's causing you some difficulty and you want to continue to post glib responses, be my guest. Happy Trolling.
Call it glib, but you and I are from two fundamentally different and incompatible demographics.

Your hardware needs would not suit me, and I doubt mine would suit you.
 