I'll bet these run cooler.
I actually sold my 2.4GHz Retina 15" because, for one thing, it ran hot doing nothing. I also own the 13" Retina and it runs cool.
So the new base should at least be better than the last version.
Anyway, I was waiting like a lot of you, and will now buy another 15".
Interesting...we've got a pair of 2012 15"s and just took receipt of the new Haswell 15"....they definitely do NOT get hot. No hotter than a 13"...in fact a bit cooler, I'm sure, as the 2012s were using the same iGPU for most casual day-to-day tasks. My 2012 right now, as I'm typing this, is running at a nice, cool 31°C on the CPU/28°C on the GPU.
All the reviews when the last 15" came out said to wait for the next version because the 650M was too slow for the retina display. Ironically, the Intel Iris Pro is SLOWER than the 650M by a significant margin. Other than to save costs, or if you don't plan to do anything intensive with a $2000 computer, there's no compelling reason to get the Iris Pro.
No they didn't. They mentioned that software refinements would continue to improve UI frame rate---which has already happened, as developers have updated their software to support HiDPI displays...as well as the coding on 'certain' websites. It's been amazing to me, in a year of ownership, how MANY software developers have updated their apps for high-res monitors and displays.
Hell, the Intel HD 4000 runs my 15" consistently and fluently 95% of the time. It's very rare I need to turn on my discrete GPU (via gfxCardStatus) if OS X hasn't already chosen to do this on its own. Not sure where you're reading your reviews...but a good place to start is Anand.
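If you ever want to check which GPU is actually driving the screen at a given moment, here's a rough sketch (assuming OS X's stock system_profiler tool and its usual SPDisplaysDataType output; the exact format varies by release, so the parsing is illustrative, not gospel):

```python
# Rough sketch: list the GPUs OS X reports and flag the one currently
# driving a display. The output format of system_profiler can vary
# between OS X releases, so treat the string matching as illustrative.
import subprocess

def report_gpus():
    out = subprocess.run(
        ["system_profiler", "SPDisplaysDataType"],
        capture_output=True, text=True, check=True,
    ).stdout
    current = None
    for line in out.splitlines():
        stripped = line.strip()
        if stripped.startswith("Chipset Model:"):
            current = stripped.split(":", 1)[1].strip()
            print(f"Found GPU: {current}")
        # On dual-GPU machines, the GPU section containing a "Displays:"
        # block is typically the one driving the screen right now.
        elif stripped.startswith("Displays:") and current:
            print(f"  -> currently driving a display: {current}")

if __name__ == "__main__":
    report_gpus()
```

gfxCardStatus shows the same thing from the menu bar, of course; this is just for the terminally curious.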
Most likely, you'll need TB2. Theoretically, it could be possible to use two TB1 ports together, but who knows if any vendor would support it.
TB1 supports 4K; in fact, it was demonstrated when the initial 2012 units arrived---as was editing and playing back four 1080p streams simultaneously.
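For what it's worth, the rough bandwidth math backs up both posts. This is a quick back-of-the-envelope sketch counting raw pixel data only (real links also carry blanking and protocol overhead, so the margins are tighter in practice): a single TB1 channel has room for 4K at 30Hz, while 4K at 60Hz wants TB2's bonded 20 Gbit/s.

```python
# Back-of-the-envelope bandwidth check for 4K over Thunderbolt.
# Counts raw pixel data only; blanking intervals and protocol overhead
# make real-world margins tighter than shown here.
WIDTH, HEIGHT = 3840, 2160   # 4K UHD
BITS_PER_PIXEL = 24          # 8-bit RGB

TB1_GBPS = 10.0   # Thunderbolt 1: 10 Gbit/s per channel
TB2_GBPS = 20.0   # Thunderbolt 2: channels bonded into 20 Gbit/s

def gbps_needed(refresh_hz):
    """Uncompressed pixel bandwidth in Gbit/s at a given refresh rate."""
    return WIDTH * HEIGHT * BITS_PER_PIXEL * refresh_hz / 1e9

for hz in (30, 60):
    need = gbps_needed(hz)
    print(f"4K @ {hz} Hz: ~{need:.1f} Gbit/s  "
          f"(TB1 {'fits' if need <= TB1_GBPS else 'does not fit'}, "
          f"TB2 {'fits' if need <= TB2_GBPS else 'does not fit'})")
```

That works out to roughly 6 Gbit/s at 30Hz and 12 Gbit/s at 60Hz, which is why the 2012 demos were possible but the "you'll need TB2" advice holds for a proper 60Hz 4K desktop.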
Alarmists like you are just an utter joke - but keep it up. You're making Apple and the stockholders happy by getting everyone to buy up far more than they need.
Folks that don't get it are the punchline though. Iris Pro/750M, it's the way to go. Period. The CPU with the Iris Pro costs the same as the CPU with iGPU and dGPU from last year...the 2GB 750M, while still the same Kepler architecture as the 650M, is a helluva decent card....and will ONLY be called on when necessary. For the type of work the OP is doing, he can EASILY control which card is working via the free app many have discussed, gfxCardStatus. In Windows....maybe not, but with multiple monitors---who cares??? You're going to be next to a socket in that case....and these rMBPs are NOT loud! Good Lord...some of the BS in this thread is disheartening, to say the least.
'Alarmists'??? Why would you say that? A faster-clocked processor, double the RAM, double the SSD and a discrete GPU that isn't a slouch----all from the ONLY OEM currently shipping with the Iris Pro/dGPU option!!! That's a pretty sweet deal for $600 additional. Agreed, in the past Apple has been asses about pricing upgrades, but this is a significant bump for the extra 6 C-notes. Lots of bang for the buck----those comparing the iGPU to the dGPU are missing a LOT of information. The computational benefits of the iGPU will come in extremely handy once apps optimize their usage model to take advantage of its power in conjunction with the CPU.
...However, when it comes to demanding or optimized applications (games, CUDA-enabled software, or simply apps that run faster because they're tuned for nVidia GPUs), the discrete card earns its keep....possibly the reason the Iris Pro is showing up at Apple at all is their long push for Intel to build a 'better' iGPU. From the HD 3000--->Iris Pro 5200, monster gains have been made, but they've not yet closed in on a 2GB GDDR5 discrete 750M...which could possibly be clocked higher, as the 650M was last year, bringing it more into parity with the 660M.
By buying a model with the discrete card, one can expect similar battery life, similar temperatures and better performance than the iGPU-only/Iris Pro model. Most of the time, if you're not playing around in After Effects, utilizing Windows through Boot Camp, rendering, transcoding, gaming, etc....you'll be using the iGPU...as the dGPU remains dormant. It doesn't take any more energy, doesn't produce any more heat----no more than the unit with ONLY the iGPU will. If you use the dGPU, the iGPU is set back to idle---so it's not a 'doubling' of power and heat.
Again...speaking from experience with BOTH units, including the Intel HD 4000's ability to easily handle my day-to-day surfing, email, etc. Fluently and brilliantly. We actually do use ours for video and audio recording, editing and, many times now, transcoding and rendering which used to be relegated to our Mac Pros----as these rMBPs are now speedsters!
Anyway---I'll jump off the soapbox, but there is so much misinformation in this thread it borders on preposterous.
J