Showing your ignorance here, buddy. The Iris Pro 5200 delivers the same performance as a dGPU; you don't need a chip from Nvidia any more. That's the whole point of Iris: it's not the traditional idea of an iGPU.
Seriously, for everyone laughing off integrated graphics and saying gamers will be annoyed, etc.: you really don't understand what Iris is all about. It can match laptop dGPUs, which is why Codemasters (who make Grid 2) have been advertising it so heavily - it can play their games at full pelt.
The benefits of dropping the dGPU will outweigh the cons and the negative feedback from users who expect more from the GPU.
Less overhead from maintaining two GPUs, less circuitry required on the motherboard, space freed up for more battery, longer battery life, and so on.
I doubt removing the dGPU alone would make it thinner, but if the iGPU maintains the same performance, it's possible they won't increase the battery volume and will instead thin out the battery so they can make the laptop thinner.
So far, folks have loved the size/efficiency focus Apple has had for the past few years (the MBA has been very successful thanks to its profile), so hopefully the thinner rMBP will work out as well.
It's a mistake to drop the dGPU. Iris Pro, special version or not, will never be as good, and Intel will never catch up with Nvidia, who have really nice Logan parts coming next. People who want real graphics performance, including decent gaming, will be left behind if they buy MacBook Pros.
My advice: You're heading off to school - focus like a madman on the subject matter, and not on the machine.
University tuition is astronomically overpriced as it is (thanks to the fact it's subsidized and considered "mandatory"), so use the resources they have on campus.
I always found that those who focused on technology and having "the latest and greatest" - and I was in that group - didn't do nearly as well as those who focused on the core foundations of their field. (I wasn't too bad there either, but I definitely got distracted by always trying to apply the latest technology.)
For your needs, any MacBook made in the last two years should do fine. If you're in engineering, get a big screen and/or an external monitor (they're dirt cheap). And study, study, study!
Likely true. Also, Apple can't play AMD and Nvidia off each other to get the lowest prices any more; now they're bound to Intel forever. In a few generations, once Intel gets Apple relying on its GPUs, it can jack up the price to whatever it wants.
This is bad news, actually. For the 15" flagship retina we need a real GPU, and it looks like we're not going to get one.
It means that someone at Apple is so concerned with battery life that they're willing to completely sacrifice graphics performance. Yes, Iris is better than the old HD Graphics was, but that's like saying a Honda Accord is better than a VW Beetle. Neither is a Ferrari.
If they phase out the non-retina models, I'm very curious to see where the retina price point will end up.
Many people need more than 128GB of storage, and I think it would be very painful for most to have to pay $1,700 for a 256GB retina (and only 13"!!!) vs. $1,200 for a non-retina.
I'd be pretty thrilled if I could get a 13" retina with a 256GB mSATA SSD for around $1,200-$1,300!!! That said, I don't expect this to happen. MAYBE we'll see a $100 price drop for retina, but more than likely it will stay the same.
The future is OpenCL, no matter how hard Nvidia fights it.
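To make that concrete: here's roughly what vendor-neutral GPU compute looks like - a minimal vector-add sketch using PyOpenCL (this assumes the pyopencl and numpy packages are installed; it's an illustration, not anything Apple or Nvidia ship):

    import numpy as np
    import pyopencl as cl

    # Pick whatever OpenCL device the driver exposes - Intel iGPU, Nvidia, AMD...
    ctx = cl.create_some_context()
    queue = cl.CommandQueue(ctx)

    a = np.random.rand(50000).astype(np.float32)
    b = np.random.rand(50000).astype(np.float32)

    mf = cl.mem_flags
    a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
    b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
    out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

    # The kernel is plain OpenCL C, compiled at runtime for whichever GPU is present.
    prg = cl.Program(ctx, """
    __kernel void add(__global const float *a,
                      __global const float *b,
                      __global float *out) {
        int i = get_global_id(0);
        out[i] = a[i] + b[i];
    }
    """).build()

    prg.add(queue, a.shape, None, a_buf, b_buf, out_buf)

    out = np.empty_like(a)
    cl.enqueue_copy(queue, out, out_buf)
    assert np.allclose(out, a + b)

The same source runs unchanged on any vendor's conformant driver - exactly the portability CUDA can't offer.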
I didn't make it clear that I'm in Britain, and my tuition is on a loan which I don't have to start repaying until I'm on £25k (I think) a year - and even then I pay it back incrementally.
Also, a laptop is *vital* for my subject matter. I'm doing computer science, you see. I need a good laptop which I can code on, especially for the fourth year, in which I will be doing my dissertation. I'm also doing a sandwich course in the third year, so I will be earning ~£25k that year.
Worth selling my current rMBP 2.6GHz model and upgrading?
How about people wait until Apple announces these things? How accurate is SemiAccurate anyway? Are they known for getting scoops in the past?
No, the Iris Pro is far behind the GT 650M. For example, go read AnandTech. I remember you from the forums, where this was already presented to you. I guess you're the ignorant one, buddy.
OK, waiting is good and healthy. I'm just saying that, priority-wise, saving your pennies and focusing on your subject is way more important than the "latest technology". And never EVER get seduced by attractive loan terms - life has a funny way of making those terms mighty ugly when it comes time to pay the piper, especially the way the West's economy is going.
Given your field - not too far from mine of chip design, which has me doing advanced programming all the time - I'd focus on having a bigger hard drive and more display space rather than the latest processor. When I work locally, I have to use virtual Linux machines a lot. In that regard, having throwaway (or school) servers running Linux and VNC-ing into them makes a LOT more sense - something like the sketch below.
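For example, a minimal sketch of that remote workflow in Python (the hostname is a placeholder; it assumes OpenSSH and a VNC viewer such as TigerVNC's vncviewer installed locally, and a VNC server already listening on the remote box's display :1):

    import subprocess

    # Placeholder address for a throwaway/school Linux box.
    HOST = "student@linux-box.example.ac.uk"

    # Forward local port 5901 to the remote VNC server (display :1 == port 5901).
    tunnel = subprocess.Popen(["ssh", "-N", "-L", "5901:localhost:5901", HOST])
    try:
        # Point the local viewer at the forwarded port; this blocks until you close it.
        subprocess.run(["vncviewer", "localhost:5901"], check=False)
    finally:
        tunnel.terminate()

All the heavy lifting happens on the server, so even a modest laptop is plenty.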
Keep things simple, keep things cheap, and focus on the core problem at hand.... Just some honest advice from someone who's often learned the hard way!
Tuition fees were introduced in the UK in '98 by Blair at £1k per year, tripled by him in 2004, then tripled again by Cameron in 2010. So they haven't been around wonderfully long, and the system has yet to prove it works well, tbh.